Sample records for real-time model checking

  1. The Priority Inversion Problem and Real-Time Symbolic Model Checking

    DTIC Science & Technology

    1993-04-23

    ...real-time systems unpredictable in subtle ways. This makes it more difficult to implement and debug such systems. Our work discusses this problem and presents one possible solution. The solution is formalized and verified using temporal logic model checking techniques. In order to perform the verification, the BDD-based symbolic model checking algorithm given in previous works was extended to handle real-time properties using the bounded until operator. We believe that this algorithm, which is based on discrete time, is able to handle many real-time properties
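
    A minimal explicit-state sketch of the bounded until operator may make the idea concrete: E[p U<=k q] holds in a state if some path of p-states reaches a q-state within k discrete steps, computable by at most k fixpoint iterations. The paper's algorithm is symbolic (BDD-based); plain Python sets stand in for BDDs here, and the transition system is illustrative.

        # Hedged sketch: explicit-state evaluation of E[p U<=k q] over a
        # discrete-time transition system. Sets of states stand in for BDDs.
        def bounded_until(states, succ, p, q, k):
            """Return the set of states satisfying E[p U<=k q]."""
            sat = set(q)                          # q holds now: 0 steps needed
            for _ in range(k):
                # add p-states with a successor already in the satisfying set
                frontier = {s for s in states
                            if s in p and succ.get(s, set()) & sat}
                if frontier <= sat:
                    break                         # fixpoint reached early
                sat |= frontier
            return sat

        # tiny example: chain 0 -> 1 -> 2 with p = {0, 1}, q = {2}
        succ = {0: {1}, 1: {2}, 2: set()}
        print(bounded_until({0, 1, 2}, succ, {0, 1}, {2}, k=2))   # {0, 1, 2}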

  2. Temporal Specification and Verification of Real-Time Systems.

    DTIC Science & Technology

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  3. Compositional schedulability analysis of real-time actor-based systems.

    PubMed

    Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan

    2017-01-01

    We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies, e.g., "earliest deadline first", which can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.
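
    The schedulability question above is concrete enough to sketch. The fragment below simulates a single actor's message queue under a non-preemptive "earliest deadline first" policy and reports whether every message finishes before its deadline; the message format and workloads are illustrative, not the paper's actor language, which additionally relies on behavioral interfaces, refinement, and Uppaal.

        # Hedged sketch: non-preemptive EDF processing of one actor's queue.
        import heapq

        def edf_schedulable(messages):
            """messages: list of (arrival_time, processing_time, deadline)."""
            msgs = sorted(messages)               # by arrival time
            queue, t, i = [], 0.0, 0
            while queue or i < len(msgs):
                if not queue:                     # idle until the next arrival
                    t = max(t, msgs[i][0])
                while i < len(msgs) and msgs[i][0] <= t:
                    arrival, work, deadline = msgs[i]
                    heapq.heappush(queue, (deadline, work))   # EDF ordering
                    i += 1
                if queue:
                    deadline, work = heapq.heappop(queue)
                    t += work                     # run the earliest deadline
                    if t > deadline:
                        return False              # deadline miss
            return True

        print(edf_schedulable([(0, 2, 5), (1, 2, 4), (2, 1, 9)]))   # True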

  4. Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.

    DTIC Science & Technology

    1996-08-12

    Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with...up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique...We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and
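
    Verus-style quantitative analysis asks questions such as the minimum or maximum delay between two events over the compiled state-transition graph. A minimal sketch of the minimum-delay computation, as a breadth-first search counting transitions, follows; the graph encoding is illustrative, and the actual tool works symbolically.

        # Hedged sketch: minimum delay (in transitions) from a start set to a
        # target set by BFS over the compiled state-transition graph.
        from collections import deque

        def min_delay(succ, start, target):
            seen = set(start)
            frontier = deque((s, 0) for s in start)
            while frontier:
                state, d = frontier.popleft()
                if state in target:
                    return d                      # first hit = shortest delay
                for nxt in succ.get(state, ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append((nxt, d + 1))
            return None                           # target unreachable

        succ = {"req": ["wait"], "wait": ["wait", "ack"]}
        print(min_delay(succ, {"req"}, {"ack"}))  # 2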

  5. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  6. Symbolic discrete event system specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Chi, Sungdo

    1992-01-01

    Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
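
    The consistency check described above reduces to linear-program feasibility: the linear polynomial constraints accumulated along a symbolic trajectory are consistent exactly when they admit a real solution. A minimal sketch using scipy's LP solver with a zero objective follows; the constraint values are illustrative, not from the paper.

        # Hedged sketch: feasibility of A_ub @ x <= b_ub over real-valued x,
        # i.e., consistency of the accumulated linear event-time constraints.
        import numpy as np
        from scipy.optimize import linprog

        def consistent(A_ub, b_ub):
            n = A_ub.shape[1]
            res = linprog(c=np.zeros(n), A_ub=A_ub, b_ub=b_ub,
                          bounds=[(None, None)] * n, method="highs")
            return res.status == 0                # 0 = solved => feasible

        # t1 - t2 <= -1 (t1 fires at least 1 before t2) and the reverse
        A = np.array([[1.0, -1.0], [-1.0, 1.0]])
        print(consistent(A, np.array([-1.0, -1.0])))   # False: contradictory
        print(consistent(A[:1], np.array([-1.0])))     # True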

  7. Model Checking the Remote Agent Planner

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

    This work tackles the problem of using Model Checking for the purpose of verifying the HSTS (Scheduling Testbed System) planning system. HSTS is the planner and scheduler of the remote agent autonomous control system deployed in Deep Space One (DS1). Model Checking allows for the verification of domain models as well as planning entries. We have chosen the real-time model checker UPPAAL for this work. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a sketch for the mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify.

  8. RealSurf - A Tool for the Interactive Visualization of Mathematical Models

    NASA Astrophysics Data System (ADS)

    Stussak, Christian; Schenzel, Peter

    For applications in fine art, architecture, and engineering it is often important to visualize and explore complex mathematical models. In the past, static models of such objects were collected in museums and in mathematical institutes. To examine their aesthetic properties, it is helpful to explore them interactively in 3D and in real time. For the class of implicitly given algebraic surfaces we developed the tool RealSurf. Here we give an introduction to the program and some hints for the design of interesting surfaces.

  9. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    PubMed

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for the modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulas, which automatically extracts real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as precisely analyze and verify the real-time reliability of a UAV flight control system.

  10. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification

    PubMed Central

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for the modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulas, which automatically extracts real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as precisely analyze and verify the real-time reliability of a UAV flight control system. PMID:27918594

  11. Analyzing Phylogenetic Trees with Timed and Probabilistic Model Checking: The Lactose Persistence Case Study.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2014-12-01

    Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.

  12. Analyzing phylogenetic trees with timed and probabilistic model checking: the lactose persistence case study.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2014-10-23

    Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.

  13. Logic Model Checking of Time-Periodic Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Florian, Mihai; Gamble, Ed; Holzmann, Gerard

    2012-01-01

    In this paper we report on the work we performed to extend the logic model checker SPIN with built-in support for the verification of periodic, real-time embedded software systems, as commonly used in aircraft, automobiles, and spacecraft. We first extended the SPIN verification algorithms to model priority based scheduling policies. Next, we added a library to support the modeling of periodic tasks. This library was used in a recent application of the SPIN model checker to verify the engine control software of an automobile, to study the feasibility of software triggers for unintended acceleration events.

  14. A Practical Approach to Implementing Real-Time Semantics

    NASA Technical Reports Server (NTRS)

    Luettgen, Gerald; Bhat, Girish; Cleaveland, Rance

    1999-01-01

    This paper investigates implementations of process algebras which are suitable for modeling concurrent real-time systems. It suggests an approach for efficiently implementing real-time semantics using dynamic priorities. For this purpose a process algebra with dynamic priority is defined, whose semantics corresponds one-to-one to traditional real-time semantics. The advantage of the dynamic-priority approach is that it drastically reduces the state-space sizes of the systems in question while preserving all properties of their functional and real-time behavior. The utility of the technique is demonstrated by a case study which deals with the formal modeling and verification of the SCSI-2 bus protocol. The case study is carried out in the Concurrency Workbench of North Carolina, an automated verification tool in which the process algebra with dynamic priority is implemented. It turns out that the state space of the bus-protocol model is about an order of magnitude smaller than the one resulting from real-time semantics. The accuracy of the model is proved by applying model checking for verifying several mandatory properties of the bus protocol.

  15. Real-time simulation model of the HL-20 lifting body

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Cruz, Christopher I.; Ragsdale, W. A.

    1992-01-01

    A proposed manned spacecraft design, designated the HL-20, has been under investigation at Langley Research Center. Included in that investigation are flight control design and flying qualities studies utilizing a man-in-the-loop real-time simulator. This report documents the current real-time simulation model of the HL-20 lifting body vehicle, known as version 2.0, presently in use at NASA Langley Research Center. Included are data on vehicle aerodynamics, inertias, geometries, guidance and control laws, and cockpit displays and controllers. In addition, trim case and dynamic check case data are provided. The intent of this document is to provide the reader with sufficient information to develop and validate an equivalent simulation of the HL-20 for use in real-time or analytical studies.

  16. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking

    PubMed Central

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However, for gaining a systems-level understanding of how biological organisms function, it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate them. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems. PMID:27187178

  17. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    PubMed

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However, for gaining a systems-level understanding of how biological organisms function, it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate them. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems.

  18. Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun; Niu, Hongli

    2015-10-01

    Based on an epidemic dynamical system, we construct a new agent-based financial time series model. In order to check and test its rationality, we compare the statistical properties of the time series model with those of real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine multi-parameter analysis with tail distribution analysis, modified rescaled range analysis, and multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in both the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.

  19. Verification and Planning Based on Coinductive Logic Programming

    NASA Technical Reports Server (NTRS)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution as well as preliminary implementations of the planning and verification applications have been developed [4]. Co-LP and Model Checking: The vast majority of properties that are to be verified can be classified into safety properties and liveness properties. It is well known within model checking that safety properties can be verified by reachability analysis, i.e., if a counter-example to the property exists, it can be finitely determined by enumerating all the reachable states of the Kripke structure.

  20. Model Checking Real Time Java Using Java PathFinder

    NASA Technical Reports Server (NTRS)

    Lindstrom, Gary; Mehlitz, Peter C.; Visser, Willem

    2005-01-01

    The Real Time Specification for Java (RTSJ) is an augmentation of Java for real time applications of various degrees of hardness. The central features of RTSJ are real time threads; user defined schedulers; asynchronous events, handlers, and control transfers; a priority inheritance based default scheduler; non-heap memory areas such as immortal and scoped, and non-heap real time threads whose execution is not impeded by garbage collection. The Robust Software Systems group at NASA Ames Research Center has JAVA PATHFINDER (JPF) under development, a Java model checker. JPF at its core is a state exploring JVM which can examine alternative paths in a Java program (e.g., via backtracking) by trying all nondeterministic choices, including thread scheduling order. This paper describes our implementation of an RTSJ profile (subset) in JPF, including requirements, design decisions, and current implementation status. Two examples are analyzed: jobs on a multiprogramming operating system, and a complex resource contention example involving autonomous vehicles crossing an intersection. The utility of JPF in finding logic and timing errors is illustrated, and the remaining challenges in supporting all of RTSJ are assessed.

  1. The NASA Lewis integrated propulsion and flight control simulator

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.

    1991-01-01

    A new flight simulation facility has been developed at NASA Lewis to allow integrated propulsion-control and flight-control algorithm development and evaluation in real time. As a preliminary check of the simulator facility and the correct integration of its components, the control design and physics models for an STOVL fighter aircraft model have been demonstrated, with their associated system integration and architecture, pilot vehicle interfaces, and display symbology. The results show that this fixed-base flight simulator can provide real-time feedback and display of both airframe and propulsion variables for validation of integrated systems and testing of control design methodologies and cockpit mechanizations.

  2. The NASA Lewis integrated propulsion and flight control simulator

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.

    1991-01-01

    A new flight simulation facility was developed at NASA-Lewis. The purpose of this flight simulator is to allow integrated propulsion control and flight control algorithm development and evaluation in real time. As a preliminary check of the simulator facility capabilities and the correct integration of its components, the control design and physics models for a short take-off and vertical landing fighter aircraft model were demonstrated, with their associated system integration and architecture, pilot vehicle interfaces, and display symbology. The initial testing and evaluation results show that this fixed-base flight simulator can provide real-time feedback and display of both airframe and propulsion variables for validation of integrated flight and propulsion control systems. Additionally, through the use of this flight simulator, various control design methodologies and cockpit mechanizations can be tested and evaluated in a real-time environment.

  3. Real-Time System Verification by Kappa-Induction

    NASA Technical Reports Server (NTRS)

    Pike, Lee S.

    2005-01-01

    We report the first formal verification of a reintegration protocol for a safety-critical, fault-tolerant, real-time distributed embedded system. A reintegration protocol increases system survivability by allowing a node that has suffered a fault to regain state consistent with the operational nodes. The protocol is verified in the Symbolic Analysis Laboratory (SAL), where bounded model checking and decision procedures are used to verify infinite-state systems by k-induction. The protocol and its environment are modeled as synchronizing timeout automata. Because k-induction is exponential with respect to k, we optimize the formal model to reduce the size of k. Also, the reintegrator's event-triggered behavior is conservatively modeled as time-triggered behavior to further reduce the size of k and to make it invariant to the number of nodes modeled. A corollary is that a clique avoidance property is satisfied.
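
    The k-induction scheme itself is compact: prove the property on all paths of length k from the initial state (base case), and prove that k consecutive property-satisfying states force the property in the next state (induction step). The sketch below runs it with the z3 SMT solver on a toy wrap-around counter; the transition relation is illustrative, not the reintegration protocol verified in SAL.

        # Hedged sketch: k-induction with z3 on a toy counter (invariant x <= 5).
        from z3 import Int, Solver, And, Or, Not, unsat

        def trans(x, x1):        # count up to 5, then wrap to 0
            return Or(And(x < 5, x1 == x + 1), And(x >= 5, x1 == 0))

        def prop(x):
            return x <= 5

        def k_induction(k):
            xs = [Int("x%d" % i) for i in range(k + 1)]
            steps = And([trans(xs[i], xs[i + 1]) for i in range(k)])
            base = Solver()      # base case: a violation within k steps?
            base.add(xs[0] == 0, steps, Or([Not(prop(x)) for x in xs]))
            if base.check() != unsat:
                return False
            step = Solver()      # step case: k good states then a bad one?
            step.add(steps, And([prop(xs[i]) for i in range(k)]),
                     Not(prop(xs[k])))
            return step.check() == unsat

        print(k_induction(1))    # True: this invariant is already 1-inductive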

  4. Action-based verification of RTCP-nets with CADP

    NASA Astrophysics Data System (ADS)

    Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin

    2015-12-01

    The paper presents an algorithm for translating RTCP-nets' (real-time coloured Petri nets) coverability graphs into the Aldebaran format. The approach provides the possibility of automatic RTCP-net verification using the model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled and several of its crucial properties have been verified to demonstrate the usability of the approach.

  5. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  6. Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking

    NASA Technical Reports Server (NTRS)

    Cavada, Roberto; Pecheur, Charles

    2003-01-01

    This document reports on the activities carried out during a four-week visit of Roberto Cavada at the NASA Ames Research Center. The main goal was to test the practical applicability of the framework proposed, where a diagnosability problem is reduced to a Symbolic Model Checking problem. Section 2 contains a brief explanation of major techniques currently used in Symbolic Model Checking, and how these techniques can be tuned in order to obtain good performances when using Model Checking tools. Diagnosability is performed on large and structured models of real plants. Section 3 describes how these plants are modeled, and how models can be simplified to improve the performance of Symbolic Model Checkers. Section 4 reports scalability results. Three test cases are briefly presented, and several parameters and techniques have been applied on those test cases in order to produce comparison tables. Furthermore, comparison between several Model Checkers is reported. Section 5 summarizes the application of diagnosability verification to a real application. Several properties have been tested, and results have been highlighted. Finally, section 6 draws some conclusions, and outlines future lines of research.

  7. Addressing Dynamic Issues of Program Model Checking

    NASA Technical Reports Server (NTRS)

    Lerda, Flavio; Visser, Willem

    2001-01-01

    Model checking real programs has recently become an active research area. Programs however exhibit two characteristics that make model checking difficult: the complexity of their state and the dynamic nature of many programs. Here we address both these issues within the context of the Java PathFinder (JPF) model checker. Firstly, we will show how the state of a Java program can be encoded efficiently and how this encoding can be exploited to improve model checking. Next we show how to use symmetry reductions to alleviate some of the problems introduced by the dynamic nature of Java programs. Lastly, we show how distributed model checking of a dynamic program can be achieved, and furthermore, how dynamic partitions of the state space can improve model checking. We support all our findings with results from applying these techniques within the JPF model checker.

  8. Rewriting Modulo SMT

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.

    2013-01-01

    Combining symbolic techniques such as: (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism due to the system, and external non-determinism due to the environment. They are not amenable to finite-state model checking analysis because they typically are infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed automata methods.

  9. Timing analysis by model checking

    NASA Technical Reports Server (NTRS)

    Naydich, Dimitri; Guaspari, David

    2000-01-01

    The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.

  10. Usage Automata

    NASA Astrophysics Data System (ADS)

    Bartoletti, Massimo

    Usage automata are an extension of finite state automata, with some additional features (e.g. parameters and guards) that improve their expressivity. Usage automata are expressive enough to model security requirements of real-world applications; at the same time, they are simple enough to be amenable to static analysis, e.g. they can be model-checked against abstractions of program usages. We study here some foundational aspects of usage automata. In particular, we discuss their expressive power, and their effective use in run-time mechanisms for enforcing usage policies.
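
    A minimal sketch of the idea may help: a usage automaton is a finite automaton whose edges are labelled with events over parameters, instantiated once per resource, and a trace violates the policy if any instance reaches an offending state. The "no read after close" policy below is an illustrative example, not one from the paper.

        # Hedged sketch: a parametric usage policy, one automaton per resource.
        POLICY = {                          # state -> {event: next state}
            "ok":     {"close": "closed"},
            "closed": {"read": "violation"},
        }

        def violates(trace):
            """trace: list of (event, resource) pairs."""
            state = {}                      # automaton instance per resource
            for event, res in trace:
                cur = state.get(res, "ok")
                cur = POLICY.get(cur, {}).get(event, cur)  # no edge: stay put
                if cur == "violation":
                    return True
                state[res] = cur
            return False

        print(violates([("read", "f"), ("close", "f"), ("read", "f")]))  # True
        print(violates([("close", "f"), ("read", "g")]))                 # False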

  11. Application of conditional moment tests to model checking for generalized linear models.

    PubMed

    Pan, Wei

    2002-06-01

    Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimating equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model checking tools.

  12. State-based verification of RTCP-nets with nuXmv

    NASA Astrophysics Data System (ADS)

    Biernacka, Agnieszka; Biernacki, Jerzy; Szpyrka, Marcin

    2015-12-01

    The paper deals with an algorithm for translating RTCP-nets' (real-time coloured Petri nets) coverability graphs into nuXmv state machines. The approach enables users to verify RTCP-nets with the model checking techniques provided by the nuXmv tool. Full details of the algorithm are presented and an illustrative example of the approach's usefulness is provided.

  13. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

    This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.

  14. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

    One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
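
    The heart of such an integrity monitor is a consistency statistic between the sensor-synthesized profile and the stored terrain model. A minimal sketch follows; the statistic (mean absolute elevation disparity) and the threshold are illustrative stand-ins for the test statistic developed in the paper.

        # Hedged sketch: flag the terrain database when the disparity between
        # the LiDAR-synthesized profile and the stored profile grows too large.
        import numpy as np

        def integrity_check(lidar_profile, db_profile, threshold_m=15.0):
            disparity = np.asarray(lidar_profile) - np.asarray(db_profile)
            stat = np.mean(np.abs(disparity))   # mean absolute disparity (m)
            return stat <= threshold_m, stat

        lidar = [1402.1, 1405.3, 1409.8]        # synthesized elevations (m)
        db = [1401.0, 1406.0, 1410.5]           # stored SVS terrain model (m)
        ok, stat = integrity_check(lidar, db)
        print(ok, round(stat, 2))               # True 0.83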

  15. Power quality analysis of DC arc furnace operation using the Bowman model for electric arc

    NASA Astrophysics Data System (ADS)

    Gherman, P. L.

    2018-01-01

    This work concerns a relatively new domain. The DC electric arc is superior to the AC electric arc, but it is not yet used in Romania. This is why we analyzed the operating behavior of these furnaces by simulation and by model checking of the simulation results. The conclusions are favorable; the work still to be carried out is to develop a real-time control system for the steel elaboration process.

  16. Towards the Formal Verification of a Distributed Real-Time Automotive System

    NASA Technical Reports Server (NTRS)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, formally and pervasively verifying a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  17. A real-time digital computer program for the simulation of a single rotor helicopter

    NASA Technical Reports Server (NTRS)

    Houck, J. A.; Gibson, L. H.; Steinmetz, G. G.

    1974-01-01

    A computer program was developed for the study of a single-rotor helicopter on the Langley Research Center real-time digital simulation system. Descriptions of helicopter equations and data, program subroutines (including flow charts and listings), real-time simulation system routines, and program operation are included. Program usage is illustrated by standard check cases and a representative flight case.

  18. Real-time spectral analysis of HRV signals: an interactive and user-friendly PC system.

    PubMed

    Basano, L; Canepa, F; Ottonello, P

    1998-01-01

    We present a real-time system, built around a PC and a low-cost data acquisition board, for the spectral analysis of the heart rate variability signal. The Windows-like operating environment on which it is based makes the computer program very user-friendly, even for non-specialized personnel. The power spectral density is computed through the use of a hybrid method, in which a classical FFT analysis follows an autoregressive finite extension of the data; the stationarity of the sequence is continuously checked. The use of this algorithm gives a high degree of robustness to the spectral estimation. Moreover, always in real time, the FFT of every data block is computed and displayed in order to corroborate the results as well as to allow the user to interactively choose a proper AR model order.
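
    The hybrid estimator described, an autoregressive extension of the data followed by a classical FFT, can be sketched in a few lines. The AR order, least-squares fit, and extension length below are illustrative choices, not the authors' implementation.

        # Hedged sketch: AR-extend the (detrended) HRV sequence, then take a
        # classical FFT periodogram of the extended record.
        import numpy as np

        def ar_extend_psd(x, order=8, n_extend=256, dt=1.0):
            x = np.asarray(x, float) - np.mean(x)
            n = len(x)
            # least-squares AR fit: x[t] ~ sum_{k=1..p} a[k-1] * x[t-k]
            X = np.array([[x[t - k] for k in range(1, order + 1)]
                          for t in range(order, n)])
            a, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
            ext = list(x)
            for _ in range(n_extend):           # AR prediction of new samples
                ext.append(np.dot(a, ext[-1:-order - 1:-1]))
            spec = np.abs(np.fft.rfft(ext)) ** 2 / len(ext)
            return np.fft.rfftfreq(len(ext), d=dt), spec

        rr = np.sin(0.3 * np.arange(200)) + 0.1 * np.random.randn(200)
        freqs, psd = ar_extend_psd(rr)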

  19. Arbitrary-order corrections for finite-time drift and diffusion coefficients

    NASA Astrophysics Data System (ADS)

    Anteneodo, C.; Riera, R.

    2009-09-01

    We address a standard class of diffusion processes with linear drift and quadratic diffusion coefficients. These contributions to dynamic equations can be directly drawn from data time series. However, real data are constrained to finite sampling rates and therefore it is crucial to establish a suitable mathematical description of the required finite-time corrections. Based on Itô-Taylor expansions, we present the exact corrections to the finite-time drift and diffusion coefficients. These results allow the real hidden coefficients to be reconstructed from the empirical estimates. We also derive higher-order finite-time expressions for the third and fourth conditional moments that furnish extra theoretical checks for this class of diffusion models. The analytical predictions are compared with the numerical outcomes of representative artificial time series.
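
    For orientation, the first-order version of such a correction follows directly from the Itô-Taylor (Dynkin) expansion. The sketch below states it for the drift under standard regularity assumptions; the paper derives the corrections to arbitrary order for linear drift and quadratic diffusion.

        % For dX_t = f(X_t) dt + g(X_t) dW_t with generator
        %   L phi = f phi' + (g^2/2) phi'',
        % the expansion E[phi(X_{t+tau}) | X_t = x]
        %   = phi + tau L phi + (tau^2/2) L^2 phi + O(tau^3)
        % applied to phi(y) = y - x yields the finite-time drift estimate:
        \hat{f}_\tau(x)
          \equiv \frac{1}{\tau}\,\mathbb{E}\!\left[X_{t+\tau}-x \,\middle|\, X_t=x\right]
          = f(x) + \frac{\tau}{2}\Big(f f' + \frac{g^2}{2} f''\Big)(x) + O(\tau^2)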

  20. Venture Evaluation and Review Technique (VERT). Users’/Analysts’ Manual

    DTIC Science & Technology

    1979-10-01

    real world. Additionally, activity pro- cessing times could be entered as a normal, uniform or triangular distribution. Activity times can also be...work or tasks, or if the unit activities are such abstractions of the real world that the estimation of the time , cost and performance parameters for...utilized in that con- straining capacity. 7444 The network being processed has passed all the previous error checks. It currently has a real time

  1. Efficient Craig Interpolation for Linear Diophantine (Dis)Equations and Linear Modular Equations

    DTIC Science & Technology

    2008-02-01

    Craig interpolants has enabled the development of powerful hardware and software model checking techniques. Efficient algorithms are known for computing...interpolants in rational and real linear arithmetic. We focus on subsets of integer linear arithmetic. Our main results are polynomial time algorithms ...congruences), and linear diophantine disequations. We show the utility of the proposed interpolation algorithms for discovering modular/divisibility predicates

  2. Automated System Checkout to Support Predictive Maintenance for the Reusable Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, Ann; Deb, Somnath; Kulkarni, Deepak; Wang, Yao; Lau, Sonie (Technical Monitor)

    1998-01-01

    The Propulsion Checkout and Control System (PCCS) is a predictive maintenance software system. The real-time checkout procedures and diagnostics are designed to detect components that need maintenance based on their condition, rather than using more conventional approaches such as scheduled or reliability centered maintenance. Predictive maintenance can reduce turn-around time and cost and increase safety as compared to conventional maintenance approaches. Real-time sensor validation, limit checking, statistical anomaly detection, and failure prediction based on simulation models are employed. Multi-signal models, useful for testability analysis during system design, are used during the operational phase to detect and isolate degraded or failed components. The TEAMS-RT real-time diagnostic engine was developed to utilize the multi-signal models by Qualtech Systems, Inc. Capability of predicting the maintenance condition was successfully demonstrated with a variety of data, from simulation to actual operation on the Integrated Propulsion Technology Demonstrator (IPTD) at Marshall Space Flight Center (MSFC). Playback of IPTD valve actuations for feature recognition updates identified an otherwise undetectable Main Propulsion System 12 inch prevalve degradation. The algorithms were loaded into the Propulsion Checkout and Control System for further development and are the first known application of predictive Integrated Vehicle Health Management to an operational cryogenic testbed. The software performed successfully in real-time, meeting the required performance goal of 1 second cycle time.

  3. Visual Predictive Check in Models with Time-Varying Input Function.

    PubMed

    Largajolli, Anna; Bertoldo, Alessandra; Campioni, Marco; Cobelli, Claudio

    2015-11-01

    The nonlinear mixed effects models are commonly used modeling techniques in pharmaceutical research, as they enable the characterization of individual profiles together with the population to which the individuals belong. To ensure their correct use, it is fundamental to provide powerful diagnostic tools that can evaluate the predictive performance of the models. The visual predictive check (VPC) is a commonly used tool that helps the user check by visual inspection whether the model is able to reproduce the variability and the main trend of the observed data. However, simulation from the model is not always trivial, for example when using models with a time-varying input function (IF). In this class of models, there is a potential mismatch between each set of simulated parameters and the associated individual IF, which can cause an incorrect profile simulation. We introduce a refinement of the VPC by taking into consideration a correlation term (the Mahalanobis or normalized Euclidean distance) that helps associate the correct IF with the individual set of simulated parameters. We investigate and compare its performance with the standard VPC in models of the glucose and insulin system applied to real and simulated data and in a simulated pharmacokinetic/pharmacodynamic (PK/PD) example. The newly proposed VPC performs better than the standard VPC, especially for models with large variability in the IF, where the probability of simulating incorrect profiles is higher.
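
    The correlation term can be sketched concretely: each simulated parameter vector is associated with the subject whose estimated parameters are nearest in normalized Euclidean distance (a diagonal Mahalanobis distance), and that subject's input function is then used for the simulation. The array shapes and values below are illustrative.

        # Hedged sketch: match simulated parameter sets to subjects (and hence
        # to their input functions) by normalized Euclidean distance.
        import numpy as np

        def match_input_functions(sim_params, indiv_params):
            """sim_params: (n_sim, p); indiv_params: (n_subj, p).
            Returns the matched subject index for each simulated set."""
            scale = np.std(indiv_params, axis=0)    # per-parameter scaling
            d2 = (((sim_params[:, None, :] - indiv_params[None, :, :])
                   / scale) ** 2).sum(axis=2)
            return np.argmin(d2, axis=1)

        indiv = np.array([[1.0, 10.0], [2.0, 40.0], [3.0, 25.0]])
        sims = np.array([[1.1, 12.0], [2.9, 24.0]])
        print(match_input_functions(sims, indiv))   # [0 2]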

  4. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limitation of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, post-OPC verification solutions for cross-confirming the contour image against the target layout continue to be developed: methods for contour generation and for matching contours to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. A way to detect only real errors while excluding false errors is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for the metal-contact/via coverage (CC) check, the verification solution outputs a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can cause the OPC engineer to miss real defects and may, at the least, delay time to market. In this paper, we studied methods for increasing the efficiency of post-OPC verification, especially for the CC check case. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. In this paper, a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model is presented.

  5. The design and implementation of postprocessing for depth map on real-time extraction system.

    PubMed

    Tang, Zhiwei; Li, Bin; Li, Huosheng; Xu, Zheng

    2014-01-01

    Depth estimation is a key technology for resolving the communication needs of stereo vision. A real-time depth map can be obtained with hardware such as an FPGA, but hardware cannot implement algorithms as complicated as software can because of restrictions in the hardware structure. Consequently, some incorrect stereo matches will inevitably occur in the hardware-based depth estimation process. In order to solve this problem, a postprocessing function is designed in this paper. After a matching-cost uniqueness test, both left-right and right-left consistency check solutions are implemented; then, the holes in the depth maps are filled with correct depth values on the basis of the right-left consistency check. The experimental results show that the depth map extraction and postprocessing functions can be implemented in real time in the same system; moreover, the quality of the depth maps is satisfactory.
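
    The left-right consistency check at the core of the postprocessing stage is easy to sketch: a left-image disparity is kept only if the right-image disparity at the pixel it points to agrees within a tolerance, and rejected pixels become holes that are later filled from valid neighbors. The 1-D rows and the fill rule below are simplifications of the FPGA design.

        # Hedged sketch: left-right consistency check plus simple hole filling.
        import numpy as np

        def lr_check(disp_left, disp_right, tol=1):
            w = len(disp_left)
            valid = np.zeros(w, bool)
            for x in range(w):
                xr = x - disp_left[x]         # matching pixel in right image
                if 0 <= xr < w and abs(disp_left[x] - disp_right[xr]) <= tol:
                    valid[x] = True
            out = np.where(valid, disp_left, -1)   # -1 marks a hole
            for x in range(w - 2, -1, -1):    # fill from nearest right value
                if out[x] < 0:
                    out[x] = out[x + 1]
            return out

        dl = np.array([2, 2, 2, 9, 2, 2])     # 9 is a mismatched disparity
        dr = np.array([2, 2, 2, 2, 2, 2])
        print(lr_check(dl, dr))               # [2 2 2 2 2 2]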

  6. "Internet of Things" Real-Time Free Flap Monitoring.

    PubMed

    Kim, Sang Hun; Shin, Ho Seong; Lee, Sang Hwan

    2018-01-01

    Free flaps are a common treatment option for head and neck reconstruction in plastic reconstructive surgery, and monitoring of the free flap is the most important factor for flap survival. In this study, the authors performed real-time free flap monitoring based on an implanted Doppler system and the "internet of things" (IoT)/wireless Wi-Fi, which is a convenient, accurate, and efficient approach for surgeons to monitor a free flap. Implanted Doppler signals were checked continuously by the surgeon and residents using their own cellular phones or personal computers until the patient was discharged. If the surgeon decided that a revision procedure or exploration was required, the authors determined by chart review the time consumed (positive signal-to-operating room time) from the first notification questioning the flap's status to the decision for revision surgery. To compare the efficacy of real-time monitoring, the authors paired the same number of free flaps performed by the same surgeon and monitored using conventional methods such as physical examination. The total survival rate was greater in the real-time monitoring group (94.7% versus 89.5%). The average time for the real-time monitoring group was shorter than that for the conventional group (65 minutes versus 86 minutes). Based on this study, real-time free flap monitoring using IoT technology is a method that the surgeon and reconstruction team can use to monitor simultaneously, at any time and in any situation.

  7. Model building strategy for logistic regression: purposeful selection.

    PubMed

    Zhang, Zhongheng

    2016-03-01

    Logistic regression is one of the most commonly used models to account for confounders in the medical literature. The article introduces how to perform the purposeful selection model building strategy with R. I stress the use of the likelihood ratio test to see whether deleting a variable will have a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjustment of the remaining covariates. Interactions should be checked to disentangle complex relationships between covariates and their synergistic effect on the response variable. The model should be checked for goodness of fit (GOF); in other words, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for logistic regression models.
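
    The likelihood ratio test at the center of purposeful selection compares the log-likelihoods of the model with and without the candidate variable. The article works in R; the sketch below illustrates the same test with Python's statsmodels on simulated data.

        # Hedged sketch: LR test for deleting a covariate from a logistic model.
        import numpy as np
        import statsmodels.api as sm
        from scipy.stats import chi2

        rng = np.random.default_rng(0)
        n = 500
        x1, x2 = rng.normal(size=n), rng.normal(size=n)
        y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * x1 + 0.05 * x2))))

        full = sm.Logit(y, sm.add_constant(np.column_stack([x1, x2]))).fit(disp=0)
        reduced = sm.Logit(y, sm.add_constant(x1)).fit(disp=0)

        lr = 2 * (full.llf - reduced.llf)     # LR statistic, 1 df here
        print(f"LR = {lr:.2f}, p = {chi2.sf(lr, df=1):.3f}")
        # a large p-value suggests x2 can be deleted without hurting model fit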

  8. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Brown, Jeremy R.; Madhavan, Poomima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow through the airport, leading to lengthy waiting times, missing luggage, and missed flights. In this paper we present a model of passenger flow through an airport using discrete event simulation that provides a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the work force to keep passenger flow smooth even during peak travel times and for emergency preparedness at ORF in case of adverse events. In this simulation we ran three different scenarios: real world, increased check-in stations, and multiple waiting lines. Increased check-in stations increased waiting time and instantaneous utilization, while the multiple waiting lines decreased both the waiting time and instantaneous utilization. This simulation was able to show how different changes affected passenger flow through the airport.
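
    A minimal version of such a simulation can be sketched with the simpy library: passengers arrive at random, queue for a limited number of check-in stations and then for security, and the mean time in system is reported. Arrival and service rates below are illustrative, not the ORF data.

        # Hedged sketch: two-stage airport queue as a discrete event simulation.
        import random
        import simpy

        def passenger(env, checkin, security, waits):
            arrive = env.now
            with checkin.request() as req:
                yield req
                yield env.timeout(random.expovariate(1 / 2.0))  # ~2 min
            with security.request() as req:
                yield req
                yield env.timeout(random.expovariate(1 / 1.5))  # ~1.5 min
            waits.append(env.now - arrive)

        def run(n_checkin=3, n_security=2, n_pax=200):
            random.seed(1)
            env = simpy.Environment()
            checkin = simpy.Resource(env, capacity=n_checkin)
            security = simpy.Resource(env, capacity=n_security)
            waits = []

            def source():
                for _ in range(n_pax):
                    env.process(passenger(env, checkin, security, waits))
                    yield env.timeout(random.expovariate(1.0))  # ~1 pax/min

            env.process(source())
            env.run()
            return sum(waits) / len(waits)

        print(f"mean time in system: {run():.1f} min")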

  9. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework.

    NASA Astrophysics Data System (ADS)

    Grunberg, M.; Lambotte, S.; Engels, F.

    2014-12-01

In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French national distribution center (ISTerre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows it to benefit from high-end quality control based on the national and worldwide seismicity. Here we present the real-time seismic data flow from the stations of the French national broadband network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists of applying a variety of processes to check the consistency of the whole system from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, time quality is critical for most scientific data applications. To face this challenge and check the consistency of polarities and amplitudes, we deployed several high-end processes, including a noise correlation procedure to check for timing accuracy (instrumental time errors result in a time shift of the whole cross-correlation, clearly distinct from shifts due to changes in the medium's physical properties), and a systematic comparison of synthetic and real data for teleseismic earthquakes of magnitude larger than 6.5 to detect timing errors as well as polarity and amplitude problems.
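
    The timing check rests on a simple property: an instrumental clock error shifts the entire noise cross-correlation function. A schematic version of the test, with synthetic data and an illustrative tolerance:

        import numpy as np

        fs = 20.0                                    # sampling rate (Hz)
        rng = np.random.default_rng(1)
        common = rng.standard_normal(20000)          # coherent noise field
        sta_a = common + 0.5 * rng.standard_normal(20000)
        sta_b = np.roll(common, 25) + 0.5 * rng.standard_normal(20000)  # clock error

        # Cross-correlate the two stations and locate the peak lag.
        xcorr = np.correlate(sta_a - sta_a.mean(), sta_b - sta_b.mean(), mode="full")
        lag = np.argmax(xcorr) - (len(sta_a) - 1)
        shift_s = lag / fs
        print(f"apparent inter-station offset: {shift_s:+.2f} s")

        # A shift of the whole correlation that persists over days flags a timing
        # problem, unlike medium changes, which distort only the correlation coda.
        if abs(shift_s) > 0.5:                       # illustrative tolerance
            print("timing inconsistency: check station clock")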

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheu, R; Ghafar, R; Powers, A

Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction and who have not yet had their first chart check. Monitoring such "real" events with our in-house software creates a safety net, as its propagation does not rely on individual user input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 fractions, warning / >5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which gives the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated. Additionally, all physics chart checks were completed in a timely manner. Prompt notification of treatment record inconsistencies and machine overrides has decreased the time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks to provide a safer and nearly error-free working environment.
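
    For flavor, the kind of query behind such a dashboard can be sketched with pyodbc; the connection string, table, and column names below are invented stand-ins, not the actual Mosaiq schema:

        import pyodbc

        # Placeholder DSN values; a real deployment would use site credentials.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=mosaiq-db;DATABASE=MOSAIQ;UID=dashboard;PWD=***"
        )

        # Hypothetical query: patients who have received their first fraction
        # but have no recorded initial physics chart check.
        SQL = """
        SELECT p.patient_id, MIN(t.tx_datetime) AS first_fraction
        FROM patients p
        JOIN treatments t ON t.patient_id = p.patient_id
        LEFT JOIN chart_checks c
               ON c.patient_id = p.patient_id AND c.check_type = 'INITIAL'
        WHERE c.patient_id IS NULL
        GROUP BY p.patient_id
        """

        for row in conn.cursor().execute(SQL):
            # Each hit becomes a pending item on the web dashboard.
            print(f"initial chart check due: {row.patient_id} ({row.first_fraction})")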

  11. User's guide for MAGIC-Meteorologic and hydrologic genscn (generate scenarios) input converter

    USGS Publications Warehouse

    Ortel, Terry W.; Martin, Angel

    2010-01-01

Meteorologic and hydrologic data used in watershed modeling studies are collected by various agencies and organizations and stored in various formats. Data may be in a raw, unprocessed format with little or no quality control, or may be checked for validity before being made available. Flood-simulation systems require data in near real time so that adequate flood warnings can be made. Additionally, forecasted data are needed to operate flood-control structures to potentially mitigate flood damages. Because real-time data are of a provisional nature, missing data may need to be estimated for use in flood-simulation systems. The Meteorologic and Hydrologic GenScn (Generate Scenarios) Input Converter (MAGIC) can be used to convert data from selected formats into the Hydrological Simulation Program-Fortran hourly-observations format for input to a Watershed Data Management database, for use in hydrologic modeling studies. MAGIC also can reformat the data to the Full Equations model time-series format, for use in hydraulic modeling studies. Examples of the application of MAGIC in the flood-simulation system for Salt Creek in northeastern Illinois are presented in this report.

  12. 77 FR 21494 - Definition of “Predominantly Engaged in Financial Activities”

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-10

    ... financial activities that were authorized by the Board under various authorities at different points in time... activities that are closely related to banking.\\23\\ These activities include performing appraisals of real... industrial real estate financing, providing check guarantee services, providing collection agency services...

  13. The DaveMLTranslator: An Interface for DAVE-ML Aerodynamic Models

    NASA Technical Reports Server (NTRS)

    Hill, Melissa A.; Jackson, E. Bruce

    2007-01-01

    It can take weeks or months to incorporate a new aerodynamic model into a vehicle simulation and validate the performance of the model. The Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML) has been proposed as a means to reduce the time required to accomplish this task by defining a standard format for typical components of a flight dynamic model. The purpose of this paper is to describe an object-oriented C++ implementation of a class that interfaces a vehicle subsystem model specified in DAVE-ML and a vehicle simulation. Using the DaveMLTranslator class, aerodynamic or other subsystem models can be automatically imported and verified at run-time, significantly reducing the elapsed time between receipt of a DAVE-ML model and its integration into a simulation environment. The translator performs variable initializations, data table lookups, and mathematical calculations for the aerodynamic build-up, and executes any embedded static check-cases for verification. The implementation is efficient, enabling real-time execution. Simple interface code for the model inputs and outputs is the only requirement to integrate the DaveMLTranslator as a vehicle aerodynamic model. The translator makes use of existing table-lookup utilities from the Langley Standard Real-Time Simulation in C++ (LaSRS++). The design and operation of the translator class is described and comparisons with existing, conventional, C++ aerodynamic models of the same vehicle are given.
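
    The embedded check-case idea can be illustrated with a short verification loop. The XML below follows the general shape of DAVE-ML check-case data but is illustrative only, not a faithful rendering of the schema, and the model function is a stand-in for the translated aerodynamic build-up:

        import xml.etree.ElementTree as ET

        DOC = """<checkData>
          <staticShot name="trim point 1">
            <checkInputs><signal><signalName>alpha</signalName>
              <signalValue>2.0</signalValue></signal></checkInputs>
            <checkOutputs><signal><signalName>CL</signalName>
              <signalValue>0.24</signalValue><tol>1e-4</tol></signal></checkOutputs>
          </staticShot>
        </checkData>"""

        def evaluate_model(inputs):
            # Stand-in for the table lookups and build-up equations.
            return {"CL": 0.1 + 0.07 * inputs["alpha"]}

        root = ET.fromstring(DOC)
        for shot in root.iter("staticShot"):
            ins = {s.findtext("signalName"): float(s.findtext("signalValue"))
                   for s in shot.find("checkInputs").iter("signal")}
            outs = evaluate_model(ins)
            for s in shot.find("checkOutputs").iter("signal"):
                name = s.findtext("signalName")
                expected = float(s.findtext("signalValue"))
                tol = float(s.findtext("tol"))
                status = "PASS" if abs(outs[name] - expected) <= tol else "FAIL"
                print(f"{shot.get('name')}: {name} {status}")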

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaks, D; Fletcher, R; Salamon, S

Purpose: To develop an online framework that tracks a patient's plan from initial simulation to treatment and that helps automate elements of the physics plan checks usually performed in the record and verify (RV) system and treatment planning system. Methods: We have developed PlanTracker, an online plan tracking system that automatically imports new patient tasks and follows them through treatment planning, physics checks, therapy check, and chart rounds. A survey was designed to collect information about the amount of time spent by medical physicists on non-physics-related tasks. We then assessed these non-physics tasks for automation. Using these surveys, we directed our PlanTracker software development towards the automation of intra-plan physics review. We then conducted a systematic evaluation of PlanTracker's accuracy by generating test plans in the RV system software designed to mimic real plans, in order to test its efficacy in catching errors both real and theoretical. Results: PlanTracker has proven to be an effective improvement to the clinical workflow in a radiotherapy clinic. We present data indicating that roughly 1/3 of the physics plan check can be automated and the workflow optimized, and show the functionality of PlanTracker. When the full system is in clinical use we will present data on improvement of time use in comparison to survey data prior to PlanTracker implementation. Conclusion: We have developed a framework for plan tracking and automatic checks in radiation therapy. We anticipate using PlanTracker as a basis for further development in clinical/research software. We hope that by eliminating the simplest and most time-consuming checks, medical physicists may be able to spend their time on plan quality and other physics tasks rather than on arithmetic and logic checks. We see this development as part of a broader initiative to advance the clinical/research informatics infrastructure surrounding the radiotherapy clinic. This research project has been financially supported by Varian Medical Systems, Palo Alto, CA, through a Varian MRA.

  15. Optimization of the resources management in fighting wildfires.

    PubMed

    Martin-Fernández, Susana; Martínez-Falero, Eugenio; Pérez-González, J Manuel

    2002-09-01

    Wildfires lead to important economic, social, and environmental losses, especially in areas of Mediterranean climate where they are of a high intensity and frequency. Over the past 30 years there has been a dramatic surge in the development and use of fire spread models. However, given the chaotic nature of environmental systems, it is very difficult to develop real-time fire-extinguishing models. This article proposes a method of optimizing the performance of wildfire fighting resources such that losses are kept to a minimum. The optimization procedure includes discrete simulation algorithms and Bayesian optimization methods for discrete and continuous problems (simulated annealing and Bayesian global optimization). Fast calculus algorithms are applied to provide optimization outcomes in short periods of time such that the predictions of the model and the real behavior of the fire, combat resources, and meteorological conditions are similar. In addition, adaptive algorithms take into account the chaotic behavior of wildfire so that the system can be updated with data corresponding to the real situation to obtain a new optimum solution. The application of this method to the Northwest Forest of Madrid (Spain) is also described. This application allowed us to check that it is a helpful tool in the decision-making process.
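
    As a flavor of the discrete optimization involved, here is a generic simulated annealing loop for assigning suppression crews to fire sectors; the loss function and move set are invented placeholders, not the authors' model:

        import math
        import random

        random.seed(7)
        N_CREWS, N_SECTORS = 10, 4
        spread_rate = [3.0, 1.5, 4.0, 2.0]      # illustrative loss weight per sector

        def loss(assign):
            # Placeholder: sector loss grows with spread rate, shrinks with crews sent.
            crews = [assign.count(s) for s in range(N_SECTORS)]
            return sum(r / (1 + c) for r, c in zip(spread_rate, crews))

        state = [random.randrange(N_SECTORS) for _ in range(N_CREWS)]
        best, best_loss, temp = state[:], loss(state), 5.0

        while temp > 0.01:
            cand = state[:]
            cand[random.randrange(N_CREWS)] = random.randrange(N_SECTORS)  # move one crew
            delta = loss(cand) - loss(state)
            if delta < 0 or random.random() < math.exp(-delta / temp):  # Metropolis rule
                state = cand
                if loss(state) < best_loss:
                    best, best_loss = state[:], loss(state)
            temp *= 0.995                        # geometric cooling schedule

        print(f"best assignment {best}, expected loss {best_loss:.2f}")

    Because the loop is cheap, it can be re-run whenever updated fire or weather data arrive, which is the adaptive re-optimization the abstract describes.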

  16. Optimization of the Resources Management in Fighting Wildfires

    NASA Astrophysics Data System (ADS)

    Martin-Fernández, Susana; Martínez-Falero, Eugenio; Pérez-González, J. Manuel

    2002-09-01

    Wildfires lead to important economic, social, and environmental losses, especially in areas of Mediterranean climate where they are of a high intensity and frequency. Over the past 30 years there has been a dramatic surge in the development and use of fire spread models. However, given the chaotic nature of environmental systems, it is very difficult to develop real-time fire-extinguishing models. This article proposes a method of optimizing the performance of wildfire fighting resources such that losses are kept to a minimum. The optimization procedure includes discrete simulation algorithms and Bayesian optimization methods for discrete and continuous problems (simulated annealing and Bayesian global optimization). Fast calculus algorithms are applied to provide optimization outcomes in short periods of time such that the predictions of the model and the real behavior of the fire, combat resources, and meteorological conditions are similar. In addition, adaptive algorithms take into account the chaotic behavior of wildfire so that the system can be updated with data corresponding to the real situation to obtain a new optimum solution. The application of this method to the Northwest Forest of Madrid (Spain) is also described. This application allowed us to check that it is a helpful tool in the decision-making process.

  17. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    PubMed

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system's real-time operation was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system's sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
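
    A schematic of the cumulative signal check between predicted and measured EPID frames (synthetic arrays; the action level is illustrative):

        import numpy as np

        rng = np.random.default_rng(3)
        n_frames, shape = 50, (64, 64)
        predicted = rng.random((n_frames, *shape))          # reference frame series
        measured = predicted + rng.normal(0, 0.01, (n_frames, *shape))
        measured[30:] *= 1.5          # inject a gross over-delivery from frame 30 on

        cum_pred = np.zeros(shape)
        cum_meas = np.zeros(shape)
        for i in range(n_frames):
            cum_pred += predicted[i]
            cum_meas += measured[i]
            # Cumulative signal check: integrated measured vs. predicted signal.
            ratio = cum_meas.sum() / cum_pred.sum()
            if abs(ratio - 1.0) > 0.05:                     # illustrative 5% action level
                print(f"frame {i}: cumulative signal off by {ratio - 1:+.1%} -> interrupt")
                break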

  18. Effects of event knowledge in processing verbal arguments

    PubMed Central

    Bicknell, Klinton; Elman, Jeffrey L.; Hare, Mary; McRae, Ken; Kutas, Marta

    2010-01-01

    This research tests whether comprehenders use their knowledge of typical events in real time to process verbal arguments. In self-paced reading and event-related brain potential (ERP) experiments, we used materials in which the likelihood of a specific patient noun (brakes or spelling) depended on the combination of an agent and verb (mechanic checked vs. journalist checked). Reading times were shorter at the word directly following the patient for the congruent than the incongruent items. Differential N400s were found earlier, immediately at the patient. Norming studies ruled out any account of these results based on direct relations between the agent and patient. Thus, comprehenders dynamically combine information about real-world events based on intrasentential agents and verbs, and this combination then rapidly influences online sentence interpretation. PMID:21076629

  19. Physical and JIT Model Based Hybrid Modeling Approach for Building Thermal Load Prediction

    NASA Astrophysics Data System (ADS)

    Iino, Yutaka; Murai, Masahiko; Murayama, Dai; Motoyama, Ichiro

Energy conservation in buildings is one of the key issues from an environmental point of view, as it is in the industrial, transportation, and residential sectors. HVAC (Heating, Ventilating and Air Conditioning) systems account for half of the total energy consumption of a building. In order to realize energy conservation in HVAC systems, a thermal load prediction model for the building is required. This paper proposes a hybrid modeling approach combining a physical and a Just-in-Time (JIT) model for building thermal load prediction. The proposed method has the following features and benefits: (1) it is applicable even when past operation data for learning the load prediction model are poor; (2) it has a self-checking function that continuously supervises whether the data-driven load prediction and the physics-based one are consistent, so it can detect when something is wrong in the load prediction procedure; and (3) it can adjust the load prediction in real time against sudden changes in model parameters and environmental conditions. The proposed method is evaluated with real operation data from an existing building, and the improvement in load prediction performance is illustrated.
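
    A toy version of the hybrid scheme: a physics-style baseline plus a Just-in-Time (here, k-nearest-neighbour) correction learned from past operation data, with the self-checking consistency test the abstract describes (all coefficients invented):

        import numpy as np

        rng = np.random.default_rng(5)

        def physical_model(t_out, occupancy):
            # Simplified steady-state heat balance; coefficients are illustrative.
            return 120.0 + 8.5 * (t_out - 20.0) + 0.4 * occupancy

        # Past operation data: (outdoor temperature, occupancy) -> measured load.
        X_hist = np.column_stack([rng.uniform(15, 35, 200), rng.uniform(0, 500, 200)])
        y_hist = np.array([physical_model(t, o) for t, o in X_hist]) \
                 + rng.normal(0, 15, 200)

        def jit_predict(x, k=10):
            # Just-in-Time model: local average of the k nearest historical samples.
            d = np.linalg.norm((X_hist - x) / X_hist.std(axis=0), axis=1)
            return y_hist[np.argsort(d)[:k]].mean()

        x_now = np.array([30.0, 350.0])
        phys, jit = physical_model(*x_now), jit_predict(x_now)

        # Self-check: flag inconsistency between the two predictors.
        if abs(phys - jit) > 3 * 15:          # 3 sigma of the historical noise
            print("warning: data-driven and physical predictions disagree")
        print(f"physical {phys:.0f}, JIT {jit:.0f}, blended {0.5 * phys + 0.5 * jit:.0f}")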

  20. Impact of scatterometer wind (ASCAT-A/B) data assimilation on semi real-time forecast system at KIAPS

    NASA Astrophysics Data System (ADS)

    Han, H. J.; Kang, J. H.

    2016-12-01

Since July 2015, KIAPS (Korea Institute of Atmospheric Prediction Systems) has been running a semi-real-time forecast system to assess the performance of its forecast system as an NWP model. KPOP (KIAPS Protocol for Observation Processing) is part of the KIAPS data assimilation system and has been performing well in the semi-real-time forecast system. In this study, now that KPOP is able to process scatterometer wind data, we analyze the effect of scatterometer winds (ASCAT-A/B) on the KIAPS semi-real-time forecast system. The global distribution and statistics of O-B (observation minus background) for scatterometer winds show two things: the difference between the background field and the observations is not too large, and KPOP processes the scatterometer wind data well. The changes in the analysis increment due to the assimilated winds appear most clearly in the lower atmosphere. The scatterometer data also cover wide ocean areas where observations would otherwise be scarce. The performance of the scatterometer wind data can be checked through the vertical error reduction against IFS between the background and analysis fields and through the vertical statistics of O-A (observation minus analysis). From these results, we conclude that scatterometer wind data have a positive effect on the lower-level performance of the semi-real-time forecast system at KIAPS. Longer-term results on the impact of scatterometer wind data will be analyzed in future work.
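
    The O-B diagnostics mentioned here reduce to simple innovation statistics; a sketch with synthetic values standing in for ASCAT-minus-background wind components:

        import numpy as np

        rng = np.random.default_rng(11)
        # Synthetic innovations: observation minus background (O-B) u-wind, in m/s.
        o_minus_b = rng.normal(0.2, 1.4, 5000)

        bias = o_minus_b.mean()                         # systematic departure
        std = o_minus_b.std(ddof=1)                     # random departure
        rejected = np.abs(o_minus_b - bias) > 3 * std   # simple background QC check

        print(f"O-B bias {bias:+.2f} m/s, std {std:.2f} m/s, "
              f"rejected {rejected.mean():.2%} of obs")

    Computing the same statistics for O-A (observation minus analysis) and comparing the spreads gives the error-reduction check the abstract refers to.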

  1. Real-Time Tomography Mooring

    DTIC Science & Technology

    1992-06-01

With this sampling schedule the data logger has enough data storage capacity for a five-year deployment. System specifications are shown... the time check is performed as a virtual device, called "time check", which is scheduled in the task table and executed by the system... PCI as FSK carrier detect input; add function (5) to do timer/counter control (stop counter when defaults set); (6) instrument scheduler software.

  2. Application of real-time engine simulations to the development of propulsion system controls

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.

    1975-01-01

The use of real-time computer simulations of turbojet and turbofan engines in the development of digital propulsion controls is presented. The engine simulation provides a test-bed for evaluating new control laws and for checking and debugging control software and hardware prior to engine testing. The development and use of real-time, hybrid computer simulations of the Pratt and Whitney TF30-P-3 and F100-PW-100 augmented turbofans are described in support of a number of controls research programs at the Lewis Research Center. The role of engine simulations in solving the propulsion system integration problem is also discussed.

  3. Heartbeat-based error diagnosis framework for distributed embedded systems

    NASA Astrophysics Data System (ADS)

    Mishra, Swagat; Khilar, Pabitra Mohan

    2012-01-01

Distributed embedded systems have significant applications in the automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real-time system. We use heartbeat monitoring, checkpointing and model-based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosing and shutting down faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.
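
    A minimal illustration of the heartbeat mechanism, with threads standing in for distributed nodes and an arbitrary timeout:

        import threading
        import time

        last_beat = {}                        # node id -> time of last heartbeat
        lock = threading.Lock()

        def node(node_id, period, fail_after=None):
            # Each node periodically publishes a heartbeat; a crashed node goes silent.
            start = time.monotonic()
            while fail_after is None or time.monotonic() - start < fail_after:
                with lock:
                    last_beat[node_id] = time.monotonic()
                time.sleep(period)

        def monitor(timeout=0.5, runtime=3.0):
            reported = set()
            end = time.monotonic() + runtime
            while time.monotonic() < end:
                time.sleep(0.25)
                now = time.monotonic()
                with lock:
                    for nid, t in last_beat.items():
                        if now - t > timeout and nid not in reported:
                            reported.add(nid)
                            # Diagnosis: mark the node faulty so its actuator
                            # can be shut down before the system becomes unsafe.
                            print(f"node {nid} diagnosed faulty ({now - t:.2f}s silent)")

        threading.Thread(target=node, args=("steer", 0.1), daemon=True).start()
        threading.Thread(target=node, args=("brake", 0.1, 1.0), daemon=True).start()
        monitor()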

  4. Heartbeat-based error diagnosis framework for distributed embedded systems

    NASA Astrophysics Data System (ADS)

    Mishra, Swagat; Khilar, Pabitra Mohan

    2011-12-01

Distributed embedded systems have significant applications in the automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real-time system. We use heartbeat monitoring, checkpointing and model-based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosing and shutting down faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.

  5. Use of posterior predictive checks as an inferential tool for investigating individual heterogeneity in animal population vital rates

    PubMed Central

    Chambert, Thierry; Rotella, Jay J; Higgs, Megan D

    2014-01-01

    The investigation of individual heterogeneity in vital rates has recently received growing attention among population ecologists. Individual heterogeneity in wild animal populations has been accounted for and quantified by including individually varying effects in models for mark–recapture data, but the real need for underlying individual effects to account for observed levels of individual variation has recently been questioned by the work of Tuljapurkar et al. (Ecology Letters, 12, 93, 2009) on dynamic heterogeneity. Model-selection approaches based on information criteria or Bayes factors have been used to address this question. Here, we suggest that, in addition to model-selection, model-checking methods can provide additional important insights to tackle this issue, as they allow one to evaluate a model's misfit in terms of ecologically meaningful measures. Specifically, we propose the use of posterior predictive checks to explicitly assess discrepancies between a model and the data, and we explain how to incorporate model checking into the inferential process used to assess the practical implications of ignoring individual heterogeneity. Posterior predictive checking is a straightforward and flexible approach for performing model checks in a Bayesian framework that is based on comparisons of observed data to model-generated replications of the data, where parameter uncertainty is incorporated through use of the posterior distribution. If discrepancy measures are chosen carefully and are relevant to the scientific context, posterior predictive checks can provide important information allowing for more efficient model refinement. We illustrate this approach using analyses of vital rates with long-term mark–recapture data for Weddell seals and emphasize its utility for identifying shortfalls or successes of a model at representing a biological process or pattern of interest. We show how posterior predictive checks can be used to strengthen inferences in ecological studies. We demonstrate the application of this method on analyses dealing with the question of individual reproductive heterogeneity in a population of Antarctic pinnipeds. PMID:24834335
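
    The posterior predictive check logic is compact enough to show end to end; here is a sketch for a binomial survival model with a conjugate Beta posterior, using the variance of per-individual successes as a crude stand-in for an individual-heterogeneity discrepancy measure (data invented):

        import numpy as np

        rng = np.random.default_rng(8)
        # Observed: successful breeding seasons out of 5, per individual;
        # the truth is heterogeneous (each individual has its own p).
        obs = rng.binomial(5, rng.beta(2, 2, 100))
        T_obs = obs.var()                 # discrepancy: between-individual variance

        # Posterior for a homogeneous model: p ~ Beta(1,1), y_i ~ Binomial(5, p).
        a_post = 1 + obs.sum()
        b_post = 1 + 5 * len(obs) - obs.sum()

        # Replicated datasets from the posterior predictive distribution.
        T_rep = np.empty(4000)
        for j in range(4000):
            p = rng.beta(a_post, b_post)             # parameter draw from posterior
            T_rep[j] = rng.binomial(5, p, size=len(obs)).var()

        ppp = (T_rep >= T_obs).mean()                # posterior predictive p-value
        print(f"T_obs = {T_obs:.2f}, posterior predictive p-value = {ppp:.3f}")
        # A p-value near 0 or 1 signals that the homogeneous model cannot
        # reproduce the observed level of individual variation.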

  6. Is Advanced Real-Time Energy Metering Sufficient to Persuade People to Save Energy?

    NASA Astrophysics Data System (ADS)

    Ting, L.; Leite, H.; Ponce de Leão, T.

    2012-10-01

In order to promote a low-carbon economy, EU citizens may soon be able to check their electricity consumption on smart meters. It is hoped that smart meters can, by providing real-time consumption and pricing information to residential users, help reduce demand for electricity. It is argued in this paper that, according to the Elaboration Likelihood Model (ELM), these methods are most likely to be effective when consumers perceive the issue of energy conservation as relevant to their lives. Nevertheless, some fundamental characteristics of these methods result in a limited amount of perceived personal relevance; for instance, energy expenses may be relatively small compared with other household expenditures such as the mortgage, and consumption information does not enhance interpersonal trust. In this paper, it is suggested that smart meters can apply "nudge" approaches, which respond to the ELM observation that people use simple rules to make decisions; these include changes to feedback delivery and device design.

  7. Real-time sensor data validation

    NASA Technical Reports Server (NTRS)

    Bickmore, Timothy W.

    1994-01-01

This report describes the status of an ongoing effort to develop software capable of detecting sensor failures on rocket engines in real time. This software could be used in a rocket engine controller to prevent the erroneous shutdown of an engine due to sensor failures which would otherwise be interpreted as engine failures by the control software. The approach taken combines analytical redundancy with Bayesian belief networks to provide a solution which has well-defined real-time characteristics and well-defined error rates. Analytical redundancy is a technique in which a sensor's value is predicted by using values from other sensors and known or empirically derived mathematical relations. A set of sensors and a set of relations among them form a network of cross-checks which can be used to periodically validate all of the sensors in the network. Bayesian belief networks provide a method of determining if each of the sensors in the network is valid, given the results of the cross-checks. This approach has been successfully demonstrated on the Technology Test Bed Engine at the NASA Marshall Space Flight Center. Current efforts are focused on extending the system to provide a validation capability for 100 sensors on the Space Shuttle Main Engine.
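
    The analytical redundancy cross-check can be sketched directly; the linear relation, calibration window, and threshold below are invented, and a plain residual test stands in for the Bayesian belief network step:

        import numpy as np

        rng = np.random.default_rng(2)
        n = 200
        p_inlet = rng.normal(100.0, 5.0, n)              # sensor A: inlet pressure
        flow = 0.8 * p_inlet + rng.normal(0, 0.5, n)     # sensor B, related to A
        flow[150:] += 20.0                               # inject a bias failure in B

        # Analytical redundancy: predict B from A via the known relation,
        # then test the residual against its nominal spread.
        residual = flow - 0.8 * p_inlet
        sigma = residual[:100].std()                     # calibrate on known-good data

        for i in range(n):
            if abs(residual[i]) > 5 * sigma:             # cross-check failed
                print(f"sample {i}: sensor 'flow' failed validation "
                      f"(residual {residual[i]:+.1f})")
                break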

  8. Rapid method for controlling the correct labeling of products containing common octopus (Octopus vulgaris) and main substitute species (Eledone cirrhosa and Dosidicus gigas) by fast real-time PCR.

    PubMed

    Espiñeira, Montserrat; Vieites, Juan M

    2012-12-15

The TaqMan real-time PCR has the highest potential for automation and therefore represents the currently most suitable method for screening, allowing the detection of fraudulent or unintentional mislabeling of species. This work describes the development of a real-time polymerase chain reaction (RT-PCR) system for the detection and identification of common octopus (Octopus vulgaris) and its main substitute species (Eledone cirrhosa and Dosidicus gigas). The technique is notable for combining simplicity, speed, sensitivity and specificity in a homogeneous assay. The method can be applied to all kinds of products: fresh, frozen and processed, including those that have undergone intensive transformation processes. The methodology was validated to check how the degree of food processing affects the method and the detection of each species. Moreover, it was applied to 34 commercial samples to evaluate the labeling of products made from these species. The methodology developed herein is useful for checking compliance with labeling regulations for seafood products and for verifying traceability in commercial trade and fisheries control. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. A new Bayesian Inference-based Phase Associator for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Meier, Men-Andrin; Heaton, Thomas; Clinton, John; Wiemer, Stefan

    2013-04-01

    State of the art network-based Earthquake Early Warning (EEW) systems can provide warnings for large magnitude 7+ earthquakes. Although regions in the direct vicinity of the epicenter will not receive warnings prior to damaging shaking, real-time event characterization is available before the destructive S-wave arrival across much of the strongly affected region. In contrast, in the case of the more frequent medium size events, such as the devastating 1994 Mw6.7 Northridge, California, earthquake, providing timely warning to the smaller damage zone is more difficult. For such events the "blind zone" of current systems (e.g. the CISN ShakeAlert system in California) is similar in size to the area over which severe damage occurs. We propose a faster and more robust Bayesian inference-based event associator, that in contrast to the current standard associators (e.g. Earthworm Binder), is tailored to EEW and exploits information other than only phase arrival times. In particular, the associator potentially allows for reliable automated event association with as little as two observations, which, compared to the ShakeAlert system, would speed up the real-time characterizations by about ten seconds and thus reduce the blind zone area by up to 80%. We compile an extensive data set of regional and teleseismic earthquake and noise waveforms spanning a wide range of earthquake magnitudes and tectonic regimes. We pass these waveforms through a causal real-time filterbank with passband filters between 0.1 and 50Hz, and, updating every second from the event detection, extract the maximum amplitudes in each frequency band. Using this dataset, we define distributions of amplitude maxima in each passband as a function of epicentral distance and magnitude. For the real-time data, we pass incoming broadband and strong motion waveforms through the same filterbank and extract an evolving set of maximum amplitudes in each passband. We use the maximum amplitude distributions to check whether the incoming waveforms are consistent with amplitude and frequency patterns of local earthquakes by means of a maximum likelihood approach. If such a single-station event likelihood is larger than a predefined threshold value we check whether there are neighboring stations that also have single-station event likelihoods above the threshold. If this is the case for at least one other station, we evaluate whether the respective relative arrival times are in agreement with a common earthquake origin (assuming a simple velocity model and using an Equal Differential Time location scheme). Additionally we check if there are stations where, given the preliminary location, observations would be expected but were not reported ("not-yet-arrived data"). Together, the single-station event likelihood functions and the location likelihood function constitute the multi-station event likelihood function. This function can then be combined with various types of prior information (such as station noise levels, preceding seismicity, fault proximity, etc.) to obtain a Bayesian posterior distribution, representing the degree of belief that the ensemble of the current real-time observations correspond to a local earthquake, rather than to some other signal source irrelevant for EEW. 
In addition to reducing the size of the blind zone, this approach facilitates the eventual development of an end-to-end probabilistic framework for an EEW system that provides systematic real-time assessment of the risk of false alerts, enabling end users of EEW to implement damage mitigation strategies only above a specified certainty level.
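
    A schematic of the single-station likelihood test described above, with made-up Gaussian log-amplitude models per frequency band (the real system uses empirical distributions as functions of distance and magnitude):

        import numpy as np
        from scipy import stats

        # Illustrative per-band models (bands: 0.1-1 Hz, 1-10 Hz, 10-50 Hz)
        # for "local earthquake" vs. "noise" log-amplitude maxima.
        eq_model = [stats.norm(-2.0, 0.8), stats.norm(-1.0, 0.7), stats.norm(-1.5, 0.9)]
        noise_model = [stats.norm(-4.0, 0.5), stats.norm(-3.5, 0.5), stats.norm(-3.0, 0.6)]

        def station_log_likelihood_ratio(log_amps):
            # Earthquake vs. noise, summed across the filterbank bands.
            return sum(eq.logpdf(a) - nz.logpdf(a)
                       for a, eq, nz in zip(log_amps, eq_model, noise_model))

        observed = [-2.2, -1.1, -1.7]          # maxima from the causal filterbank
        llr = station_log_likelihood_ratio(observed)
        print(f"log-likelihood ratio = {llr:.1f}")
        if llr > 5.0:                          # illustrative declaration threshold
            print("single-station trigger: query neighbors for association")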

  10. Probabilistic choice between symmetric disparities in motion stereo matching for a lateral navigation system

    NASA Astrophysics Data System (ADS)

    Ershov, Egor; Karnaukhov, Victor; Mozerov, Mikhail

    2016-02-01

Two consecutive frames of a lateral navigation camera video sequence can be considered an appropriate approximation to epipolar stereo. To overcome edge-aware inaccuracy caused by occlusion, we propose a model that matches the current frame both to the next frame and to the previous one. The positive disparity of matching to the previous frame has a symmetric negative disparity to the next frame. The proposed algorithm performs a probabilistic choice for each matched pixel between the positive disparity cost and its symmetric disparity cost. A disparity map obtained by optimization over the cost volume composed of the proposed probabilistic choice is more accurate than the traditional left-to-right and right-to-left disparity map cross-check. Also, our algorithm requires half as many computational operations per pixel as the cross-check technique. The effectiveness of our approach is demonstrated on synthetic data and real video sequences with ground-truth values.
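
    The per-pixel probabilistic choice between a positive disparity (match to the previous frame) and its symmetric negative disparity (match to the next frame) can be sketched as follows; the costs are synthetic and the softness parameter is illustrative, while the real method optimizes over the full fused cost volume:

        import numpy as np

        rng = np.random.default_rng(4)
        H, W, D = 4, 6, 16                     # tiny image and disparity range

        # Matching costs against the previous frame (+d) and the next frame (-d).
        cost_prev = rng.random((H, W, D))
        cost_next = rng.random((H, W, D))

        # Convert each cost pair into a probability for the +d hypothesis and
        # keep, per pixel and disparity, the more plausible of the two.
        beta = 5.0                             # softness of the choice
        w_prev = np.exp(-beta * cost_prev)
        w_next = np.exp(-beta * cost_next)
        p_prev = w_prev / (w_prev + w_next)
        fused = np.where(p_prev > 0.5, cost_prev, cost_next)   # fused cost volume

        disparity = fused.argmin(axis=2)       # winner-take-all over fused costs
        print(disparity)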

  11. Temporal logics and real time expert systems.

    PubMed

    Blom, J A

    1996-10-01

This paper introduces temporal logics. Due to the eternal compromise between expressive adequacy and reasoning efficiency that must be decided upon in any application, full (first-order logic or modal logic based) temporal logics are frequently not suitable. This is especially true in real-time expert systems, where a fixed (and usually small) response time must be guaranteed. One such expert system, Fagan's VM, is reviewed, and a delineation is given of how to formally describe and reason with time in medical protocols. It is shown that Petri net theory is a useful tool for checking the correctness of formalised protocols.

  12. BAT3 Analyzer: Real-Time Data Display and Interpretation Software for the Multifunction Bedrock-Aquifer Transportable Testing Tool (BAT3)

    USGS Publications Warehouse

    Winston, Richard B.; Shapiro, Allen M.

    2007-01-01

    The BAT3 Analyzer provides real-time display and interpretation of fluid pressure responses and flow rates measured during geochemical sampling, hydraulic testing, or tracer testing conducted with the Multifunction Bedrock-Aquifer Transportable Testing Tool (BAT3) (Shapiro, 2007). Real-time display of the data collected with the Multifunction BAT3 allows the user to ensure that the downhole apparatus is operating properly, and that test procedures can be modified to correct for unanticipated hydraulic responses during testing. The BAT3 Analyzer can apply calibrations to the pressure transducer and flow meter data to display physically meaningful values. Plots of the time-varying data can be formatted for a specified time interval, and either saved to files, or printed. Libraries of calibrations for the pressure transducers and flow meters can be created, updated and reloaded to facilitate the rapid set up of the software to display data collected during testing with the Multifunction BAT3. The BAT3 Analyzer also has the functionality to estimate calibrations for pressure transducers and flow meters using data collected with the Multifunction BAT3 in conjunction with corroborating check measurements. During testing with the Multifunction BAT3, and also after testing has been completed, hydraulic properties of the test interval can be estimated by comparing fluid pressure responses with model results; a variety of hydrogeologic conceptual models of the formation are available for interpreting fluid-withdrawal, fluid-injection, and slug tests.

  13. Airport take-off noise assessment aimed at identify responsible aircraft classes.

    PubMed

    Sanchez-Perez, Luis A; Sanchez-Fernandez, Luis P; Shaout, Adnan; Suarez-Guerra, Sergio

    2016-01-15

Assessment of aircraft noise is an important task for today's airports in the fight against environmental noise pollution, given recent findings on the negative effects of noise exposure on human health. Noise monitoring and estimation around airports mostly use aircraft noise signals only for computing statistical indicators and depend on additional data sources to determine required inputs such as the aircraft class responsible for the noise pollution. In this sense, attempts have been made to improve noise monitoring and estimation systems by creating methods for obtaining more information from aircraft noise signals, especially real-time aircraft class recognition. Consequently, this paper proposes a multilayer neural-fuzzy model for aircraft class recognition based on take-off noise signal segmentation. It uses a fuzzy inference system to build a final response for each class p based on the aggregation of K parallel neural network outputs Op(k) with respect to Linear Predictive Coding (LPC) features extracted from K adjacent signal segments. Based on extensive experiments over two databases with real-time take-off noise measurements, the proposed model performs better than other methods in the literature, particularly when aircraft classes are strongly correlated with each other. A new, strictly cross-checked database is introduced, including more complex classes and real-time take-off noise measurements from modern aircraft. The new model is at least 5% more accurate on the previous database and successfully classifies 87% of measurements in the new database. Copyright © 2015 Elsevier B.V. All rights reserved.
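
    The LPC features used per segment can be computed with the classical autocorrelation method; a self-contained sketch with a synthetic signal standing in for a take-off recording (order and segmentation are illustrative):

        import numpy as np

        def lpc(frame, order):
            # Autocorrelation method with the Levinson-Durbin recursion.
            r = np.correlate(frame, frame, mode="full")[len(frame) - 1:][:order + 1]
            a = np.zeros(order + 1)
            a[0], err = 1.0, r[0]
            for i in range(1, order + 1):
                acc = r[i] + a[1:i] @ r[i - 1:0:-1]
                k = -acc / err                  # reflection coefficient
                a[1:i] = a[1:i] + k * a[i - 1:0:-1]
                a[i] = k
                err *= 1 - k * k
            return a

        rng = np.random.default_rng(6)
        sig = np.zeros(8000)                    # synthetic AR(2) "engine noise"
        for n in range(2, len(sig)):
            sig[n] = 1.3 * sig[n - 1] - 0.6 * sig[n - 2] + rng.standard_normal()

        K = 4                                   # number of adjacent segments
        for k, seg in enumerate(np.array_split(sig, K)):
            coeffs = lpc(seg * np.hamming(len(seg)), order=10)
            # Each coefficient vector would feed the k-th parallel network O_p(k).
            print(f"segment {k}: a1..a3 = {np.round(coeffs[1:4], 3)}")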

  14. Program Model Checking: A Practitioner's Guide

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.

    2008-01-01

    Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Passarge, M; Fix, M K; Manser, P

Purpose: To create and test an accurate EPID-frame-based VMAT QA metric to detect gross dose errors in real time and to provide information about the source of error. Methods: A Swiss cheese model was created for an EPID-based real-time QA process. The system compares a treatment-plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The metric utilizes a sequence of independent, consecutively executed error detection methods: a masking technique that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment to quantify rotation, scaling and translation; standard gamma evaluation (3%, 3 mm); and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each test were determined. For algorithm testing, twelve different types of errors were selected to modify the original plan. Corresponding predictions for each test case were generated, which included measurement-based noise. Each test case was run multiple times (with different noise per run) to assess the ability to detect introduced errors. Results: Averaged over five test runs, 99.1% of all plan variations that resulted in patient dose errors were detected within 2° and 100% within 4° (∼1% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 91.5% were detected by the system within 2°. Based on the type of method that detected the error, determination of error sources was achieved. Conclusion: An EPID-based during-treatment error detection system for VMAT deliveries was successfully designed and tested. The system utilizes a sequence of methods to identify and prevent gross treatment delivery errors. The system was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of errors in real time and indicate the error source. J. V. Siebers receives funding support from Varian Medical Systems.

  16. Model Checking Degrees of Belief in a System of Agents

    NASA Technical Reports Server (NTRS)

    Raimondi, Franco; Primero, Giuseppe; Rungta, Neha

    2014-01-01

Reasoning about degrees of belief has been investigated in the past by a number of authors and has a number of practical applications in real life. In this paper we present a unified framework to model and verify degrees of belief in a system of agents. In particular, we describe an extension of the temporal-epistemic logic CTLK and we introduce a semantics based on interpreted systems for this extension. In this way, degrees of belief do not need to be provided externally, but can be derived automatically from the possible executions of the system, thereby providing a computationally grounded formalism. We leverage the semantics to (a) construct a model checking algorithm, (b) investigate its complexity, (c) provide a Java implementation of the model checking algorithm, and (d) evaluate our approach using the standard benchmark of the dining cryptographers. Finally, we provide a detailed case study: using our framework and our implementation, we assess and verify the situational awareness of the pilot of Air France 447 flying in off-nominal conditions.

  17. 75 FR 27406 - Airworthiness Directives; Bombardier, Inc. Model BD-100-1A10 (Challenger 300) Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-17

    ... BD- 100 Time Limits/Maintenance Checks. The actions described in this service information are... Challenger 300 BD-100 Time Limits/Maintenance Checks. (1) For the new tasks identified in Bombardier TR 5-2... Requirements,'' in Part 2 of Chapter 5 of Bombardier Challenger 300 BD-100 Time Limits/ Maintenance Checks...

  18. Flight Test Results of a Synthetic Vision Elevation Database Integrity Monitor

    NASA Technical Reports Server (NTRS)

    deHaag, Maarten Uijt; Sayre, Jonathon; Campbell, Jacob; Young, Steve; Gray, Robert

    2001-01-01

This paper discusses the flight test results of a real-time Digital Elevation Model (DEM) integrity monitor for civil aviation applications. Providing pilots with Synthetic Vision (SV) displays containing terrain information has the potential to improve flight safety by improving situational awareness and thereby reducing the likelihood of Controlled Flight Into Terrain (CFIT). Utilization of DEMs, such as the digital terrain elevation data (DTED), requires a DEM integrity check and timely integrity alerts to the pilots when used for flight-critical terrain displays; otherwise the DEM may provide hazardously misleading terrain information. The discussed integrity monitor checks the consistency between a terrain elevation profile synthesized from sensor information and the profile given in the DEM. The synthesized profile is derived from DGPS and radar altimeter measurements. DEMs of various spatial resolutions are used to illustrate the dependency of the integrity monitor's performance on the DEM's spatial resolution. The paper gives a description of the proposed integrity algorithms, the flight test setup, and the results of a flight test performed at the Ohio University airport and in the vicinity of Asheville, NC.
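
    The consistency test at the core of such a monitor can be sketched directly: synthesize a terrain profile as DGPS altitude minus radar-altimeter height and compare it against the DEM along the flight path (all values synthetic; the alert threshold is illustrative):

        import numpy as np

        rng = np.random.default_rng(9)
        n = 300                                        # samples along the flight path
        true_terrain = 400 + 50 * np.sin(np.linspace(0, 6, n))

        gps_alt = true_terrain + 800 + rng.normal(0, 2, n)      # DGPS altitude (m)
        radalt = gps_alt - true_terrain + rng.normal(0, 3, n)   # height above terrain
        dem = true_terrain + rng.normal(0, 5, n)
        dem[200:220] += 60                             # inject a corrupted DEM tile

        synthesized = gps_alt - radalt                 # sensor-based terrain profile
        disagreement = synthesized - dem

        # Windowed consistency statistic over the profile.
        window = 20
        for i in range(0, n - window, window):
            bias = disagreement[i:i + window].mean()
            if abs(bias) > 20:                         # illustrative alert threshold
                print(f"integrity alert near sample {i}: DEM off by {bias:+.0f} m")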

  19. 76 FR 53348 - Airworthiness Directives; BAE SYSTEMS (Operations) Limited Model BAe 146 Airplanes and Model Avro...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... Maintenance Manual (AMM) includes chapters 05-10 ``Time Limits'', 05-15 ``Critical Design Configuration... 05, ``Time Limits/Maintenance Checks,'' of BAe 146 Series/AVRO 146-RJ Series Aircraft Maintenance... Chapter 05, ``Time Limits/ Maintenance Checks,'' of the BAE SYSTEMS (Operations) Limited BAe 146 Series...

  20. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  1. Evaluation of a multiplex real-time PCR for detection of four bacterial agents commonly associated with bovine respiratory disease in bronchoalveolar lavage fluid.

    PubMed

    Wisselink, Henk J; Cornelissen, Jan B W J; van der Wal, Fimme J; Kooi, Engbert A; Koene, Miriam G; Bossers, Alex; Smid, Bregtje; de Bree, Freddy M; Antonis, Adriaan F G

    2017-07-13

Pasteurella multocida, Mannheimia haemolytica, Histophilus somni and Trueperella pyogenes are four bacterial agents commonly associated with bovine respiratory disease (BRD). In this study a bacterial multiplex real-time PCR (the RespoCheck PCR) was evaluated for the detection of these four bacterial agents in bronchoalveolar lavage fluid (BALF). The analytical sensitivity of the multiplex real-time PCR assay, determined on purified DNA and on bacterial cells of the four target pathogens, was one to ten fg DNA/assay and 4 × 10⁻¹ to 2 × 10⁰ CFU/assay. The analytical specificity of the test, as evaluated on a collection of 118 bacterial isolates, was 98.3% for M. haemolytica and 100% for the other three target bacteria. A set of 160 BALF samples from calves originating from ten different herds with health problems related to BRD was examined with bacteriological methods and with the RespoCheck PCR. Using bacteriological examination as the gold standard, the diagnostic sensitivities and specificities for the four bacterial agents were between 0.72 and 1.00 and between 0.70 and 0.99, respectively. Kappa values for agreement between the results of bacteriological examination and PCR were low for H. somni (0.17), moderate for P. multocida (0.52) and M. haemolytica (0.57), and good for T. pyogenes (0.79). The low and moderate kappa values seemed to be related to limitations of the bacteriological examination; this was especially the case for H. somni. It was concluded that the RespoCheck PCR assay is a valuable diagnostic tool for the simultaneous detection of the four bacterial agents in BALF of calves.

  2. The KATE shell: An implementation of model-based control, monitor and diagnosis

    NASA Technical Reports Server (NTRS)

    Cornell, Matthew

    1987-01-01

The conventional control and monitor software currently used by the Space Center for Space Shuttle processing has many limitations, such as high maintenance costs, limited diagnostic capabilities and limited simulation support. These limitations have motivated the development of a knowledge-based (or model-based) shell to generically control and monitor electro-mechanical systems. The knowledge base describes the system's structure and function and is used by a software shell to do real-time constraint checking, low-level control of components, diagnosis of detected faults, sensor validation, automatic generation of schematic diagrams and automatic recovery from failures. This approach is more versatile and more powerful than the conventional hard-coded approach and offers many advantages over it, although for systems which require high-speed reaction times or aren't well understood, knowledge-based control and monitor systems may not be appropriate.

  3. Global Connections: Web Conferencing Tools Help Educators Collaborate Anytime, Anywhere

    ERIC Educational Resources Information Center

    Forrester, Dave

    2009-01-01

    Web conferencing tools help educators from around the world collaborate in real time. Teachers, school counselors, and administrators need only to put on their headsets, check the time zone, and log on to meet and learn from educators across the globe. In this article, the author discusses how educators can use Web conferencing at their schools.…

  4. Measures of Reliability in Behavioral Observation: The Advantage of "Real Time" Data Acquisition.

    ERIC Educational Resources Information Center

    Hollenbeck, Albert R.; Slaby, Ronald G.

Two observers who were using an electronic digital data acquisition system were spot-checked for reliability at random times over a four-month period. Between- and within-observer reliability was assessed for frequency, duration, and duration-per-event measures of four infant behaviors. The results confirmed the problem of observer drift--the…

  5. A Model-based Approach to Controlling the ST-5 Constellation Lights-Out Using the GMSEC Message Bus and Simulink

    NASA Technical Reports Server (NTRS)

    Witt, Kenneth J.; Stanley, Jason; Shendock, Robert; Mandl, Daniel

    2005-01-01

Space Technology 5 (ST-5) is a three-satellite constellation, technology validation mission under the New Millennium Program at NASA, to be launched in March 2006. One of the key technologies to be validated is a lights-out, model-based operations approach to be used for one week to control the ST-5 constellation with no manual intervention. The ground architecture features the GSFC Mission Services Evolution Center (GMSEC) middleware, which allows easy plugging in of software components and a standardized messaging protocol over a software bus. A predictive modeling tool built on MATLAB's Simulink software package makes use of the GMSEC standard messaging protocol to interface to the Advanced Mission Planning System (AMPS) Scenario Scheduler, which controls all activities, resource allocation and real-time re-profiling of constellation resources when non-nominal events occur. The key features of this system, which we refer to as the ST-5 Simulink system, are as follows: the original daily plan is checked to make sure that the predicted resources needed are available, by comparing the plan against the model; as the plan is run in real time, the system re-profiles future activities if planned activities do not occur in the predicted timeframe or fashion; alert messages are sent out on the GMSEC bus if future predicted problems are detected, allowing the Scenario Scheduler to correct the situation before the problem happens; and the predictive model is evolved automatically over time via telemetry updates, reducing the cost of implementing and maintaining the models by an order of magnitude compared with previous efforts at GSFC, such as the model-based system built for MAP in the mid-1990s. This paper describes the key features, lessons learned and implications for future missions once this system is successfully validated on-orbit in 2006.

  6. After Delivery

    MedlinePlus

    ... snack or mealtime. Low blood glucose is a real danger. It's important for your baby's safety to avoid blood glucose reactions that could confuse you. For all of the above reasons, it is important to check your blood glucose often during this time. And your records of your blood glucose levels ...

  7. The Chandra Monitoring System

    NASA Astrophysics Data System (ADS)

    Wolk, S. J.; Petreshock, J. G.; Allen, P.; Bartholowmew, R. T.; Isobe, T.; Cresitello-Dittmar, M.; Dewey, D.

The NASA Great Observatory Chandra was launched July 23, 1999 aboard the space shuttle Columbia. The Chandra Science Center (CXC) runs a monitoring and trends analysis program to maximize the science return from this mission. At the time of launch, the monitoring portion of this system was in place. The system is a collection of multiple threads and programming methodologies acting cohesively. Real-time data are passed to the CXC. Our real-time tool, ACORN (A Comprehensive Object-ORiented Necessity), performs limit checking of performance-related hardware. Chandra is in ground contact less than 3 hours a day, so the bulk of the monitoring must take place on data dumped by the spacecraft. To do this, we have written several tools which run off of the CXC data system pipelines. MTA_MONITOR_STATIC limit-checks FITS files containing hardware data. MTA_EVENT_MON and MTA_GRAT_MON create quick-look data for the focal plane instruments and the transmission gratings. When instruments violate their operational limits, the responsible scientists are notified by email and problem tracking is initiated. Output from all these codes is distributed to CXC scientists via an HTML interface.

  8. A generalized statistical model for the size distribution of wealth

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and conclude an excellent agreement with the data that is superior to any other model already known in the literature.
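
    For reference, the κ-generalized framework replaces the ordinary exponential with the κ-exponential. In LaTeX notation, the key formulas as usually written in this literature (quoted from memory; consult the cited papers for the exact wealth-model form) are:

        % kappa-exponential (Kaniadakis statistics)
        \exp_\kappa(x) = \left( \sqrt{1 + \kappa^2 x^2} + \kappa x \right)^{1/\kappa},
        \qquad 0 < \kappa < 1,
        % complementary CDF of the kappa-generalized model for income/wealth x > 0:
        \bar{F}(x) = \exp_\kappa\!\left( -\beta x^{\alpha} \right)

    This form interpolates between a Weibull-like bulk at small x and a Pareto power-law tail with exponent α/κ at large x, which is why it can fit both the body and the tail of empirical distributions.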

  9. Redundant and fault-tolerant algorithms for real-time measurement and control systems for weapon equipment.

    PubMed

    Li, Dan; Hu, Xiaoguang

    2017-03-01

Because of the high availability requirements of weapon equipment, an in-depth study has been conducted on the real-time fault tolerance of the widely applied Compact PCI (CPCI) bus measurement and control system. A redundancy design method that uses heartbeat detection to connect the primary and alternate devices has been developed. To address the low successful execution rate and the relatively large waste of time slices in the primary version of the task software, an improved algorithm for real-time fault-tolerant scheduling is proposed based on the Basic Checking available time Elimination idle time (BCE) algorithm, applying a single-neuron self-adaptive proportion sum differential (PSD) controller. The experimental validation results indicate that this system has excellent redundancy and fault tolerance, and the newly developed method can effectively improve system availability. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  10. Recent sequence variation in probe binding site affected detection of respiratory syncytial virus group B by real-time RT-PCR.

    PubMed

    Kamau, Everlyn; Agoti, Charles N; Lewa, Clement S; Oketch, John; Owor, Betty E; Otieno, Grieven P; Bett, Anne; Cane, Patricia A; Nokes, D James

    2017-03-01

    Direct immuno-fluorescence testing (IFAT) and multiplex real-time RT-PCR have been central to RSV diagnosis in Kilifi, Kenya. Recently, these two methods showed discrepancies, with an increasing number of PCR-undetectable RSV-B viruses. The objective was to establish whether mismatches in the primer and probe binding sites could have reduced real-time RT-PCR sensitivity. Nucleoprotein (N) and glycoprotein (G) genes were sequenced for real-time RT-PCR positive and negative samples. Primer and probe binding regions in the N gene were checked for mismatches, and phylogenetic analyses were done to determine the molecular epidemiology of these viruses. New primers and a probe were designed and tested on the previously real-time RT-PCR negative samples. N gene sequences revealed 3 different mismatches in the probe target site of PCR-negative, IFAT-positive viruses. The primer target sites had no mismatches. Phylogenetic analysis of N and G genes showed that real-time RT-PCR positive and negative samples fell into distinct clades. The newly designed primer-probe pair improved detection and recovered previously PCR-undetectable viruses. An emerging RSV-B variant is undetectable by a quite widely used real-time RT-PCR assay due to polymorphisms that influence probe hybridization, affecting PCR accuracy. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
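
    Checking a probe's binding site for mismatches amounts to sliding the probe along the target sequence and counting disagreements at the best-fitting offset; a minimal sketch in which the probe and target strings are placeholders, not the assay's actual oligonucleotides:

    ```python
    # Count mismatches between a probe and its binding site in a target
    # sequence; the sequences below are placeholders, not the assay's oligos.
    def best_binding_site(probe, target):
        """Slide the probe along the target; return (offset, mismatches) of best fit."""
        best = (None, len(probe) + 1)
        for i in range(len(target) - len(probe) + 1):
            window = target[i:i + len(probe)]
            mism = sum(1 for a, b in zip(probe, window) if a != b)
            if mism < best[1]:
                best = (i, mism)
        return best

    probe = "ACCTGAGGAGCATGTG"           # placeholder probe
    target = "TTGACCTGAGCAGCATGTGAAATT"  # placeholder N-gene fragment
    offset, mismatches = best_binding_site(probe, target)
    print(f"best site at offset {offset}, {mismatches} mismatch(es)")
    ```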

  11. 77 FR 20520 - Airworthiness Directives; Bombardier, Inc. Model BD-100-1A10 (Challenger 300) Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-05

    ... Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual. For this task, the initial compliance..., of Part 2, of the Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual, the general.../Maintenance Checks Manual, provided that the relevant information in the general revision is identical to that...

  12. GIS management system of power plant staff based on wireless fidelity indoor location technology

    NASA Astrophysics Data System (ADS)

    Zhang, Ting

    2017-05-01

    The labor conditions and environment of electric power production are quite complicated, and real-time supervision of employees' working conditions and safety is very difficult. Using the existing base stations in the power plant, a wireless-fidelity network is established to provide wireless coverage of the work site, so that ordinary mobile phones can be used for communication and positioning. The main content of this project is an indoor wireless-fidelity positioning system, designed for the special environment of the power plant and suitable for ordinary Android mobile phones, that positions each employee in real time and records their movement trajectories. The system supports real-time attendance and on-site presence checks and provides a safeguard for employee safety.

  13. An Envelope Based Feedback Control System for Earthquake Early Warning: Reality Check Algorithm

    NASA Astrophysics Data System (ADS)

    Heaton, T. H.; Karakus, G.; Beck, J. L.

    2016-12-01

    Earthquake early warning systems are, in general, designed as open-loop control systems, in the sense that the output, i.e., the warning messages, depends only on the input, i.e., ground motions recorded up to the moment the message is issued in real time. We propose an algorithm, called the Reality Check Algorithm (RCA), which assesses the accuracy of issued warning messages and feeds the outcome of the assessment back into the system; the system then modifies its messages if necessary. That is, we propose to convert earthquake early warning systems into feedback control systems by integrating them with RCA. RCA works by continuously monitoring and comparing the observed ground-motion envelopes to the predicted envelopes of the Virtual Seismologist (Cua 2005). The accuracy of the system's magnitude and location (both spatial and temporal) estimates is assessed separately by probabilistic classification models, which are trained by a Sparse Bayesian Learning technique with an Automatic Relevance Determination prior.
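
    The envelope comparison at the core of RCA can be sketched as follows: compute a smoothed amplitude envelope of the observed motion and flag samples that exceed the predicted envelope by more than a tolerance. The Hilbert-transform envelope and the tolerance are illustrative choices, not the actual RCA or Virtual Seismologist formulation:

    ```python
    # Illustrative envelope check: Hilbert-transform envelope of the observed
    # motion versus a predicted envelope; smoothing and tolerance are assumed.
    import numpy as np
    from scipy.signal import hilbert

    def observed_envelope(waveform, win=50):
        env = np.abs(hilbert(waveform))               # instantaneous amplitude
        kernel = np.ones(win) / win
        return np.convolve(env, kernel, mode="same")  # smooth the envelope

    def envelope_mismatch(observed, predicted, tol=2.0):
        """Fraction of samples where observed exceeds tol x predicted envelope."""
        return float(np.mean(observed > tol * predicted))

    t = np.linspace(0, 60, 6000)
    waveform = np.sin(8 * np.pi * t) * np.exp(-0.05 * t)  # synthetic ground motion
    predicted = 1.2 * np.exp(-0.05 * t)                   # synthetic prediction
    print(envelope_mismatch(observed_envelope(waveform), predicted))
    ```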

  14. Adaptive Resource Utilization Prediction System for Infrastructure as a Service Cloud.

    PubMed

    Zia Ullah, Qazi; Hassan, Shahzad; Khan, Gul Muhammad

    2017-01-01

    Infrastructure as a Service (IaaS) cloud provides resources as a service from a pool of compute, network, and storage resources. Cloud providers can manage their resource usage by knowing future usage demand from the current and past usage patterns of resources. Resource usage prediction is of great importance for dynamic scaling of cloud resources to achieve efficiency in terms of cost and energy consumption while keeping quality of service. The purpose of this paper is to present a real-time resource usage prediction system. The system takes real-time utilization of resources and feeds utilization values into several buffers based on the type of resources and time span size. The buffers are read by an R-language-based statistical system, and each buffer's data are checked to determine whether they follow a Gaussian distribution. If the data are Gaussian, an Autoregressive Integrated Moving Average (ARIMA) model is applied; otherwise an Autoregressive Neural Network (AR-NN) is applied. In the ARIMA process, a model is selected based on the minimum Akaike Information Criterion (AIC) value. Similarly, in the AR-NN process, the network with the lowest Network Information Criterion (NIC) value is selected. We have evaluated our system with real traces of CPU utilization of an IaaS cloud of one hundred and twenty servers.
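
    A sketch of the dispatch logic just described, with a Shapiro-Wilk test standing in for the Gaussian check and an MLP regressor standing in for the AR-NN branch (both stand-ins are assumptions; the paper's system is R-based):

    ```python
    # Sketch of the described dispatch: if the buffer looks Gaussian, fit an
    # ARIMA whose order minimizes AIC; otherwise fit a small neural
    # autoregression. The MLP stand-in for the paper's AR-NN is an assumption.
    import numpy as np
    from scipy.stats import shapiro
    from sklearn.neural_network import MLPRegressor
    from statsmodels.tsa.arima.model import ARIMA

    def best_arima_forecast(series, max_p=3, max_q=3):
        best_fit, best_aic = None, np.inf
        for p in range(max_p + 1):
            for q in range(max_q + 1):
                fit = ARIMA(series, order=(p, 1, q)).fit()
                if fit.aic < best_aic:
                    best_fit, best_aic = fit, fit.aic
        return best_fit.forecast(1)[0]

    def ar_nn_forecast(series, lags=4):
        X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
        y = series[lags:]
        nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000).fit(X, y)
        return nn.predict(series[-lags:].reshape(1, -1))[0]

    def predict_next(buffer):
        series = np.asarray(buffer, dtype=float)
        _, p_value = shapiro(series)
        if p_value > 0.05:               # no evidence against normality -> ARIMA
            return best_arima_forecast(series)
        return ar_nn_forecast(series)    # non-Gaussian -> neural autoregression
    ```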

  15. Adaptive Resource Utilization Prediction System for Infrastructure as a Service Cloud

    PubMed Central

    Hassan, Shahzad; Khan, Gul Muhammad

    2017-01-01

    Infrastructure as a Service (IaaS) cloud provides resources as a service from a pool of compute, network, and storage resources. Cloud providers can manage their resource usage by knowing future usage demand from the current and past usage patterns of resources. Resource usage prediction is of great importance for dynamic scaling of cloud resources to achieve efficiency in terms of cost and energy consumption while keeping quality of service. The purpose of this paper is to present a real-time resource usage prediction system. The system takes real-time utilization of resources and feeds utilization values into several buffers based on the type of resources and time span size. The buffers are read by an R-language-based statistical system, and each buffer's data are checked to determine whether they follow a Gaussian distribution. If the data are Gaussian, an Autoregressive Integrated Moving Average (ARIMA) model is applied; otherwise an Autoregressive Neural Network (AR-NN) is applied. In the ARIMA process, a model is selected based on the minimum Akaike Information Criterion (AIC) value. Similarly, in the AR-NN process, the network with the lowest Network Information Criterion (NIC) value is selected. We have evaluated our system with real traces of CPU utilization of an IaaS cloud of one hundred and twenty servers. PMID:28811819

  16. Solutions to time variant problems of real-time expert systems

    NASA Technical Reports Server (NTRS)

    Yeh, Show-Way; Wu, Chuan-Lin; Hung, Chaw-Kwei

    1988-01-01

    Real-time expert systems for monitoring and control are driven by input data which changes with time. One of the subtle problems in this field is the propagation of time-variant problems from rule to rule. This propagation problem is even more complicated in a multiprogramming environment, where the expert system may issue test commands to the system to get data and may access time-consuming devices to retrieve data for concurrent reasoning. Two approaches are used to handle the flood of input data. Snapshots can be taken to freeze the system from time to time; the expert system treats the system as a stationary one and traces changes by comparing consecutive snapshots. In the other approach, when an input is available, the rules associated with it are evaluated. For both approaches, if the premise condition of a fired rule changes to false, the downstream rules should be deactivated. If the status change is due to the disappearance of a transient problem, actions taken by fired downstream rules which are no longer true may need to be undone. If a downstream rule is being evaluated, it should not be fired. Three mechanisms for solving this problem are discussed: tracing, backward checking, and censor setting. In the forward tracing mechanism, when the premise conditions of a fired rule become false, the premise conditions of downstream rules which have been fired or are being evaluated due to the firing of that rule are reevaluated. A tree with its root at the rule being deactivated is traversed. In the backward checking mechanism, when a rule is about to be fired, the expert system checks back on the premise conditions of the upstream rules that led to evaluation of the rule to see whether it should be fired. The root of the tree being traversed is the rule being fired. In the censor setting mechanism, when a rule is to be evaluated, a censor is constructed based on the premise conditions of the upstream rules, and the censor is evaluated just before the rule is fired. Unlike the backward checking mechanism, this one does not search the upstream rules. This paper explores the details of implementation of the three mechanisms.
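
    A minimal sketch of the forward-tracing mechanism under an assumed rule representation: when a fired rule's premise turns false, the tree of rules fired downstream of it is traversed and those rules are retracted, with undo actions where defined:

    ```python
    # Forward-tracing sketch: deactivate the tree of rules fired downstream of
    # a rule whose premise turned false; the rule structure is an assumption.
    class Rule:
        def __init__(self, name, undo=None):
            self.name, self.undo = name, undo
            self.fired = False
            self.downstream = []   # rules fired because this one fired

    def retract(rule):
        """Depth-first traversal rooted at the deactivated rule."""
        for child in rule.downstream:
            if child.fired:
                retract(child)
        if rule.fired:
            rule.fired = False
            if rule.undo:          # undo actions that are no longer valid
                rule.undo()
            print(f"retracted {rule.name}")

    a, b, c = Rule("A"), Rule("B"), Rule("C", undo=lambda: print("undo C"))
    a.downstream, b.downstream = [b], [c]
    a.fired = b.fired = c.fired = True
    retract(a)   # premise of A became false: retract A, then B, then C
    ```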

  17. Finding Feasible Abstract Counter-Examples

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking, most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted, the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking, it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.

  18. Real-Time, General-Purpose, High-Speed Signal Processing Systems for Underwater Research. Proceedings of a Working Level Conference held at Supreme Allied Commander, Atlantic Anti-Submarine Warfare Research Center (SACLANTCEN) on 18-21 September 1979. Part 2. Sessions IV to VI.

    DTIC Science & Technology

    1979-12-01

    [Scanned-record fragments; the OCR text is garbled. Recoverable content: MASCOT provides (1) system-build software with compile-time checks, (2) a run-time supervisor kernel, and (3) monitoring and testing support for system operation. Table-of-contents fragments include "Signal processing language and operating system" by S. Weinstein and "A modular signal ..." (SACLANT ASW Research Centre, La Spezia, Italy).]

  19. Improved near real-time data management procedures for the Mediterranean ocean Forecasting System-Voluntary Observing Ship program

    NASA Astrophysics Data System (ADS)

    Manzella, G. M. R.; Scoccimarro, E.; Pinardi, N.; Tonani, M.

    2003-01-01

    A "ship of opportunity" program was launched as part of the Mediterranean Forecasting System Pilot Project (MFSPP). During the operational period (September 1999 to May 2000), six tracks covered the Mediterranean from the northern to the southern boundaries approximately every 15 days, while a long east-west track from Haifa to Gibraltar was covered approximately every month. XBT data were collected, sub-sampled at 15 inflection points, and transmitted through a satellite communication system to a regional data centre. It was found that this data transmission system has limitations in terms of the quality of the temperature profiles and the quantity of data successfully transmitted. At the end of the MFSPP operational period, a new strategy for data transmission and management was developed. First, VOS-XBT data are transmitted at full resolution. Second, a new data management system, called Near Real Time Quality Control for XBT (NRT.QC.XBT), was defined to produce a parallel stream of high-quality XBT data for further scientific analysis. The procedure includes: (1) Position control; (2) Elimination of spikes; (3) Re-sampling at a 1 metre vertical interval; (4) Filtering; (5) General malfunctioning check; (6) Comparison with climatology (and distance from it in terms of standard deviations); (7) Visual check; and (8) Data consistency check. The first six steps of the new procedure are completely automated; they are performed using a new climatology developed as part of the project. The visual checks are finally done with freely available software that allows final NRT data assessment.
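
    Steps (2), (3), and (6) of NRT.QC.XBT lend themselves to a compact sketch: despike the profile, resample it to a 1 metre vertical grid, and flag levels far from climatology in units of standard deviations; the thresholds and climatology arrays are placeholders:

    ```python
    # Sketch of XBT profile QC steps: spike removal, 1 m resampling, and
    # climatology comparison; thresholds and climatology are placeholders.
    import numpy as np

    def despike(depth, temp, max_jump=1.5):
        """Drop samples differing from the previous kept sample by > max_jump degC."""
        keep_d, keep_t = [depth[0]], [temp[0]]
        for d, t in zip(depth[1:], temp[1:]):
            if abs(t - keep_t[-1]) <= max_jump:
                keep_d.append(d)
                keep_t.append(t)
        return np.array(keep_d), np.array(keep_t)

    def resample_1m(depth, temp):
        grid = np.arange(np.ceil(depth.min()), np.floor(depth.max()) + 1.0)
        return grid, np.interp(grid, depth, temp)

    def climatology_flags(temp, clim_mean, clim_std, n_sigma=3.0):
        """Flag levels farther than n_sigma standard deviations from climatology."""
        return np.abs(temp - clim_mean) > n_sigma * clim_std
    ```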

  20. Non-invasive quality evaluation of confluent cells by image-based orientation heterogeneity analysis.

    PubMed

    Sasaki, Kei; Sasaki, Hiroto; Takahashi, Atsuki; Kang, Siu; Yuasa, Tetsuya; Kato, Ryuji

    2016-02-01

    In recent years, cell and tissue therapies in regenerative medicine have advanced rapidly towards commercialization. However, conventional invasive cell quality assessment is incompatible with direct evaluation of the cells produced for such therapies, especially in the case of regenerative medicine products. Our group has demonstrated the potential of quantitative assessment of cell quality, using information obtained from cell images, for non-invasive real-time evaluation of regenerative medicine products. However, images of cells in the confluent state are often difficult to evaluate, because accurate recognition of individual cells is technically difficult and the morphological features of confluent cells are non-characteristic. To overcome these challenges, we developed a new image-processing algorithm, heterogeneity of orientation (H-Orient) processing, to describe the heterogeneous density of cells in the confluent state. In this algorithm, we introduced a Hessian calculation that converts pixel intensity data to orientation data and a statistical profiling calculation that evaluates the heterogeneity of orientations within an image, generating novel parameters that yield a quantitative profile of the image. Using such parameters, we tested the algorithm's performance in discriminating different qualities of cellular images with three types of clinically important cell quality check (QC) models: a remaining lifespan check (QC1), a manipulation error check (QC2), and a differentiation potential check (QC3). Our results show that our orientation analysis algorithm could predict with high accuracy the outcomes of all types of cellular quality checks (>84% average accuracy with cross-validation). Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
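
    The core of such an orientation-heterogeneity measure can be sketched as: estimate a per-pixel orientation from second derivatives, then quantify the spread of the orientation histogram. The Gaussian scale and the entropy statistic below are illustrative choices, not necessarily the published H-Orient parameters:

    ```python
    # Sketch of orientation-heterogeneity analysis: per-pixel orientation from
    # Hessian components, then entropy of the orientation histogram; the scale
    # and the entropy statistic are illustrative, not the published choices.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def orientation_map(image, sigma=2.0):
        ixx = gaussian_filter(image, sigma, order=(0, 2))  # 2nd deriv along axis 1
        iyy = gaussian_filter(image, sigma, order=(2, 0))  # 2nd deriv along axis 0
        ixy = gaussian_filter(image, sigma, order=(1, 1))  # mixed derivative
        return 0.5 * np.arctan2(2.0 * ixy, ixx - iyy)      # principal direction

    def orientation_entropy(theta, bins=36):
        hist, _ = np.histogram(theta, bins=bins, range=(-np.pi / 2, np.pi / 2))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))  # high entropy = heterogeneous
    ```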

  1. Wearable physiological sensors and real-time algorithms for detection of acute mountain sickness.

    PubMed

    Muza, Stephen R

    2018-03-01

    This is a minireview of potential wearable physiological sensors and algorithms (processes and equations) for detection of acute mountain sickness (AMS). Given the emerging status of this effort, the focus of the review is on the current clinical assessment of AMS, known risk factors (environmental, demographic, and physiological), and current understanding of AMS pathophysiology. Studies that have examined a range of physiological variables to develop AMS prediction and/or detection algorithms are reviewed to provide insight and potential technological roadmaps for future development of real-time physiological sensors and algorithms to detect AMS. Given the lack of signs and the nonspecific symptoms associated with AMS, development of wearable physiological sensors and embedded algorithms to predict in the near term or detect established AMS will be challenging. Prior work using SpO2, HR, or HRV has not provided the sensitivity and specificity for useful application to predict or detect AMS. Rather than using spot checks as most prior studies have, wearable systems that continuously measure SpO2 and HR are commercially available. Employing other statistical modeling approaches, such as general linear and logistic mixed models or time series analysis, to these continuously measured variables is the most promising approach for developing algorithms that are sensitive and specific for physiological prediction or detection of AMS.

  2. Spectral imaging based in vivo model system for characterization of tumor microvessel response to vascular targeting agents

    NASA Astrophysics Data System (ADS)

    Wankhede, Mamta

    Functional vasculature is vital for tumor growth, proliferation, and metastasis. Many tumor-specific vascular targeting agents (VTAs) aim to destroy this essential tumor vasculature to induce indirect tumor cell death via oxygen and nutrient deprivation. The tumor angiogenesis-inhibiting anti-angiogenics (AIs) and the established-tumor-vessel-targeting vascular disrupting agents (VDAs) are the two major players in the vascular targeting field. Combinations of VTAs with conventional therapies, or with each other, have been shown to have additive or supra-additive effects on tumor control and treatment. Post-VTA-treatment pathophysiological changes, in terms of vessel structure and function, are important parameters for characterizing treatment efficacy. Despite the abundance of information on these parameters acquired using various techniques, there remains a need for quantitative, real-time, and direct observation of these phenomena in live animals. Through this research we aspired to develop a spectral-imaging-based mouse tumor system for real-time in vivo microvessel structure and function measurements for VTA characterization. A model tumor system for window chamber studies was identified, and the combinatorial effects of a VDA and an AI were then characterized in this model tumor system. (Full text of this dissertation may be available via the University of Florida Libraries web site. Please check http://www.uflib.ufl.edu/etd.html)

  3. FAME, a microprocessor based front-end analysis and modeling environment

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. D.; Kutin, E. B.

    1980-01-01

    Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.

  4. Identifying fMRI Model Violations with Lagrange Multiplier Tests

    PubMed Central

    Cassidy, Ben; Long, Christopher J; Rae, Caroline; Solo, Victor

    2013-01-01

    The standard modeling framework in Functional Magnetic Resonance Imaging (fMRI) is predicated on assumptions of linearity, time invariance and stationarity. These assumptions are rarely checked because doing so requires specialised software, although failure to do so can lead to bias and mistaken inference. Identifying model violations is an essential but largely neglected step in standard fMRI data analysis. Using Lagrange Multiplier testing methods we have developed simple and efficient procedures for detecting model violations such as non-linearity, non-stationarity and validity of the common Double Gamma specification for hemodynamic response. These procedures are computationally cheap and can easily be added to a conventional analysis. The test statistic is calculated at each voxel and displayed as a spatial anomaly map which shows regions where a model is violated. The methodology is illustrated with a large number of real data examples. PMID:22542665

  5. Rewriting Modulo SMT and Open System Analysis

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar

    2014-01-01

    This paper proposes rewriting modulo SMT, a new technique that combines the power of SMT solving, rewriting modulo theories, and model checking. Rewriting modulo SMT is ideally suited to model and analyze infinite-state open systems, i.e., systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism, which is proper to the system, and external non-determinism, which is due to the environment. In a reflective formalism, such as rewriting logic, rewriting modulo SMT can be reduced to standard rewriting. Hence, rewriting modulo SMT naturally extends rewriting-based reachability analysis techniques, which are available for closed systems, to open systems. The proposed technique is illustrated with the formal analysis of: (i) a real-time system that is beyond the scope of timed-automata methods and (ii) automatic detection of reachability violations in a synchronous language developed to support autonomous spacecraft operations.

  6. Symbolic LTL Compilation for Model Checking: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.

  7. Airport trial of a system for the mass screening of baggage or cargo

    NASA Astrophysics Data System (ADS)

    Bennett, Gordon; Sleeman, Richard; Davidson, William R.; Stott, William R.

    1994-10-01

    An eight month trial of a system capable of checking every bag from a particular flight for the presence of narcotics has been carried out at a major UK airport. The British Aerospace CONDOR tandem mass-spectrometer system, fitted with a real-time sampler, was used to check incoming baggage for a range of illegal drugs. Because of the rapid sampling and analysis capability of this instrument, it was possible to check every bag from a flight without delay to the passengers. During the trial a very large number of bags, from flights from various parts of the world, were sampled. Several detections were made, resulting in seizures and the apprehension of a number of smugglers.

  8. A Framework of Simple Event Detection in Surveillance Video

    NASA Astrophysics Data System (ADS)

    Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao

    Video surveillance plays an increasingly important role in public life. Real-time alerting of threatening events and searching for content of interest in large volumes of stored video footage require a human operator to pay full attention to monitors for long periods, and this labor-intensive mode limits the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner-key-point matching approach compensates for background motion in real time; frame differencing detects the foreground; HOG-based classifiers classify foreground objects into people and cars; and mean-shift tracks the recognized objects. Events are detected according to predefined rules. The maturity of these algorithms guarantees the robustness of the framework, and the improved matching approach and easily checked rules enable it to work in real time. Future work is also discussed.
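
    A compressed sketch of the pipeline with OpenCV, using frame differencing for the foreground and the default HOG people detector; the thresholds are assumptions, and the car classifier and mean-shift tracking stages are omitted:

    ```python
    # Sketch of the detection pipeline: frame differencing for foreground
    # motion plus OpenCV's default HOG people detector; thresholds are
    # assumptions, and the car classifier and tracking stages are omitted.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture("surveillance.avi")   # hypothetical input file
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)            # foreground by differencing
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        if cv2.countNonZero(mask) > 500:               # enough motion to classify
            people, _ = hog.detectMultiScale(frame)    # HOG-based person detection
            for (x, y, w, h) in people:
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        prev_gray = gray
    ```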

  9. Environmental Reality Check.

    ERIC Educational Resources Information Center

    Manicone, Santo

    2001-01-01

    Discusses the importance of educational facilities conducting "reality check" self-audits to uncover the real truth behind underlying environmental problems. An environmental compliance multimedia checklist is included. (GR)

  10. Toward improved design of check dam systems: A case study in the Loess Plateau, China

    NASA Astrophysics Data System (ADS)

    Pal, Debasish; Galelli, Stefano; Tang, Honglei; Ran, Qihua

    2018-04-01

    Check dams are one of the most common strategies for controlling sediment transport in erosion-prone areas, along with soil and water conservation measures. However, existing mathematical models that simulate sediment production and delivery are often unable to simulate how the storage capacity of check dams varies with time. To explicitly account for this process, and to support the design of check dam systems, we developed a modelling framework consisting of two components, namely (1) the spatially distributed Soil Erosion and Sediment Delivery Model (WaTEM/SEDEM), and (2) a network-based model of check dam storage dynamics. The two models are run sequentially, with the second model receiving the initial sediment input to check dams from WaTEM/SEDEM. The framework is first applied to the Shejiagou catchment, a 4.26 km2 area located in the Loess Plateau, China, where we study the effect of the existing check dam system on sediment dynamics. Results show that the deployment of check dams significantly altered the sediment delivery ratio of the catchment. Furthermore, the network-based model reveals a large variability in the life expectancy of check dams and abrupt changes in their filling rates. The application of the framework to six alternative check dam deployment scenarios is then used to illustrate its usefulness for planning purposes, and to derive some insights on the effect of key decision variables, such as the number, size, and site location of check dams. Simulation results suggest that better performance, in terms of life expectancy and sediment delivery ratio, could have been achieved with an alternative deployment strategy.

  11. Posterior Predictive Checks for Conditional Independence between Response Time and Accuracy

    ERIC Educational Resources Information Center

    Bolsinova, Maria; Tijmstra, Jesper

    2016-01-01

    Conditional independence (CI) between response time and response accuracy is a fundamental assumption of many joint models for time and accuracy used in educational measurement. In this study, posterior predictive checks (PPCs) are proposed for testing this assumption. These PPCs are based on three discrepancy measures reflecting different…

  12. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  13. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework

    NASA Astrophysics Data System (ADS)

    Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain

    2014-05-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows the quality control to benefit from high-end checks based on the national and worldwide seismicity. Here we present first the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The quality control applies a variety of subprocesses to check the consistency of the whole system and processing chain from the stations to the data center, verifying that instruments and data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbances. The deployed quality control consists of a pipeline that starts with low-level procedures: check the real-time MiniSEED data files (file naming convention, data integrity), check for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and compute waveform statistics (data availability, gaps/overlaps, mean, RMS, time quality, spikes). It is followed by high-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking, and station magnitude discrepancies. The results of quality control are visualized through a web interface, which gathers data from different information systems to provide a global view of recent events that could impact the data (such as on-site interventions or seismic events). This work is still an ongoing project. We intend to add more sophisticated procedures to enhance our data quality control; among them, we will deploy a seismic moment tensor inversion tool for amplitude, time, and polarity control and a noise correlation procedure for time-drift detection.
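
    The low-level stage of such a pipeline can be sketched with ObsPy: read a MiniSEED file, check gaps and overlaps, and compute simple waveform statistics. The file name and the spike criterion are placeholders, not the operational RESIF configuration:

    ```python
    # Sketch of low-level waveform QC with ObsPy: gap/overlap check and simple
    # per-trace statistics; the spike criterion is a placeholder, not RESIF's.
    import numpy as np
    from obspy import read

    stream = read("station_day.mseed")        # hypothetical MiniSEED day file
    for gap in stream.get_gaps():             # [net, sta, loc, chan, t1, t2, dt, n]
        print("gap/overlap:", gap)

    for trace in stream:
        data = trace.data.astype(float)
        mean, rms = data.mean(), np.sqrt(np.mean(data ** 2))
        spikes = int(np.sum(np.abs(data - mean) > 10 * data.std()))  # crude spike count
        print(trace.id, f"mean={mean:.1f} rms={rms:.1f} spikes={spikes}")
    ```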

  14. Formal semantic specifications as implementation blueprints for real-time programming languages

    NASA Technical Reports Server (NTRS)

    Feyock, S.

    1981-01-01

    Formal definitions of language and system semantics provide highly desirable checks on the correctness of implementations of programming languages and their runtime support systems. If these definitions can give concrete guidance to the implementor, major increases in implementation accuracy and decreases in implementation effort can be achieved. It is shown that, of the wide variety of available methods, the Hgraph (hypergraph) definitional technique (Pratt, 1975) is best suited to serve as such an implementation blueprint. A discussion and example of the Hgraph technique is presented, as well as an overview of the growing body of implementation experience of real-time languages based on Hgraph semantic definitions.

  15. Do alcohol excise taxes affect traffic accidents? Evidence from Estonia.

    PubMed

    Saar, Indrek

    2015-01-01

    This article examines the association between alcohol excise tax rates and alcohol-related traffic accidents in Estonia. Monthly time series of traffic accidents involving drunken motor vehicle drivers from 1998 through 2013 were regressed on real average alcohol excise tax rates while controlling for changes in economic conditions and the traffic environment. Specifically, regression models with autoregressive integrated moving average (ARIMA) errors were estimated in order to deal with serial correlation in residuals. Counterfactual models were also estimated in order to check the robustness of the results, using the level of non-alcohol-related traffic accidents as a dependent variable. A statistically significant (P <.01) strong negative relationship between the real average alcohol excise tax rate and alcohol-related traffic accidents was disclosed under alternative model specifications. For instance, the regression model with ARIMA (0, 1, 1)(0, 1, 1) errors revealed that a 1-unit increase in the tax rate is associated with a 1.6% decrease in the level of accidents per 100,000 population involving drunk motor vehicle drivers. No similar association was found in the cases of counterfactual models for non-alcohol-related traffic accidents. This article indicates that the level of alcohol-related traffic accidents in Estonia has been affected by changes in real average alcohol excise taxes during the period 1998-2013. Therefore, in addition to other measures, the use of alcohol taxation is warranted as a policy instrument in tackling alcohol-related traffic accidents.
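
    The specification described, a regression of the (log) accident series on the tax rate with ARIMA(0,1,1)(0,1,1) errors, maps directly onto a SARIMAX model; a sketch with statsmodels in which the monthly series are synthetic stand-ins for the article's data:

    ```python
    # Regression with ARIMA(0,1,1)(0,1,1)_12 errors via statsmodels SARIMAX;
    # the series below are synthetic stand-ins for the article's monthly data.
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(0)
    n = 192                                       # 16 years of monthly data
    tax = np.linspace(1.0, 2.5, n)                # stand-in real excise tax rate
    log_accidents = 5.0 - 0.4 * tax + rng.normal(0, 0.1, n)  # stand-in outcome

    model = SARIMAX(log_accidents, exog=tax,
                    order=(0, 1, 1), seasonal_order=(0, 1, 1, 12))
    result = model.fit(disp=False)
    print(result.summary())   # the exog coefficient estimates the tax effect
    ```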

  16. Real-Time Processing of Continuous Physiological Signals in a Neurocritical Care Unit on a Stream Data Analytics Platform.

    PubMed

    Bai, Yong; Sow, Daby; Vespa, Paul; Hu, Xiao

    2016-01-01

    Continuous high-volume and high-frequency brain signals such as intracranial pressure (ICP) and electroencephalographic (EEG) waveforms are commonly collected by bedside monitors in neurocritical care. While such signals often carry early signs of neurological deterioration, detecting these signs in real time with conventional data processing methods mainly designed for retrospective analysis has been extremely challenging. Such methods are not designed to handle the large volumes of waveform data produced by bedside monitors. In this pilot study, we address this challenge by building a prototype system using the IBM InfoSphere Streams platform, a scalable stream computing platform, to detect unstable ICP dynamics in real time. The system continuously receives electrocardiographic and ICP signals and analyzes ICP pulse morphology looking for deviations from a steady state. We also designed a Web interface to display in real time the result of this analysis in a Web browser. With this interface, physicians are able to ubiquitously check on the status of their patients and gain direct insight into and interpretation of the patient's state in real time. The prototype system has been successfully tested prospectively on live hospitalized patients.

  17. An approach to checking case-crossover analyses based on equivalence with time-series methods.

    PubMed

    Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L

    2008-03-01

    The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.

  18. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    Since the complexity of software systems continues to grow, most engineers face two serious problems: the state space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching time model checking on three-valued semantics. The three-valued models and logics provide successful abstraction that overcomes the state space explosion problem. The game style model checking that generates counter-examples can guide refinement or identify validated formulas, which solves the system debugging problem. Furthermore, output of our game style method will give significant information to engineers in detecting where errors have occurred and what the causes of the errors are.

  19. Negative Stress Margins - Are They Real?

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Lee, Darlene S.; Mohaghegh, Michael

    2011-01-01

    Advances in modeling and simulation, new finite element software, modeling engines, and powerful computers are providing opportunities to interrogate designs in a very different manner and in more detail than ever before. Once analysis models are defined and developed, margins of safety for various design concepts and design parameters are often evaluated quickly using local stresses. This paper suggests that not all of the negative margins of safety so evaluated are real. Negative margins are frequently encountered near stress concentrations, point loads and load discontinuities, and stress singularities; in areas having large gradients but insufficient mesh density; in areas with modeling issues and errors; in areas with connections and interfaces; in two-dimensional (2D) to three-dimensional (3D) transitions; in bolts and bolt modeling; and at boundary conditions. Now, more than ever, structural analysts need to examine and interrogate their analysis results and perform basic sanity checks to determine whether these negative margins are real.

  20. Accident diagnosis system based on real-time decision tree expert system

    NASA Astrophysics Data System (ADS)

    Nicolau, Andressa dos S.; Augusto, João P. da S. C.; Schirru, Roberto

    2017-06-01

    Safety is one of the most studied topics when referring to power stations. For that reason, sensors and alarms play an important role in environmental and human protection. When an abnormal event happens, it triggers a chain of alarms that must somehow be checked by the control room operators. In this case, a diagnosis support system can help operators accurately identify the possible root cause of the problem in a short time. In this article, we present a computational model of a generic, artificial-intelligence-based diagnosis support system that was applied to datasets from two real power stations: the Angra1 Nuclear Power Plant and the Santo Antônio Hydroelectric Plant. The proposed system processes all the information logged in the sequence of events before a shutdown signal, using expert knowledge encoded in an expert system to indicate the chain of events from the shutdown signal to its root cause. The results of both applications showed that the support system is a potential tool to help control room operators identify abnormal events, such as accidents, and consequently increase safety.

  1. Perpetual Model Validation

    DTIC Science & Technology

    2017-03-01

    [Scanned-record fragments; the OCR text is garbled and partly duplicated. Recoverable content: the research considered using indirect models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to tackle the situation where the system, through deterioration for example, no longer corresponds to the model used during verification time. Finally, the research looked at ways to combine hybrid systems ...]

  2. Model Checking JAVA Programs Using Java Pathfinder

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Pressburger, Thomas

    2000-01-01

    This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen as part of a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze the program. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep-Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.

  3. Quantitative description and modeling of real networks

    NASA Astrophysics Data System (ADS)

    Capocci, Andrea; Caldarelli, Guido; de Los Rios, Paolo

    2003-10-01

    We present data analysis and modeling of two particular case studies in the field of growing networks. We analyze a World Wide Web data set and authorship collaboration networks in order to check for the presence of correlations in the data. The results are reproduced with good agreement through a suitable modification of the standard Albert-Barabási model of network growth. In particular, the intrinsic relevance of sites plays a role in determining the future degree of the vertex.
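
    One generic reading of intrinsic relevance modifying preferential attachment, in the Bianconi-Barabási style, is to attach to node i with probability proportional to the product of its fitness and its degree. The sketch below implements that generic scheme; it is not necessarily the authors' exact modification:

    ```python
    # Fitness-weighted preferential attachment (Bianconi-Barabasi style) as a
    # generic reading of "intrinsic relevance"; not necessarily the authors'
    # exact model. Attachment probability is proportional to fitness * degree.
    import random

    def grow_network(n_nodes, m=2, seed=0):
        rng = random.Random(seed)
        fitness = [rng.random() for _ in range(n_nodes)]  # intrinsic relevance
        edges = [(0, 1)]                                  # tiny seed graph
        degree = {0: 1, 1: 1}
        for new in range(2, n_nodes):
            nodes = list(degree)
            weights = [fitness[i] * degree[i] for i in nodes]
            targets = set(rng.choices(nodes, weights=weights,
                                      k=min(m, len(nodes))))
            for t in targets:
                edges.append((new, t))
                degree[t] += 1
            degree[new] = len(targets)
        return edges, fitness

    edges, fitness = grow_network(1000)
    print(len(edges), "edges;", "max fitness:", max(fitness))
    ```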

  4. Fast computation of the multivariable stability margin for real interrelated uncertain parameters

    NASA Technical Reports Server (NTRS)

    Sideris, Athanasios; Sanchez Pena, Ricardo S.

    1988-01-01

    A novel algorithm for computing the multivariable stability margin for checking the robust stability of feedback systems with real parametric uncertainty is proposed. This method eliminates the need for the frequency search required by an earlier algorithm, reducing the problem to checking a finite number of conditions. These conditions have a special structure, which allows a significant improvement in the speed of computation.

  5. Methods of practice and guidelines for using survey-grade global navigation satellite systems (GNSS) to establish vertical datum in the United States Geological Survey

    USGS Publications Warehouse

    Rydlund, Jr., Paul H.; Densmore, Brenda K.

    2012-01-01

    Geodetic surveys have evolved through the years to the use of survey-grade (centimeter level) global positioning to perpetuate and post-process vertical datum. The U.S. Geological Survey (USGS) uses Global Navigation Satellite Systems (GNSS) technology to monitor natural hazards, ensure geospatial control for climate and land use change, and gather data necessary for investigative studies related to water, the environment, energy, and ecosystems. Vertical datum is fundamental to a variety of these integrated earth sciences. Essentially, GNSS surveys provide a three-dimensional position x, y, and z as a function of the North American Datum of 1983 ellipsoid and the most current hybrid geoid model. A GNSS survey may be approached with post-processed positioning for static observations related to a single point or network, or involve real-time corrections to provide positioning "on-the-fly." Field equipment required to facilitate GNSS surveys ranges from a single receiver, with a power source for static positioning, to an additional receiver or network communicated by radio or cellular for real-time positioning. A real-time approach in its most common form may be described as a roving receiver augmented by a single-base station receiver, known as a single-base real-time (RT) survey. More efficient real-time methods involving a Real-Time Network (RTN) permit the use of only one roving receiver that is augmented to a network of fixed receivers commonly known as Continually Operating Reference Stations (CORS). A post-processed approach in its most common form involves static data collection at a single point. Data are most commonly post-processed through a universally accepted utility maintained by the National Geodetic Survey (NGS), known as the Online Position User Service (OPUS). More complex post-processed methods involve static observations among a network of additional receivers collecting static data at known benchmarks. Both classifications provide users flexibility regarding efficiency and quality of data collection. Quality assurance of survey-grade global positioning is often overlooked or not understood, and perceived uncertainties can be misleading. GNSS users can benefit from a blueprint of data collection standards used to ensure consistency among USGS mission areas. A classification of GNSS survey qualities provides the user with the ability to choose from the highest quality survey used to establish objective points with low uncertainties, identified as a Level I, to a GNSS survey for general topographic control without quality assurance, identified as a Level IV. A Level I survey is strictly limited to post-processed methods, whereas Level II, Level III, and Level IV surveys integrate variations of an RT approach. Among these classifications, techniques involving blunder checks and redundancy are important, and planning that involves the assessment of the overall satellite configuration, as well as terrestrial and space weather, is necessary to ensure an efficient and quality campaign. Although quality indicators and uncertainties are identified in post-processed methods using CORS, the accuracy of a GNSS survey is most effectively expressed as a comparison to a local benchmark that has a high degree of confidence. Real-time and post-processed methods should incorporate these "trusted" benchmarks as a check during any campaign. Global positioning surveys are expected to change rapidly in the future. The expansion of continuously operating reference stations, combined with newly available satellite signals, and enhancements to the conterminous geoid, are all sufficient indicators for substantial growth in real-time positioning and quality thereof.

  6. Finding the joker among the maize endogenous reference genes for genetically modified organism (GMO) detection.

    PubMed

    Paternò, Annalisa; Marchesi, Ugo; Gatto, Francesco; Verginelli, Daniela; Quarchioni, Cinzia; Fusco, Cristiana; Zepparoni, Alessia; Amaddeo, Demetrio; Ciabatti, Ilaria

    2009-12-09

    The comparison of five real-time polymerase chain reaction (PCR) methods targeted at maize (Zea mays) endogenous sequences is reported. PCR targets were the alcohol dehydrogenase (adh) gene for three methods and the high-mobility group (hmg) gene for the other two. The five real-time PCR methods have been checked under repeatability conditions at several dilution levels on both pooled DNA template from several genetically modified (GM) maize certified reference materials (CRMs) and single CRM DNA extracts. Slopes and R(2) coefficients of all of the curves obtained from the adopted regression model were compared within the same method and among all five methods, and the limit of detection and limit of quantitation were analyzed for each PCR system. Furthermore, method equivalency was evaluated on the basis of the ability to estimate the target haploid genome copy number at each concentration level. Results indicated that, among the five methods tested, one of the hmg-targeted PCR systems can be considered equivalent to the others but shows the best regression parameters and higher repeatability along the dilution range. It is therefore proposed as a valid module to be coupled to different event-specific real-time PCRs for maize genetically modified organism (GMO) quantitation. The resulting improvement in practicability for the analytical control of GMOs is discussed.
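
    Comparisons of this kind rest on the standard-curve regression of Ct against log10 copy number, from which the slope, R2, and amplification efficiency E = 10^(-1/slope) - 1 follow; a minimal sketch with placeholder dilution data:

    ```python
    # Standard-curve metrics for a real-time PCR assay: slope, R^2, and
    # amplification efficiency E = 10**(-1/slope) - 1; data are placeholders.
    import numpy as np

    log_copies = np.log10([1e5, 1e4, 1e3, 1e2, 1e1])   # dilution series
    ct = np.array([17.1, 20.5, 23.9, 27.4, 30.8])      # measured Ct values

    slope, intercept = np.polyfit(log_copies, ct, 1)
    predicted = slope * log_copies + intercept
    r2 = 1 - np.sum((ct - predicted) ** 2) / np.sum((ct - ct.mean()) ** 2)
    efficiency = 10 ** (-1 / slope) - 1

    print(f"slope={slope:.3f}  R2={r2:.4f}  efficiency={efficiency:.2%}")
    ```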

  7. Bioinformatic investigation of the role of ubiquitins in cucumber flower morphogenesis

    NASA Astrophysics Data System (ADS)

    Pawełkowicz, Magdalena; Osipowski, Paweł; Wojcieszek, Michał; Kowalczuk, Cezary; Pląder, Wojciech; Przybecki, Zbigniew

    2016-09-01

    Three cDNA clones were used to screen the cucumber genome in order to find genes and proteins. Functional annotation reveals that they are correlated with ubiquitination pathways. Various bioinformatics tools were used to screen and check protein sequence features such as the presence of specific domains, transmembrane regions, cleavage sites, and cellular localization. Computational analysis of the promoter regions shows many binding sites for transcription factors, which could regulate the expression of these genes. In order to check gene expression levels in developing flower buds of monoecious (B10) and gynoecious (2gg) cucumber lines, the real-time PCR technique was applied. Expression was checked for whole buds and for the 3rd and 4th whorls of the bud, where the generative organs form, obtained by the laser capture microdissection (LCM) technique.

  8. Performance optimization of PM-16QAM transmission system enabled by real-time self-adaptive coding.

    PubMed

    Qu, Zhen; Li, Yao; Mo, Weiyang; Yang, Mingwei; Zhu, Shengxiang; Kilper, Daniel C; Djordjevic, Ivan B

    2017-10-15

    We experimentally demonstrate self-adaptive coded 5×100 Gb/s WDM polarization-multiplexed 16-quadrature-amplitude-modulation transmission over a 100 km fiber link, enabled by a real-time control plane. The real-time optical signal-to-noise ratio (OSNR) is measured using an optical performance monitoring device. The OSNR measurement is processed and fed back using control-plane logic and messaging to the transmitter side for code adaptation, where the binary data are adaptively encoded with three types of large-girth low-density parity-check (LDPC) codes with code rates of 0.8, 0.75, and 0.7. The total code-adaptation latency is measured to be 2273 ms. Compared with transmission without adaptation, average net capacity improvements of 102%, 36%, and 7.5% are obtained, respectively, by adaptive LDPC coding.
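
    The control-plane decision ultimately maps the monitored OSNR onto one of the three LDPC code rates; a sketch with hypothetical switching thresholds (the abstract does not give the actual values):

    ```python
    # Map monitored OSNR to one of the three LDPC code rates; the switching
    # thresholds below are hypothetical, not the experiment's actual values.
    RATE_TABLE = [        # (minimum OSNR in dB, LDPC code rate)
        (22.0, 0.80),
        (19.0, 0.75),
        (0.0, 0.70),      # fall back to the strongest code
    ]

    def select_code_rate(osnr_db):
        for min_osnr, rate in RATE_TABLE:
            if osnr_db >= min_osnr:
                return rate

    assert select_code_rate(23.5) == 0.80
    assert select_code_rate(20.1) == 0.75
    assert select_code_rate(17.0) == 0.70
    ```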

  9. 26 CFR 301.7506-1 - Administration of real estate acquired by the United States.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... methods will enhance the possibility of obtaining a higher price for the property. (3) Time and place of... cases, the district director may also require such persons to make deposits to secure the performance of... treasurer's check drawn on any bank or trust company incorporated under the laws of the United States or...

  10. Computational Fact Checking by Mining Knowledge Graphs

    ERIC Educational Resources Information Center

    Shiralkar, Prashant

    2017-01-01

    Misinformation and rumors have become rampant on online social platforms with adverse consequences for the real world. Fact-checking efforts are needed to mitigate the risks associated with the spread of digital misinformation. However, the pace at which information is generated online limits the capacity to fact-check claims at the same rate…

  11. Atomic Approaches to Defect Thermochemistry

    DTIC Science & Technology

    1992-04-30

    [Scanned-record fragments; the OCR text is garbled and interleaved. Recoverable content: an energy value of 29 meV derived from the enthalpy of melting of Au; a report outline including IV. Studies of Diffusion and to Map Vacancy Concentrations at a Fixed Time, V. Studies of Electroluminescent Flat-Panel Display Devices, VI. Defect Characterization, VII. ...; a formula in which n = ND - NA is the doping density; and a mention that P. Mei et al. published the first experimental report of this effect (Appl. Phys...)]

  12. Understanding the modeling skill shift in engineering: the impact of self-efficacy, epistemology, and metacognition

    NASA Astrophysics Data System (ADS)

    Yildirim, Tuba Pinar

    A focus of engineering education is to prepare future engineers with problem solving, design and modeling skills. In engineering education, the former two skill areas have received copious attention making their way into the ABET criteria. Modeling, a representation containing the essential structure of an event in the real world, is a fundamental function of engineering, and an important academic skill that students develop during their undergraduate education. Yet the modeling process remains under-investigated, particularly in engineering, even though there is an increasing emphasis on modeling in engineering schools (Frey 2003). Research on modeling requires a deep understanding of multiple perspectives, that of cognition, affect, and knowledge expansion. In this dissertation, the relationship between engineering modeling skills and students' cognitive backgrounds including self-efficacy, epistemic beliefs and metacognition is investigated using model-eliciting activities (MEAs). Data were collected from sophomore students at two time periods, as well as senior engineering students. The impact of each cognitive construct on change in modeling skills was measured using a growth curve model at the sophomore level, and ordinary least squares regression at the senior level. Findings of this dissertation suggest that self-efficacy, through its direct and indirect (moderation or interaction term with time) impact, influences the growth of modeling abilities of an engineering student. When sophomore and senior modeling abilities are compared, the difference can be explained by varying self-efficacy levels. Epistemology influences modeling skill development such that the more sophisticated the student beliefs are, the higher the level of modeling ability students can attain, after controlling for the effects of conceptual learning, gender and GPA. This suggests that development of modeling ability may be constrained by the naivete of one's personal epistemology. Finally, metacognition, or 'thinking about thinking', has an impact on the development of modeling strategies of students, when the impacts of four metacognitive dimensions are considered: awareness, planning, cognitive strategy and self-checking. Students who are better at self-checking show higher growth in their modeling abilities over the course of a year, compared to students who are less proficient at self-checking. The growth in modeling abilities is also moderated by the cognitive strategy and planning skills of the student. After some experience with modeling is attained, students who have enhanced skills in these two metacognitive dimensions are observed to do better in modeling. Therefore, inherent metacognitive abilities of students can positively affect the growth of modeling ability.

  13. Radar-driven High-resolution Hydrometeorological Forecasts of the 26 September 2007 Venice flash flood

    NASA Astrophysics Data System (ADS)

    Massimo Rossa, Andrea; Laudanna Del Guerra, Franco; Borga, Marco; Zanon, Francesco; Settin, Tommaso; Leuenberger, Daniel

    2010-05-01

    Space and time scales of flash floods are such that flash flood forecasting and warning systems depend upon the accurate real-time provision of rainfall information, high-resolution numerical weather prediction (NWP) forecasts and the use of hydrological models. Currently available high-resolution NWP models can potentially provide warning forecasters with information on the future evolution of storms and their internal structure, thereby increasing convective-scale warning lead times. However, it is essential that the model be started with a very accurate representation of on-going convection, which calls for assimilation of high-resolution rainfall data. This study aims to assess the feasibility of using carefully checked radar-derived quantitative precipitation estimates (QPE) for assimilation into NWP and hydrological models. The hydrometeorological modeling chain includes the convection-permitting NWP model COSMO-2 and a hydrologic-hydraulic model built upon the concept of geomorphological transport. Radar rainfall observations are assimilated into the NWP model via the latent heat nudging method. The study is focused on the 26 September 2007 extreme flash flood event which impacted the coastal area of north-eastern Italy around Venice. The hydro-meteorological modeling system is implemented over the Dese river, a 90 km2 catchment flowing to the Venice lagoon. The radar rainfall observations are carefully checked for artifacts, including beam attenuation, by means of physics-based correction procedures and comparison with a dense network of raingauges. The impact of the radar QPE in the assimilation cycle of the NWP model is very significant, in that the main individual organized convective systems were successfully introduced into the model state, both in terms of timing and localization. Also, incorrectly localized precipitation in the model reference run without rainfall assimilation was correctly reduced to about the observed levels. On the other hand, the highest rainfall intensities were underestimated by 20% at a scale of 1000 km2, and the local peaks by 50%. The positive impact of the assimilated radar rainfall was carried over into the free forecast for about 2-5 hours, depending on when this forecast was started, and was larger when the main mesoscale convective system was present in the initial conditions. The improvements of the meteorological model simulations were directly propagated to the river flow simulations, with an extension of the warning lead time up to three hours.
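
    The latent heat nudging idea can be illustrated with a minimal sketch: the model's latent heating profile at a grid point is scaled toward the ratio of radar-observed to modeled rain rate. This is only a conceptual illustration with hypothetical inputs, not the COSMO-2 assimilation scheme.

      import numpy as np

      def latent_heat_nudge(lh_profile, rr_model, rr_radar, alpha=0.5, eps=1e-6):
          """Nudge a latent-heating profile [K/s] toward radar-observed rain.
          rr_model, rr_radar: modeled / observed surface rain rates [mm/h];
          alpha is a nudging weight in [0, 1]; all values here are illustrative."""
          ratio = np.clip((rr_radar + eps) / (rr_model + eps), 0.1, 10.0)
          return (1.0 - alpha) * lh_profile + alpha * ratio * lh_profile

      # toy profile with heating concentrated mid-column; radar sees twice the rain
      z = np.linspace(0.0, 10.0, 50)
      lh = 1e-3 * np.exp(-((z - 5.0) ** 2))
      print(latent_heat_nudge(lh, rr_model=2.0, rr_radar=4.0).max())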

  14. PKIX Certificate Status in Hybrid MANETs

    NASA Astrophysics Data System (ADS)

    Muñoz, Jose L.; Esparza, Oscar; Gañán, Carlos; Parra-Arnau, Javier

    Certificate status validation is a hard problem in general but it is particularly complex in Mobile Ad-hoc Networks (MANETs) because we require solutions to manage both the lack of fixed infrastructure inside the MANET and the possible absence of connectivity to trusted authorities when certificate validation has to be performed. In this sense, certificate acquisition is usually assumed to be an initialization phase. However, certificate validation is a critical operation since the node needs to check the validity of certificates in real-time, that is, when a particular certificate is going to be used. In such MANET environments, it may happen that the node is placed in a part of the network that is disconnected from the source of status data at the moment the status checking is required. Proposals in the literature suggest the use of caching mechanisms so that the node itself or a neighbour node has some status checking material (typically on-line status responses or lists of revoked certificates). However, to the best of our knowledge the only criterion used to evaluate the cached (possibly obsolete) material is its age. In this paper, we analyse how to deploy a certificate status checking PKI service for hybrid MANETs and we propose a new risk-based criterion for evaluating cached status data that is more appropriate than age alone because it takes the revocation process into account.
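
    One way to realize such a risk-based criterion is to model revocations as a Poisson process and compute the probability that a certificate has been revoked since the cached status data was issued; the rate and numbers below are hypothetical, and the authors' actual risk model may differ.

      import math

      def cached_status_risk(age_hours, revocations_per_year, population):
          """Probability that a certificate was revoked since the cached status
          was issued, assuming revocations follow a Poisson process with a
          per-certificate rate estimated from CA statistics."""
          rate_per_hour = revocations_per_year / (population * 365.0 * 24.0)
          return 1.0 - math.exp(-rate_per_hour * age_hours)

      # a 48-hour-old cached response; the CA revokes 500 of 100000 certs per year
      risk = cached_status_risk(48.0, 500.0, 100000.0)
      print(f"risk = {risk:.2e}")  # accept the cached data only below a threshold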

  15. Techniques used for the analysis of oculometer eye-scanning data obtained from an air traffic control display

    NASA Technical Reports Server (NTRS)

    Crawford, Daniel J.; Burdette, Daniel W.; Capron, William R.

    1993-01-01

    The methodology and techniques used to collect and analyze look-point position data from a real-time ATC display-format comparison experiment are documented. That study compared the delivery precision and controller workload of three final approach spacing aid display formats. Using an oculometer, controller lookpoint position data were collected, associated with gaze objects (e.g., moving aircraft) on the ATC display, and analyzed to determine eye-scan behavior. The equipment involved and algorithms for saving, synchronizing with the ATC simulation output, and filtering the data are described. Target (gaze object) and cross-check scanning identification algorithms are also presented. Data tables are provided of total dwell times, average dwell times, and cross-check scans. Flow charts, block diagrams, file record descriptors, and source code are included. The techniques and data presented are intended to benefit researchers in other studies that incorporate non-stationary gaze objects and oculometer equipment.
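
    As a rough sketch of the dwell-time computations described above, assuming gaze samples have already been associated with display objects (sampling rate and object names are hypothetical):

      from collections import defaultdict
      from itertools import groupby

      def dwell_statistics(samples, sample_dt=1.0 / 60.0):
          """samples: time-ordered list of (time_s, object_id) gaze samples.
          A dwell is a maximal run of consecutive samples on one object.
          Returns {object_id: (total_dwell_s, mean_dwell_s, n_dwells)}."""
          per_object = defaultdict(list)
          for obj, grp in groupby(samples, key=lambda s: s[1]):
              times = [t for t, _ in grp]
              per_object[obj].append(times[-1] - times[0] + sample_dt)
          return {o: (sum(d), sum(d) / len(d), len(d))
                  for o, d in per_object.items()}

      # half a second on an aircraft target, then half a second on another object
      samples = [(i / 60.0, "AAL123" if i < 30 else "speed_tape") for i in range(60)]
      print(dwell_statistics(samples))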

  16. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  17. Short- and Long-Term Earthquake Forecasts Based on Statistical Models

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner

    2017-04-01

    The epidemic-type aftershock sequences (ETAS) models have been experimentally used to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. The communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasting, the application of a newly developed simulation algorithm to the Calabria region provided typical features in the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
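
    For reference, the standard temporal ETAS conditional intensity has the form lambda(t) = mu + sum over past events of K * 10^(alpha(m_i - m0)) / (t - t_i + c)^p. The sketch below uses illustrative parameter values, not those calibrated for Italy.

      import numpy as np

      def etas_intensity(t, events, mu=0.2, K=0.05, alpha=0.8, c=0.01, p=1.2, m0=3.0):
          """Temporal ETAS conditional intensity (events/day) at time t (days).
          events: list of (t_i, m_i) pairs for earthquakes with t_i < t."""
          times = np.array([e[0] for e in events])
          mags = np.array([e[1] for e in events])
          past = times < t
          trig = K * 10.0 ** (alpha * (mags[past] - m0)) / (t - times[past] + c) ** p
          return mu + trig.sum()

      mainshock_and_aftershocks = [(0.0, 6.3), (0.5, 4.8), (2.0, 4.1)]
      print(etas_intensity(3.0, mainshock_and_aftershocks))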

  18. Experimental Evaluation of a Planning Language Suitable for Formal Verification

    NASA Technical Reports Server (NTRS)

    Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2008-01-01

    The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired by the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking in searching for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.

  19. Formal Verification of the Runway Safety Monitor

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu; Ciardo, Gianfranco

    2006-01-01

    The Runway Safety Monitor (RSM) designed by Lockheed Martin is part of NASA's effort to reduce runway accidents. We developed a Petri net model of the RSM protocol and used the model checking functions of our tool SMART to investigate a number of safety properties in RSM. To mitigate the impact of state-space explosion, we built a highly discretized model of the system, obtained by partitioning the monitored runway zone into a grid of smaller volumes and by considering scenarios involving only two aircraft. The model also assumes that there are no communication failures, such as bad input from radar or lack of incoming data, thus it relies on a consistent view of reality by all participants. In spite of these simplifications, we were able to expose potential problems in the RSM conceptual design. Our findings were forwarded to the design engineers, who undertook corrective action. Additionally, the results stress the efficiency attained by the new model checking algorithms implemented in SMART, and demonstrate their applicability to real-world systems.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bautista, Julian E.; Busca, Nicolas G.; Bailey, Stephen

    We describe mock data-sets generated to simulate the high-redshift quasar sample in Data Release 11 (DR11) of the SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS). The mock spectra contain Lyα forest correlations useful for studying the 3D correlation function including Baryon Acoustic Oscillations (BAO). They also include astrophysical effects such as quasar continuum diversity and high-density absorbers, instrumental effects such as noise and spectral resolution, as well as imperfections introduced by the SDSS pipeline treatment of the raw data. The Lyα forest BAO analysis of the BOSS collaboration, described in Delubac et al. 2014, has used these mock data-sets to develop and cross-check analysis procedures prior to performing the BAO analysis on real data, and for continued systematic cross-checks. Tests presented here show that the simulations reproduce sufficiently well the important characteristics of real spectra. These mock data-sets will be made available together with the data at the time of the Data Release 11.

  1. Rapid, cost-effective, sensitive and quantitative detection of Acinetobacter baumannii from pneumonia patients

    PubMed Central

    Nomanpour, B; Ghodousi, A; Babaei, A; Abtahi, HR; Tabrizi, M; Feizabadi, MM

    2011-01-01

    Background and Objectives Pneumonia caused by Acinetobacter baumannii poses a major therapeutic problem in health care settings. The decision to initiate correct antibiotic therapy requires rapid identification and quantification of the organism. The aim of this study was to develop a rapid and sensitive method for direct detection of A. baumannii from respiratory specimens. Materials and Methods A TaqMan real-time PCR based on the sequence of bla oxa-51 was designed and used for direct detection of A. baumannii from 361 respiratory specimens of patients with pneumonia. All specimens were checked by conventional bacteriology in parallel. Results The new real-time PCR could detect fewer than 200 cfu per ml of bacteria in specimens. There was agreement between the results of real-time PCR and culture (Kappa value 1.0, p value<0.001). The sensitivity, specificity and predictive values of real-time PCR were 100%. The prevalence of A. baumannii in pneumonia patients was 10.53% (n=38). Poly-microbial infections were detected in 65.71% of specimens. Conclusion Acinetobacter baumannii is the third causative agent in nosocomial pneumonia after Pseudomonas aeruginosa (16%) and Staphylococcus aureus (13%) at Tehran hospitals. We recommend that 10(exp 4) CFU be the threshold for definition of infection with A. baumannii using real-time PCR. PMID:22530083
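
    The reported sensitivity, specificity and kappa can be reproduced from the 2x2 table of PCR versus culture results; the cell counts below are hypothetical values consistent with the reported prevalence (38 positives among 361 specimens):

      def diagnostic_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity and Cohen's kappa from a 2x2 table
          comparing an index test against a reference standard."""
          n = tp + fp + fn + tn
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          po = (tp + tn) / n                                             # observed agreement
          pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # chance agreement
          kappa = (po - pe) / (1 - pe)
          return sens, spec, kappa

      print(diagnostic_metrics(tp=38, fp=0, fn=0, tn=323))  # -> (1.0, 1.0, 1.0)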

  2. A System for Reflective Learning Using Handwriting Tablet Devices for Real-Time Event Bookmarking into Simultaneously Recorded Videos

    ERIC Educational Resources Information Center

    Nakajima, Taira

    2012-01-01

    The author demonstrates a new system useful for reflective learning. Our new system offers an environment in which one can use handwriting tablet devices to bookmark symbolic and descriptive feedback into simultaneously recorded videos. If one uses video recording and feedback check sheets in reflective learning sessions, one can…

  3. A Pro-active Real-time Forecasting and Decision Support System for Daily Management of Marine Works

    NASA Astrophysics Data System (ADS)

    Bollen, Mark; Leyssen, Gert; Smets, Steven; De Wachter, Tom

    2016-04-01

    Marine works involving turbidity-generating activities (e.g., dredging, dredge spoil placement) can generate environmental stress in and around a project area in the form of sediment plumes causing light reduction and sedimentation. If these works are situated near sensitive habitats like sea-grass beds or coral reefs, near sensitive human activities such as aquaculture farms or water intakes, or if contaminants are present in the water or soil, environmental scrutiny is advised. Environmental regulations can impose limitations on these activities in the form of turbidity thresholds, spill budgets, and contaminant levels. Breaching environmental regulations can result in increased monitoring, adaptation of the work planning and production rates, and ultimately in a (temporary) stop of activities, all of which entail time and cost impacts for a contractor and/or client. Sediment plume behaviour is governed by the dredging process, soil properties and ambient conditions (currents, water depth) and can be modelled. Usually this is done during the preparatory EIA phase of a project, for estimation of environmental impact based on climatic scenarios. An operational forecasting tool is developed to adapt marine work schedules to the real-time circumstances and thus evade exceedance of critical threshold levels at sensitive areas. The forecasting system is based on a Python-based workflow manager with a MySQL database and a Django frontend web tool for user interaction and visualisation of the model results. The core consists of a numerical hydrodynamic model with a sediment transport module (Mike21 from DHI). This model is driven by space- and time-varying wind fields and wave boundary conditions, and turbidity inputs (suspended sediment source terms) based on marine works production rates and soil properties. The resulting threshold analysis allows the operator to indicate potential impact at the sensitive areas and instigate an adaptation of the marine work schedule if needed. In order to use this toolbox in real-time situations and facilitate forecasting of impacts of planned dredge works, the following operational online functionalities are implemented:
    • Automated fetch and preparation of the input data, including 7-day forecast wind and wave fields, real-time measurements, and user-defined turbidity inputs based on scheduled marine works.
    • Automated generation of forecasts while running user-configurable scenarios in parallel.
    • Export and conversion of the model results, time series and maps, into a standardized format (NetCDF).
    • Automatic analysis and processing of model results, including the calculation of indicator turbidity values and the exceedance analysis of threshold levels at the different sensitive areas. Data assimilation with the real-time on-site turbidity measurements is implemented in this threshold analysis.
    • Pre-programmed generation of animated sediment plumes, specific charts and pdf reports to allow rapid interpretation of the model results by the operators, facilitating decision making in operational planning.
    The performed marine works, resulting from the marine work schedule proposed by the forecasting system, are evaluated by a threshold analysis on the validated turbidity measurements at the sensitive sites. This machine learning loop allows a check of the system in order to evaluate forecast and model uncertainties.
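
    The threshold analysis at the heart of the system can be sketched as a simple exceedance check on a forecast turbidity series at a sensitive site; the threshold, units and series below are hypothetical:

      import numpy as np

      def exceedance_report(forecast_ntu, threshold_ntu, dt_hours=1.0):
          """Hours above threshold and index of first exceedance (or None)."""
          above = forecast_ntu > threshold_ntu
          hours_above = float(above.sum()) * dt_hours
          first = int(np.argmax(above)) if above.any() else None
          return hours_above, first

      # hypothetical 7-day hourly turbidity forecast at a sea-grass bed
      t = np.linspace(0.0, 2.0 * np.pi, 168)
      forecast = 20.0 + 15.0 * np.sin(t)
      hours, first = exceedance_report(forecast, threshold_ntu=30.0)
      print(f"{hours} h above threshold; first exceedance at hour {first}")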

  4. The 1887 earthquake and tsunami in the Ligurian Sea: analysis of coastal effects studied by numerical modeling and prototype for real-time computing

    NASA Astrophysics Data System (ADS)

    Monnier, Angélique; Gailler, Audrey; Loevenbruck, Anne; Heinrich, Philippe; Hébert, Hélène

    2017-04-01

    The February 1887 earthquake in Italy (Imperia) triggered a tsunami well observed on the French and Italian coastlines. Tsunami waves were recorded on a tide gauge in the Genoa harbour with a small, recently reappraised maximum amplitude of about 10-12 cm (crest-to-trough). The magnitude of the earthquake is still debated in the recent literature, and discussed according to available macroseismic, tectonic and tsunami data. While the tsunami waveform observed in the Genoa harbour may be well explained with a magnitude smaller than 6.5 (Hébert et al., EGU 2015), we investigate in this study whether such source models are consistent with the tsunami effects reported elsewhere along the coastline. The idea is to take advantage of the fine bathymetric data recently synthesized for the French Tsunami Warning Center (CENALT) to test the 1887 source parameters using refined, nested-grid tsunami numerical modeling down to the harbour scale. Several source parameters are investigated to provide a series of models accounting for various magnitudes and mechanisms. This allows us to compute the tsunami effects for several coastal sites in France (Nice, Villefranche, Antibes, Mandelieu, Cannes) and to compare with observations. Meanwhile, we also check the computing time of the chosen scenarios to study whether running nested-grid simulations in real time is suitable, in terms of computational cost, in an operational context for these Ligurian scenarios. This work is supported by the FP7 ASTARTE project (Assessment Strategy and Risk Reduction for Tsunamis in Europe, grant 603839 FP7) and by the French PIA TANDEM (Tsunamis in the Atlantic and English ChaNnel: Definition of the Effects through Modeling) project (grant ANR-11-RSNR-00023).

  5. The Design, Development and Testing of a Multi-process Real-time Software System

    DTIC Science & Technology

    2007-03-01

    programming large systems stems from the complexity of dealing with many different details at one time. A sound engineering approach is to break...controls and 3) is portable to other OS platforms such as Microsoft Windows. Next, to reduce the complexity of the programming tasks, the system...processes depending on how often the process has to check to see if common data was modified. A good method for one process to quickly notify another

  6. Small catchments DEM creation using Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Gafurov, A. M.

    2018-01-01

    Digital elevation models (DEM) are an important source of information on the terrain, allowing researchers to evaluate various exogenous processes. The higher the accuracy of the DEM, the better the level of work possible. An important source of data for the construction of DEMs is point clouds obtained with terrestrial laser scanning (TLS) and unmanned aerial vehicles (UAV). In this paper, we present the results of constructing a DEM of small catchments using UAVs. Evaluation of the UAV DEM showed accuracy comparable to TLS when real-time kinematic Global Positioning System (RTK-GPS) ground control points (GCPs) and check points (CPs) were used. In this case, the main source of error in the construction of DEMs is the georeferencing of the survey results.

  7. From Care to Cure: Demonstrating a Model of Clinical Patient Navigation for Hepatitis C Care and Treatment in High-Need Patients.

    PubMed

    Ford, Mary M; Johnson, Nirah; Desai, Payal; Rude, Eric; Laraque, Fabienne

    2017-03-01

    The NYC Department of Health implemented a patient navigation program, Check Hep C, to address patient and provider barriers to HCV care and potentially lifesaving treatment. Services were delivered at two clinical care sites and two sites that linked patients to off-site care. Working with a multidisciplinary care team, patient navigators provided risk assessment, health education, treatment readiness and medication adherence counseling, and medication coordination. Between March 2014 and January 2015, 388 participants enrolled in Check Hep C, 129 (33%) initiated treatment, and 119 (91% of initiators) had sustained virologic response (SVR). Participants receiving on-site clinical care had higher odds of initiating treatment than those linked to off-site care. Check Hep C successfully supported high-need participants through HCV care and treatment, and its SVR rates demonstrate the real-world feasibility of achieving high cure rates using patient navigation care models. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. Real time detection of farm-level swine mycobacteriosis outbreak using time series modeling of the number of condemned intestines in abattoirs.

    PubMed

    Adachi, Yasumoto; Makita, Kohei

    2015-09-01

    Mycobacteriosis in swine is a common zoonosis found in abattoirs during meat inspections, and the veterinary authority is expected to inform the producer so that corrective actions can be taken when an outbreak is detected. The expected value of the number of condemned carcasses due to mycobacteriosis therefore would be a useful threshold for detecting an outbreak, and the present study aims to develop such an expected value through time series modeling. The model was developed using eight years of inspection data (2003 to 2010) obtained at 2 abattoirs of the Higashi-Mokoto Meat Inspection Center, Japan. The resulting model was validated by comparing the predicted time-dependent values for the subsequent 2 years with the actual data for the 2 years between 2011 and 2012. For the modeling, periodicities were first checked using the Fast Fourier Transform, and the ensemble average profiles for weekly periodicities were calculated. An Auto-Regressive Integrated Moving Average (ARIMA) model was fitted to the residual of the ensemble average on the basis of the minimum Akaike information criterion (AIC). The sum of the ARIMA model and the weekly ensemble average was regarded as the time-dependent expected value. During 2011 and 2012, the number of wholly or partially condemned carcasses exceeded the 95% confidence interval of the predicted values 20 times. All of these events were associated with the slaughtering of pigs from the three producers with the highest rate of condemnation due to mycobacteriosis.
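
    A sketch of the described pipeline on synthetic counts (weekly ensemble average plus an ARIMA model on the residual, with the upper 95% band as the alarm threshold); the data and ARIMA order are illustrative, and statsmodels is assumed:

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(0)
      weeks, days = 104, 7
      counts = rng.poisson(5, size=weeks * days).astype(float)
      counts += np.tile([0.0, 2.0, 2.0, 1.0, 1.0, 0.0, 0.0], weeks)  # weekly cycle

      # weekly ensemble average profile (periodicity found e.g. via the FFT)
      profile = counts.reshape(weeks, days).mean(axis=0)
      residual = counts - np.tile(profile, weeks)

      # in practice the ARIMA order would be chosen by minimum AIC
      fit = ARIMA(residual, order=(1, 0, 1)).fit()
      fc = fit.get_forecast(steps=14)
      expected = np.tile(profile, 2)[:14] + fc.predicted_mean
      upper95 = np.tile(profile, 2)[:14] + fc.conf_int(alpha=0.05)[:, 1]
      print(np.round(expected, 1))
      print(np.round(upper95, 1))  # alarm when observed condemnations exceed this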

  9. A New Minimum Trees-Based Approach for Shape Matching with Improved Time Computing: Application to Graphical Symbols Recognition

    NASA Astrophysics Data System (ADS)

    Franco, Patrick; Ogier, Jean-Marc; Loonis, Pierre; Mullot, Rémy

    Recently we have developed a model for shape description and matching. Based on minimum spanning tree construction and specific stages like the mixture, it seems to have many desirable properties. Recognition invariance under shifted, rotated and noisy shapes was checked through medium-scale tests on the GREC symbol reference database. Even if extracting the topology of a shape by mapping the shortest path connecting all the pixels seems to be powerful, the construction of the graph induces an expensive algorithmic cost. In this article we discuss ways to reduce computing time. An alternative solution based on image compression concepts is provided and evaluated. The model no longer operates in the image space but in a compact space, namely the Discrete Cosine space. The use of the block discrete cosine transform is discussed and justified. The experimental results obtained on the GREC2003 database show that the proposed method is characterized by a good discrimination power, real robustness to noise, and an acceptable computing time.
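
    The compact-space idea can be sketched with a block discrete cosine transform: each tile of the shape image is reduced to its low-frequency 2-D DCT coefficients, and matching then operates on this signature. The block size and number of retained coefficients below are arbitrary choices, not the paper's settings.

      import numpy as np
      from scipy.fft import dctn

      def block_dct_signature(img, block=8, keep=4):
          """Keep the top-left keep x keep DCT-II coefficients of every
          block x block tile; returns one concatenated feature vector."""
          h, w = (d - d % block for d in img.shape)
          sig = []
          for i in range(0, h, block):
              for j in range(0, w, block):
                  coeffs = dctn(img[i:i + block, j:j + block], norm="ortho")
                  sig.append(coeffs[:keep, :keep].ravel())
          return np.concatenate(sig)

      shape = np.zeros((64, 64)); shape[16:48, 24:40] = 1.0  # toy binary symbol
      print(block_dct_signature(shape).shape)  # 64 tiles x 16 coefficients each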

  10. NASTRAN data generation of helicopter fuselages using interactive graphics. [preprocessor system for finite element analysis using IBM computer

    NASA Technical Reports Server (NTRS)

    Sainsbury-Carter, J. B.; Conaway, J. H.

    1973-01-01

    The development and implementation of a preprocessor system for the finite element analysis of helicopter fuselages is described. The system utilizes interactive graphics for the generation, display, and editing of NASTRAN data for fuselage models. It is operated from an IBM 2250 cathode ray tube (CRT) console driven by an IBM 370/145 computer. Real-time interaction plus automatic data generation reduces the nominal 6 to 10 week time for manual generation and checking of data to a few days. The interactive graphics system consists of a series of satellite programs operated from a central NASTRAN Systems Monitor. Fuselage structural models including the outer shell and internal structure may be rapidly generated. All numbering systems are automatically assigned. Hard copy plots of the model labeled with GRID or element IDs are also available. General purpose programs for displaying and editing NASTRAN data are included in the system. Utilization of the NASTRAN interactive graphics system has made possible the multiple finite element analysis of complex helicopter fuselage structures within design schedules.

  11. Robust Linear Models for Cis-eQTL Analysis.

    PubMed

    Rantalainen, Mattias; Lindgren, Cecilia M; Holmes, Christopher C

    2015-01-01

    Expression Quantitative Trait Loci (eQTL) analysis enables characterisation of functional genetic variation influencing expression levels of individual genes. In outbred populations, including humans, eQTLs are commonly analysed using the conventional linear model, adjusting for relevant covariates, assuming an allelic dosage model and a Gaussian error term. However, gene expression data generally have noise that induces heavy-tailed errors relative to the Gaussian distribution and often include atypical observations, or outliers. Such departures from modelling assumptions can lead to an increased rate of type II errors (false negatives), and to some extent also type I errors (false positives). Careful model checking can reduce the risk of type I errors but often not type II errors, since it is generally too time-consuming to carefully check all models with a non-significant effect in large-scale and genome-wide studies. Here we propose the application of a robust linear model for eQTL analysis to reduce adverse effects of deviations from the assumption of Gaussian residuals. We present results from a simulation study as well as results from the analysis of real eQTL data sets. Our findings suggest that in many situations robust models have the potential to provide more reliable eQTL results compared to conventional linear models, particularly in respect to reducing type II errors due to non-Gaussian noise. Post-genomic data, such as that generated in genome-wide eQTL studies, are often noisy and frequently contain atypical observations. Robust statistical models have the potential to provide more reliable results and increased statistical power under non-Gaussian conditions. The results presented here suggest that robust models should be considered routinely alongside other commonly used methodologies for eQTL analysis.
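
    The contrast between the conventional and robust fits can be sketched with statsmodels on simulated data with heavy-tailed noise and a few outliers (effect size and sample size are arbitrary):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 200
      genotype = rng.integers(0, 3, n).astype(float)        # allelic dosage 0/1/2
      expression = 0.3 * genotype + rng.standard_t(df=3, size=n)
      expression[:5] += 8.0                                  # a few outliers

      X = sm.add_constant(genotype)
      ols = sm.OLS(expression, X).fit()
      rlm = sm.RLM(expression, X, M=sm.robust.norms.HuberT()).fit()
      print(f"OLS beta={ols.params[1]:.3f}  p={ols.pvalues[1]:.3g}")
      print(f"RLM beta={rlm.params[1]:.3f}  p={rlm.pvalues[1]:.3g}")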

  12. A Swiss cheese error detection method for real-time EPID-based quality assurance and error prevention.

    PubMed

    Passarge, Michelle; Fix, Michael K; Manser, Peter; Stampanoni, Marco F M; Siebers, Jeffrey V

    2017-04-01

    To develop a robust and efficient process that detects relevant dose errors (dose errors of ≥5%) in external beam radiation therapy and directly indicates the origin of the error. The process is illustrated in the context of electronic portal imaging device (EPID)-based angle-resolved volumetric-modulated arc therapy (VMAT) quality assurance (QA), particularly as would be implemented in a real-time monitoring program. A Swiss cheese error detection (SCED) method was created as a paradigm for a cine EPID-based during-treatment QA. For VMAT, the method compares a treatment plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The process utilizes a sequence of independent consecutively executed error detection tests: an aperture check that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment check to examine if rotation, scaling, and translation are within tolerances; pixel intensity check containing the standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each check were determined. To test the SCED method, 12 different types of errors were selected to modify the original plan. A series of angle-resolved predicted EPID images were artificially generated for each test case, resulting in a sequence of precalculated frames for each modified treatment plan. The SCED method was applied multiple times for each test case to assess the ability to detect introduced plan variations. To compare the performance of the SCED process with that of a standard gamma analysis, both error detection methods were applied to the generated test cases with realistic noise variations. Averaged over ten test runs, 95.1% of all plan variations that resulted in relevant patient dose errors were detected within 2° and 100% within 14° (<4% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 89.1% were detected by the SCED method within 2°. Based on the type of check that detected the error, determination of error sources was achieved. With noise ranging from no random noise to four times the established noise value, the averaged relevant dose error detection rate of the SCED method was between 94.0% and 95.8% and that of gamma between 82.8% and 89.8%. An EPID-frame-based error detection process for VMAT deliveries was successfully designed and tested via simulations. The SCED method was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of relevant dose errors. Compared to a typical (3%, 3 mm) gamma analysis, the SCED method produced a higher detection rate for all introduced dose errors, identified errors in an earlier stage, displayed a higher robustness to noise variations, and indicated the error source. © 2017 American Association of Physicists in Medicine.
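
    The layered structure of the method can be sketched as a chain of independent checks in which the first failing layer names the error source; the check functions and tolerances below are hypothetical stand-ins, not the published ones:

      def swiss_cheese_qa(frame, reference, checks):
          """Run consecutive independent checks on one EPID frame; the first
          failing layer identifies the error source ('Swiss cheese' idea)."""
          for name, passes in checks:
              if not passes(frame, reference):
                  return f"FAIL at layer: {name}"
          return "PASS"

      checks = [
          ("aperture",    lambda f, r: abs(f["field_px"] - r["field_px"]) <= 3
                                       and f["out_of_field_dose"] < 0.01),
          ("output norm", lambda f, r: abs(f["output"] / r["output"] - 1.0) < 0.03),
          ("alignment",   lambda f, r: abs(f["shift_mm"]) < 2.0),
          ("pixel gamma", lambda f, r: f["gamma_pass_rate"] > 0.95),
      ]
      frame = {"field_px": 120, "out_of_field_dose": 0.0, "output": 1.06,
               "shift_mm": 0.4, "gamma_pass_rate": 0.99}
      reference = {"field_px": 121, "output": 1.00}
      print(swiss_cheese_qa(frame, reference, checks))  # -> FAIL at layer: output norm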

  13. Bearing tester data compilation, analysis, and reporting and bearing math modeling

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A test condition data base was developed for the Bearing and Seal Materials Tester (BSMT) program which permits rapid retrieval of test data for trend analysis and evaluation. A model was developed for the Space shuttle Main Engine (SSME) Liquid Oxygen (LOX) turbopump shaft/bearing system. The model was used to perform parametric analyses to determine the sensitivity of bearing operating characteristics and temperatures to variations in: axial preload, contact friction, coolant flow and subcooling, heat transfer coefficients, outer race misalignments, and outer race to isolator clearances. The bearing program ADORE (Advanced Dynamics of Rolling Elements) was installed on the UNIVAC 1100/80 computer system and is operational. ADORE is an advanced FORTRAN computer program for the real time simulation of the dynamic performance of rolling bearings. A model of the 57 mm turbine-end bearing is currently being checked out. Analyses were conducted to estimate flow work energy for several flow diverter configurations and coolant flow rates for the LOX BSMT.

  14. 75 FR 43801 - Airworthiness Directives; Eurocopter France (ECF) Model EC225LP Helicopters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... time. Also, we use inspect rather than check when referring to an action required by a mechanic as... the various levels of government. Therefore, I certify this AD: 1. Is not a "significant regulatory...

  15. Cooperative multi-user detection and ranging based on pseudo-random codes

    NASA Astrophysics Data System (ADS)

    Morhart, C.; Biebl, E. M.

    2009-05-01

    We present an improved approach for a Round Trip Time of Flight distance measurement system. The system is intended for use in a cooperative localisation system for automotive applications. Therefore, it is designed to address a large number of communication partners per measurement cycle. By using coded signals in a time division multiple access scheme, we can detect a large number of pedestrian sensors with just one car sensor. We achieve this by using very short transmit bursts in combination with a real-time correlation algorithm. Furthermore, the correlation approach offers real-time time-of-arrival data that can serve as a trigger impulse for other communication systems. The distance accuracy of the correlation result was further increased by adding a Fourier interpolation filter. The system performance was checked with a prototype at 2.4 GHz. We reached a distance measurement accuracy of 12 cm at ranges up to 450 m.
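
    The core of such a system is a sliding correlator: the receiver correlates incoming samples against the known pseudo-random code and reads the round-trip delay off the correlation peak. The sketch below uses hypothetical rates and a toy channel; the prototype's Fourier interpolation for sub-sample accuracy is omitted.

      import numpy as np

      rng = np.random.default_rng(2)
      chip_rate = 10e6                           # chips/s, hypothetical
      fs = 4.0 * chip_rate                       # receiver sample rate
      code = rng.choice([-1.0, 1.0], size=255)   # bipolar pseudo-random code
      tx = np.repeat(code, int(fs / chip_rate))  # oversampled transmit burst

      true_delay = 537                           # samples; unknown to the receiver
      rx = np.zeros(tx.size + 2000)
      rx[true_delay:true_delay + tx.size] += tx
      rx += 0.5 * rng.standard_normal(rx.size)   # channel noise

      corr = np.correlate(rx, tx, mode="valid")  # the sliding correlator
      delay = int(np.argmax(corr))
      c = 299_792_458.0
      print(delay, "samples ->", 0.5 * delay / fs * c, "m one-way (round trip / 2)")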

  16. Protecting quantum memories using coherent parity check codes

    NASA Astrophysics Data System (ADS)

    Roffe, Joschka; Headley, David; Chancellor, Nicholas; Horsman, Dominic; Kendon, Viv

    2018-07-01

    Coherent parity check (CPC) codes are a new framework for the construction of quantum error correction codes that encode multiple qubits per logical block. CPC codes have a canonical structure involving successive rounds of bit and phase parity checks, supplemented by cross-checks to fix the code distance. In this paper, we provide a detailed introduction to CPC codes using conventional quantum circuit notation. We demonstrate the implementation of a CPC code on real hardware, by designing a [[4, 2, 2]] code.
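
    As a plain-parity illustration of why the [[4, 2, 2]] code detects but cannot correct single errors, one can track X and Z errors as binary vectors and evaluate the two stabilizer parity checks classically; this bookkeeping sketch is not a full quantum simulation.

      import numpy as np

      def syndrome_422(x_err, z_err):
          """Detection syndrome of the [[4, 2, 2]] code. x_err, z_err mark X/Z
          errors on the 4 data qubits; stabilizer ZZZZ flags X errors and
          XXXX flags Z errors (each via a parity check)."""
          s_z = int(x_err.sum() % 2)   # outcome of the ZZZZ check
          s_x = int(z_err.sum() % 2)   # outcome of the XXXX check
          return s_x, s_z

      no_z = np.zeros(4, dtype=int)
      print(syndrome_422(np.array([0, 1, 0, 0]), no_z))  # (0, 1): error detected
      print(syndrome_422(np.array([1, 1, 0, 0]), no_z))  # (0, 0): a weight-2
      # error escapes detection, consistent with code distance 2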

  17. Stability analysis for a multi-camera photogrammetric system.

    PubMed

    Habib, Ayman; Detchev, Ivan; Kwak, Eunju

    2014-08-18

    Consumer-grade digital cameras suffer from geometrical instability that may cause problems when used in photogrammetric applications. This paper provides a comprehensive review of this issue of interior orientation parameter variation over time, it explains the common ways used for coping with the issue, and describes the existing methods for performing stability analysis for a single camera. The paper then points out the lack of coverage of stability analysis for multi-camera systems, suggests a modification of the collinearity model to be used for the calibration of an entire photogrammetric system, and proposes three methods for system stability analysis. The proposed methods explore the impact of the changes in interior orientation and relative orientation/mounting parameters on the reconstruction process. Rather than relying on ground truth in real datasets to check the system calibration stability, the proposed methods are simulation-based. Experiment results are shown, where a multi-camera photogrammetric system was calibrated three times, and stability analysis was performed on the system calibration parameters from the three sessions. The proposed simulation-based methods provided results that were compatible with a real-data based approach for evaluating the impact of changes in the system calibration parameters on the three-dimensional reconstruction.

  18. Stability Analysis for a Multi-Camera Photogrammetric System

    PubMed Central

    Habib, Ayman; Detchev, Ivan; Kwak, Eunju

    2014-01-01

    Consumer-grade digital cameras suffer from geometrical instability that may cause problems when used in photogrammetric applications. This paper provides a comprehensive review of this issue of interior orientation parameter variation over time, it explains the common ways used for coping with the issue, and describes the existing methods for performing stability analysis for a single camera. The paper then points out the lack of coverage of stability analysis for multi-camera systems, suggests a modification of the collinearity model to be used for the calibration of an entire photogrammetric system, and proposes three methods for system stability analysis. The proposed methods explore the impact of the changes in interior orientation and relative orientation/mounting parameters on the reconstruction process. Rather than relying on ground truth in real datasets to check the system calibration stability, the proposed methods are simulation-based. Experiment results are shown, where a multi-camera photogrammetric system was calibrated three times, and stability analysis was performed on the system calibration parameters from the three sessions. The proposed simulation-based methods provided results that were compatible with a real-data based approach for evaluating the impact of changes in the system calibration parameters on the three-dimensional reconstruction. PMID:25196012

  19. Investigation of energy transport within a pulse tube

    NASA Astrophysics Data System (ADS)

    Waldauf, A.; Schmauder, T.; Thürk, M.; Seidel, P.

    2002-05-01

    A compact Four-Valve Pulse Tube Refrigerator (FVPTR) in U-tube configuration without a reservoir has been built. At present, the cooler provides a minimum temperature of 32 K and 100 W of cooling power at 90 K with a nominal input power of 5.6 kW. Experiments were performed to study the special refrigeration mechanisms of the FVPTR. The highly instrumented system that includes gas temperature sensors, hot wire anemometers and pressure sensors is used to assess the p-V work and enthalpy flow at the key locations in the pulse tube. The experiments have enabled us to verify the various analytical models of the FVPTR. Based on the first law of thermodynamics for open systems we have estimated the gross refrigeration power for this special type of pulse tube refrigerator. Furthermore our model takes typical loss processes into consideration to analyze the real FVPTR process. These calculations need some assumptions about the real flow behavior and the time-dependent temperatures within the pulse tube. The accuracy of these assumptions will be checked by our experiments. By using these results a further technical improvement of our FVPTR should be possible.

  20. Simple Schlieren Light Meter

    NASA Technical Reports Server (NTRS)

    Rhodes, David B.; Franke, John M.; Jones, Stephen B.; Leighty, Bradley D.

    1992-01-01

    Simple light-meter circuit used to position knife edge of schlieren optical system to block exactly half light. Enables operator to check quickly position of knife edge between tunnel runs to ascertain whether or not in alignment. Permanent measuring system made part of each schlieren system. If placed in unused area of image plane, or in monitoring beam from mirror knife edge, provides real-time assessment of alignment of schlieren system.

  1. Design and analysis of surface plasmon resonance (SPR) sensor to check the quality of food from adulteration

    NASA Astrophysics Data System (ADS)

    Kumar, Manish; Raghuwanshi, Sanjeev Kumar

    2018-02-01

    In recent years, food safety issues caused by contamination with chemical substances or microbial species have become a major area of concern. Conventional chromatography-based methods for the detection of chemicals rely on human observation and are too slow for real-time monitoring. Surface plasmon resonance (SPR) sensors offer the capability of detecting very low concentrations of adulterant chemical and biological agents by real-time monitoring. An adulterant agent in food changes the refractive index of the pure food, resulting in a corresponding phase change. These changes can be detected at the output and can be related to the concentration of the chemical species present.
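
    In the common Kretschmann configuration the resonance condition is n_p sin(theta) = sqrt(eps_m eps_d / (eps_m + eps_d)), so a change in the sample index n_d shifts the resonance angle. The sketch below uses illustrative material constants, not the paper's design values.

      import numpy as np

      def spr_resonance_angle(n_prism, eps_metal, n_sample):
          """Kretschmann resonance angle in degrees; eps_metal is the (real part
          of the) metal permittivity and the dielectric is eps_d = n_sample**2."""
          eps_d = n_sample ** 2
          k = np.sqrt(eps_metal * eps_d / (eps_metal + eps_d))
          return np.degrees(np.arcsin(k / n_prism))

      # BK7 prism with gold near 633 nm; pure vs. slightly adulterated sample
      for n_d in (1.330, 1.345):
          print(n_d, "->", round(spr_resonance_angle(1.515, -11.7, n_d), 2), "deg")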

  2. Diagnostics in the Extendable Integrated Support Environment (EISE)

    NASA Technical Reports Server (NTRS)

    Brink, James R.; Storey, Paul

    1988-01-01

    Extendable Integrated Support Environment (EISE) is a real-time computer network consisting of commercially available hardware and software components to support systems level integration, modifications, and enhancement to weapons systems. The EISE approach offers substantial potential savings by eliminating unique support environments in favor of sharing common modules for the support of operational weapon systems. An expert system is being developed that will help support diagnosing faults in this network. This is a multi-level, multi-expert diagnostic system that uses experiential knowledge relating symptoms to faults and also reasons from structural and functional models of the underlying physical model when experiential reasoning is inadequate. The individual expert systems are orchestrated by a supervisory reasoning controller, a meta-level reasoner which plans the sequence of reasoning steps to solve the given specific problem. The overall system, termed the Diagnostic Executive, accesses systems level performance checks and error reports, and issues remote test procedures to formulate and confirm fault hypotheses.

  3. AMFESYS: Modelling and diagnosis functions for operations support

    NASA Technical Reports Server (NTRS)

    Wheadon, J.

    1993-01-01

    Packetized telemetry, combined with low station coverage for close-earth satellites, may introduce new problems in presenting to the operator a clear picture of what the spacecraft is doing. A recent ESOC study has gone some way to show, by means of a practical demonstration, how the use of subsystem models combined with artificial intelligence techniques, within a real-time spacecraft control system (SCS), can help to overcome these problems. A spin-off from using these techniques can be an improvement in the reliability of the telemetry (TM) limit-checking function, as well as the telecommand verification function, of the SCS. The problem and how it was addressed are described, including an overview of the 'AMF Expert System' prototype, and further work which needs to be done to prove the concept is proposed. The Automatic Mirror Furnace (AMF) is part of the payload of the European Retrievable Carrier (EURECA) spacecraft, which was launched in July 1992.

  4. Detection of Local Temperature Change on HTS Cables via Time-Frequency Domain Reflectometry

    NASA Astrophysics Data System (ADS)

    Bang, Su Sik; Lee, Geon Seok; Kwon, Gu-Young; Lee, Yeong Ho; Ji, Gyeong Hwan; Sohn, Songho; Park, Kijun; Shin, Yong-June

    2017-07-01

    High temperature superconducting (HTS) cables are drawing attention as transmission and distribution cables in the future grid, and related research on HTS cables has been actively conducted. As HTS cables have reached the demonstration stage, failures of cooling systems that induce the quench phenomenon in HTS cables have become a significant concern. Several diagnostics for HTS cables have been developed, but they still have limitations in their experimental setups. In this paper, a non-destructive diagnostic technique for the detection of the local temperature change point is proposed. Also, a simulation model of HTS cables with a local temperature change point is suggested to verify the proposed diagnosis. The performance of the diagnosis is checked by comparative analysis of the proposed simulation results and experimental results from a real-world HTS cable. It is expected that the suggested simulation model and diagnosis will contribute to the commercialization of HTS cables in the power grid.

  5. Preprocessing for Eddy Dissipation Rate and TKE Profile Generation

    NASA Technical Reports Server (NTRS)

    Zak, J. Allen; Rodgers, William G., Jr.; McKissick, Burnell T. (Technical Monitor)

    2001-01-01

    The Aircraft Vortex Spacing System (AVOSS), a set of algorithms to determine aircraft spacing according to wake vortex behavior prediction, requires turbulence profiles to appropriately determine arrival and departure aircraft spacing. The ambient atmospheric turbulence profile must always be produced, even if the result is an arbitrary (canned) profile. The original turbulence profile code was generated by North Carolina State University and used in a non-real-time environment in the past. All the input parameters could be carefully selected and screened prior to input. Since this code must run in real time using actual measurements in the field as input, it became imperative to begin a data checking and screening process as part of the real-time implementation. The process described herein is a step towards ensuring that the best possible turbulence profile is always provided to AVOSS. Data fill-ins, constant profiles and arbitrary profiles are used only as a last resort, but are essential to ensure uninterrupted application of AVOSS.

  6. Semantic Importance Sampling for Statistical Model Checking

    DTIC Science & Technology

    2014-10-18

    we implement SIS in a tool called osmosis and use it to verify a number of stochastic systems with rare events. Our results indicate that SIS reduces...background definitions and concepts. Section 4 presents SIS, and Section 5 presents our tool osmosis. In Section 6, we present our experiments and results... [Fig. 5: Architecture of osmosis]

  7. Long-term real-time structural health monitoring using wireless smart sensor

    NASA Astrophysics Data System (ADS)

    Jang, Shinae; Mensah-Bonsu, Priscilla O.; Li, Jingcheng; Dahal, Sushil

    2013-04-01

    Improving the safety and security of civil infrastructure has been a critical issue for decades since it plays a central role in the economics and politics of a modern society. Structural health monitoring of civil infrastructure using wireless smart sensor networks has recently emerged as a promising solution to increase structural reliability, enhance inspection quality, and reduce maintenance costs. Though the hardware and software frameworks for wireless smart sensors are well developed, a long-term real-time health monitoring strategy is still not available due to the lack of a systematic interface. In this paper, the Imote2 smart sensor platform is employed, and a graphical user interface for long-term real-time structural health monitoring has been developed in Matlab for the Imote2 platform. This computer-aided engineering platform enables control and visualization of measured data as well as a safety alarm feature based on modal property fluctuation. A new decision-making strategy to check safety is also developed and integrated in this software. Laboratory validation of the computer-aided engineering platform for the Imote2 on a truss bridge and a building structure has shown the potential of the interface for long-term real-time structural health monitoring.

  8. JPL/USC GAIM: Validating COSMIC and Ground-Based GPS Assimilation Results to Estimate Ionospheric Electron Densities

    NASA Astrophysics Data System (ADS)

    Komjathy, A.; Wilson, B.; Akopian, V.; Pi, X.; Mannucci, A.; Wang, C.

    2008-12-01

    We seem to be in the midst of a revolution in ionospheric remote sensing driven by the abundance of ground and space-based GPS receivers, new UV remote sensing satellites, and the advent of data assimilation techniques for space weather. In particular, the COSMIC 6-satellite constellation was launched in April 2006. COSMIC now provides unprecedented global coverage of GPS occultation measurements, each of which yields electron density information with unprecedented ~1 km vertical resolution. Calibrated measurements of ionospheric delay (total electron content or TEC) suitable for input into assimilation models are currently made available in near real-time (NRT) from COSMIC with a latency of 30 to 120 minutes. The University of Southern California (USC) and the Jet Propulsion Laboratory (JPL) have jointly developed a real-time Global Assimilative Ionospheric Model (GAIM) to monitor space weather, study storm effects, and provide ionospheric calibration for DoD customers and NASA flight projects. JPL/USC GAIM is a physics-based 3D data assimilation model that uses both 4DVAR and Kalman filter techniques to solve for the ion and electron density state and key drivers such as equatorial electrodynamics, neutral winds, and production terms. Daily (delayed) GAIM runs can accept as input ground GPS TEC data from 1200+ sites, occultation links from CHAMP, SAC-C, and the COSMIC constellation, UV limb and nadir scans from the TIMED and DMSP satellites, and in situ data from a variety of satellites (DMSP and C/NOFS). Real-Time GAIM (RTGAIM) ingests multiple data sources in real time, updates the 3D electron density grid every 5 minutes, and solves for improved drivers every 1-2 hours. Since our forward physics model and the adjoint model were expressly designed for data assimilation and computational efficiency, all of this can be accomplished on a single dual-processor Unix workstation. Customers are currently evaluating the accuracy of JPL/USC GAIM 'nowcasts' for ray tracing applications and trans-ionospheric path delay calibration. In the presentation, we will discuss the expected impact of NRT COSMIC occultation and NRT ground-based measurements and present validation results for ingest of COSMIC data into GAIM using measurements from World Days. We will quality-check our COSMIC-derived products by comparing Abel profiles and JPL-processed results. Furthermore, we will validate GAIM assimilation results using Incoherent Scatter Radar measurements from the Arecibo, Jicamarca and Millstone Hill datasets. We will conclude by characterizing the improved electron density states using dual-frequency altimeter-derived Jason vertical TEC measurements.

  9. Using inferential sensors for quality control of Everglades Depth Estimation Network water-level data

    USGS Publications Warehouse

    Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul

    2016-09-29

    The Everglades Depth Estimation Network (EDEN), with over 240 real-time gaging stations, provides hydrologic data for freshwater and tidal areas of the Everglades. These data are used to generate daily water-level and water-depth maps of the Everglades that are used to assess biotic responses to hydrologic change resulting from the U.S. Army Corps of Engineers Comprehensive Everglades Restoration Plan. The generation of EDEN daily water-level and water-depth maps is dependent on high quality real-time data from water-level stations. Real-time data are automatically checked for outliers by assigning minimum and maximum thresholds for each station. Small errors in the real-time data, such as gradual drift of malfunctioning pressure transducers, are more difficult to immediately identify with visual inspection of time-series plots and may only be identified during on-site inspections of the stations. Correcting these small errors in the data often is time consuming and water-level data may not be finalized for several months. To provide daily water-level and water-depth maps on a near real-time basis, EDEN needed an automated process to identify errors in water-level data and to provide estimates for missing or erroneous water-level data.The Automated Data Assurance and Management (ADAM) software uses inferential sensor technology often used in industrial applications. Rather than installing a redundant sensor to measure a process, such as an additional water-level station, inferential sensors, or virtual sensors, were developed for each station that make accurate estimates of the process measured by the hard sensor (water-level gaging station). The inferential sensors in the ADAM software are empirical models that use inputs from one or more proximal stations. The advantage of ADAM is that it provides a redundant signal to the sensor in the field without the environmental threats associated with field conditions at stations (flood or hurricane, for example). In the event that a station does malfunction, ADAM provides an accurate estimate for the period of missing data. The ADAM software also is used in the quality assurance and quality control of the data. The virtual signals are compared to the real-time data, and if the difference between the two signals exceeds a certain tolerance, corrective action to the data and (or) the gaging station can be taken. The ADAM software is automated so that, each morning, the real-time EDEN data are compared to the inferential sensor signals and digital reports highlighting potential erroneous real-time data are generated for appropriate support personnel. The development and application of inferential sensors is easily transferable to other real-time hydrologic monitoring networks.
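
    A minimal sketch of the inferential-sensor idea, assuming a linear virtual sensor fitted from two proximal stations (tolerance and drift values are hypothetical):

      import numpy as np

      rng = np.random.default_rng(3)
      t = np.arange(2000)
      seasonal = np.sin(2.0 * np.pi * t / 365.0)
      nbr1 = 2.0 + 0.3 * seasonal + 0.02 * rng.standard_normal(t.size)
      nbr2 = 1.8 + 0.3 * seasonal + 0.02 * rng.standard_normal(t.size)
      target = 0.6 * nbr1 + 0.4 * nbr2 + 0.05        # "true" station water level

      # fit the virtual sensor on a training window of clean record
      X = np.column_stack([np.ones(t.size), nbr1, nbr2])
      coef, *_ = np.linalg.lstsq(X[:1500], target[:1500], rcond=None)

      # flag days where the real gage drifts away from its virtual twin
      measured = target.copy()
      measured[1800:] += 0.08                        # simulated transducer drift
      virtual = X @ coef
      flags = np.abs(measured - virtual) > 0.05      # tolerance, hypothetical
      print("first flagged day:", int(np.argmax(flags)))  # -> 1800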

  10. Technical note: A new wedge-shaped ionization chamber component module for BEAMnrc to model the integral quality monitoring system®

    NASA Astrophysics Data System (ADS)

    Oderinde, Oluwaseyi Michael; du Plessis, FCP

    2017-12-01

    The purpose of this study was to develop a new component module (CM), namely IQM, to accurately model the integral quality monitoring (IQM) system® for use in the BEAMnrc Monte Carlo (MC) code. The IQM is essentially a double-wedge ionization chamber with a central electrode plate bisecting the wedge. The IQM CM allows the user to characterize the double wedge of this ionization chamber, and BEAMnrc can then accurately calculate the dose in this CM, including its enclosed air regions. This has been verified against measured data. The newly created CM was added to the standard BEAMnrc CMs, and it will be made available through the NRCC website. The BEAMnrc graphical user interface (GUI) and particle ray-tracing techniques were used to validate the IQM geometry. In subsequent MC simulations, the dose scored in the IQM was verified against measured data over a range of square fields from 1 × 1 to 30 × 30 cm2. The IQM system is designed for the present-day need for a device that can verify beam output in real time during treatment. This CM is faithful to the device geometry, and it can serve as a basis for researchers interested in real-time beam delivery checking using wedge-shaped ionization chamber based instruments like the IQM.

  11. TerraSAR-X precise orbit determination with real-time GPS ephemerides

    NASA Astrophysics Data System (ADS)

    Wermuth, Martin; Hauschild, Andre; Montenbruck, Oliver; Kahle, Ralph

    TerraSAR-X is a German Synthetic Aperture Radar (SAR) satellite, which was launched in June 2007 from Baikonur. Its task is to acquire radar images of the Earth's surface. In order to locate the radar data takes precisely, the satellite is equipped with a high-quality dual-frequency GPS receiver, the Integrated Geodetic and Occultation Receiver (IGOR), provided by the GeoForschungsZentrum Potsdam (GFZ). Using GPS observations from the IGOR instrument in a reduced-dynamic precise orbit determination (POD), the German Space Operations Center (DLR/GSOC) is computing rapid and science orbit products on a routine basis. The rapid orbit products arrive with a latency of about one hour after data reception and an accuracy of 10-20 cm. Science orbit products are computed with a latency of five days, achieving an accuracy of about 5 cm (3D RMS). For active and future Earth observation missions, the availability of near real-time precise orbit information is becoming more and more important. Other applications of near real-time orbit products include the processing of GNSS radio occultation measurements for atmospheric sounding, as well as altimeter measurements of ocean surface heights, which are nowadays fed into global weather and ocean circulation models with short latencies. For example, after natural disasters it is necessary to evaluate the damage by satellite images as soon as possible. The latency and quality of POD results are mainly driven by the availability of precise GPS ephemerides. In order to have high-quality GPS ephemerides available in real time, GSOC has developed the real-time clock estimation system RETICLE. The system receives NTRIP data streams with GNSS observations from the global tracking network of the IGS in real time. Using the known station positions, RETICLE estimates precise GPS satellite clock offsets and drifts based on the most recent available IGU predicted orbits. The clock offset estimates have an accuracy of better than 0.3 ns and are globally valid. The latency of the estimated clocks is approximately 7 seconds. Another limiting factor is the frequency of satellite downlinks and the latency of the data transfer from the ground station to the computation center. Therefore, a near real-time scenario is examined in which the satellite has about one ground-station contact per orbit, that is, one contact per 90 minutes. The results of the near real-time POD are evaluated in an internal consistency check and compared against the science orbit solution and laser ranging observations.
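
    The offset-and-drift clock model at the heart of such estimation can be illustrated with a toy least-squares fit; the epochs, values, and noise level below are invented, and RETICLE itself runs a real-time estimator over a global station network rather than a batch fit:

    ```python
    import numpy as np

    # Toy example: recover a satellite clock offset [ns] and drift [ns/s]
    # from a short window of epoch-wise clock observations (invented data).
    rng = np.random.default_rng(1)
    t = np.arange(0.0, 30.0, 1.0)                            # epochs [s]
    obs = 12.0 + 0.01 * t + 0.3 * rng.normal(size=t.size)    # offset + drift + noise

    A = np.column_stack([np.ones_like(t), t])                # design matrix [1, t]
    (offset, drift), *_ = np.linalg.lstsq(A, obs, rcond=None)
    print(f"offset ~ {offset:.2f} ns, drift ~ {drift:.4f} ns/s")
    ```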

  12. Quantification of Campylobacter spp. in pig feces by direct real-time PCR with an internal control of extraction and amplification.

    PubMed

    Leblanc-Maridor, Mily; Garénaux, Amélie; Beaudeau, François; Chidaine, Bérangère; Seegers, Henri; Denis, Martine; Belloc, Catherine

    2011-04-01

    The rapid and direct quantification of Campylobacter spp. in complex substrates like feces or environmental samples is crucial to facilitate epidemiological studies on Campylobacter in pig production systems. We developed a real-time PCR assay for detecting and quantifying Campylobacter spp. directly in pig feces with the use of an internal control. Campylobacter spp. and Yersinia ruckeri primer-probe sets were designed and checked for specificity against diverse Campylobacter strains, related organisms, and other bacterial pathogens before being used on field samples. The quantification of Campylobacter spp. by real-time PCR was then performed on 531 fecal samples obtained from experimentally and naturally infected pigs; enumeration of Campylobacter on Karmali plates was done in parallel. Yersinia ruckeri, used as a bacterial internal control, was added to the samples before DNA extraction to control for DNA extraction and PCR amplification. The sensitivity of the PCR assay was 10 genome copies. The established Campylobacter real-time PCR assay showed a 7-log-wide linear dynamic range of quantification (R²=0.99) with a detection limit of 200 colony-forming units of Campylobacter per gram of feces. A high correlation was found between the results obtained by real-time PCR and those obtained by culture, at both the qualitative and quantitative levels. Moreover, DNA extraction followed by real-time PCR reduced the time needed for analysis to a few hours (within a working day). In conclusion, the real-time PCR assay developed in this study provides a new tool for further epidemiological surveys investigating the carriage and excretion of Campylobacter by pigs. Copyright © 2011 Elsevier B.V. All rights reserved.
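
    Quantification of this kind rests on a standard curve in which Cq is linear in log10(copy number); the paper's 7-log linear range and R² = 0.99 describe such a curve. A sketch with invented calibration values:

    ```python
    import numpy as np

    # Standard curve: Cq is linear in log10(copies) over the dynamic range.
    log10_copies = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)   # 10 .. 10^7 copies
    cq = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3, 12.9])     # invented Cq values

    slope, intercept = np.polyfit(log10_copies, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0    # ~1.0 means 100% PCR efficiency

    def copies_from_cq(cq_sample):
        """Invert the standard curve to quantify an unknown sample."""
        return 10 ** ((cq_sample - intercept) / slope)

    print(f"PCR efficiency ~ {efficiency:.0%}")
    print(f"sample with Cq 21.5 ~ {copies_from_cq(21.5):.0f} genome copies")
    ```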

  13. Population pharmacokinetics of tacrolimus in paediatric systemic lupus erythematosus based on real-world study.

    PubMed

    Wang, D-D; Lu, J-M; Li, Q; Li, Z-P

    2018-05-15

    Different population pharmacokinetic (PPK) models of tacrolimus have been established in various populations. However, a tacrolimus PPK model in paediatric systemic lupus erythematosus (PSLE) is still lacking. This study aimed to establish a tacrolimus PPK model in Chinese patients with PSLE. A total of nineteen Chinese patients with PSLE from a real-world study were characterized with nonlinear mixed-effects modelling (NONMEM). The impact of demographic features, biological characteristics, and concomitant medications was evaluated. Model validation was assessed by bootstrap and prediction-corrected visual predictive check (VPC). A one-compartment model with first-order absorption and elimination was determined to be the most suitable model for PSLE. The typical values of apparent oral clearance (CL/F) and apparent volume of distribution (V/F) in the final model were 2.05 L/h and 309 L, respectively. Methylprednisolone and simvastatin were identified as significant covariates. The first validated tacrolimus PPK model in patients with PSLE is presented. © 2018 John Wiley & Sons Ltd.
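
    The reported structural model can be written down directly. A sketch using the published typical values CL/F = 2.05 L/h and V/F = 309 L; the dose and the absorption rate constant ka below are illustrative assumptions, since ka is not given in the abstract:

    ```python
    import numpy as np

    CL_F = 2.05        # apparent oral clearance [L/h] (typical value from the model)
    V_F = 309.0        # apparent volume of distribution [L] (typical value)
    ka = 4.5           # absorption rate constant [1/h] -- assumed, not reported
    ke = CL_F / V_F    # first-order elimination rate constant [1/h]

    def concentration(t, dose_mg):
        """One-compartment, first-order absorption and elimination:
        plasma concentration [mg/L] after a single oral dose."""
        return (dose_mg * ka / (V_F * (ka - ke))) * (np.exp(-ke * t) - np.exp(-ka * t))

    t = np.linspace(0, 12, 49)              # hours after dosing
    c = concentration(t, dose_mg=2.0)       # e.g., an illustrative 2 mg dose
    print(f"ke = {ke:.4f} 1/h, Cmax ~ {c.max() * 1000:.1f} ng/mL")
    ```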

  14. Credit Card Fraud Detection: A Realistic Modeling and a Novel Learning Strategy.

    PubMed

    Dal Pozzolo, Andrea; Boracchi, Giacomo; Caelen, Olivier; Alippi, Cesare; Bontempi, Gianluca

    2017-09-14

    Detecting frauds in credit card transactions is perhaps one of the best testbeds for computational intelligence algorithms. In fact, this problem involves a number of relevant challenges, namely: concept drift (customers' habits evolve and fraudsters change their strategies over time), class imbalance (genuine transactions far outnumber frauds), and verification latency (only a small set of transactions are checked by investigators in a timely fashion). However, the vast majority of learning algorithms that have been proposed for fraud detection rely on assumptions that hardly hold in a real-world fraud-detection system (FDS). This lack of realism concerns two main aspects: 1) the way and timing with which supervised information is provided and 2) the measures used to assess fraud-detection performance. This paper has three major contributions. First, we propose, with the help of our industrial partner, a formalization of the fraud-detection problem that realistically describes the operating conditions of FDSs that analyze massive streams of credit card transactions every day. We also illustrate the most appropriate performance measures to be used for fraud-detection purposes. Second, we design and assess a novel learning strategy that effectively addresses class imbalance, concept drift, and verification latency. Third, in our experiments, we demonstrate the impact of class imbalance and concept drift in a real-world data stream containing more than 75 million transactions, authorized over a time window of three years.

  15. Sediment trapping efficiency of adjustable check dam in laboratory and field experiment

    NASA Astrophysics Data System (ADS)

    Wang, Chiang; Chen, Su-Chin; Lu, Sheng-Jui

    2014-05-01

    Check dams are constructed in mountain areas to block debris flows, but they fill up after several events and lose their trapping function. For this reason, the main facility in our research is an adjustable steel slit check dam, which has the advantages of fast construction and easy removal or adjustment of its function: transverse beams can be removed to drain sediments off and maintain channel continuity. We constructed an adjustable steel slit check dam on the Landow torrent, Huisun Experimental Forest Station, as the prototype to compare with a model in the laboratory. In the laboratory experiments, Froude-number similarity was used to design the dam model (see the scaling sketch below). The main comparisons focused on the types of sediment trapping and removal, sediment discharge, and the trapping rate of the slit check dam. Different ways of removing the transverse beams produced different kinds of sediment removal and differences in the sediment-removal rate and particle-size distribution. The sediment discharge of the check dam with beams is about 40%-80% of that of the check dam without beams. Furthermore, the spacing of the beams is a considerable factor in the sediment discharge. In the field experiment, this research used time-lapse photography to record the adjustable steel slit check dam on the Landow torrent. Typhoon Soulik produced rainfall of 600 mm in eight hours and induced a debris flow in the Landow torrent. The time-lapse images demonstrated that, after several sediment-transport events, the adjustable steel slit check dam was buried by the debris flow. The results of the laboratory and field experiments are: (1) the adjustable check dam could trap boulders, stop woody debris flows, and flush out fine sediment to supply the needs of the downstream river; (2) the sediment-trapping efficiency of the adjustable check dam with transverse beams was significantly improved; (3) the check dam without transverse beams can remove the sediment and maintain ecosystem continuity.
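
    Froude-number similarity reduces to a few scale relations; the 1:20 length ratio below is an assumed example, not the ratio used in the study:

    ```python
    # Froude similarity: keeping Fr = V / sqrt(g * L) equal in model and
    # prototype with a geometric length ratio L_r gives:
    #   velocity ratio  V_r = L_r ** 0.5
    #   time ratio      T_r = L_r ** 0.5
    #   discharge ratio Q_r = L_r ** 2.5
    L_r = 20.0          # assumed prototype:model length ratio (illustrative)
    V_r = L_r ** 0.5    # ~ 4.47
    T_r = L_r ** 0.5    # ~ 4.47
    Q_r = L_r ** 2.5    # ~ 1789
    print(f"V_r = {V_r:.2f}, T_r = {T_r:.2f}, Q_r = {Q_r:.0f}")
    ```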

  16. The good, the bad and the dubious: VHELIBS, a validation helper for ligands and binding sites

    PubMed Central

    2013-01-01

    Background Many Protein Data Bank (PDB) users assume that the deposited structural models are of high quality but forget that these models are derived from the interpretation of experimental data. The accuracy of atom coordinates is not homogeneous between models or throughout the same model. To avoid basing a research project on a flawed model, we present a tool for assessing the quality of ligands and binding sites in crystallographic models from the PDB. Results The Validation HElper for LIgands and Binding Sites (VHELIBS) is software that aims to ease the validation of binding site and ligand coordinates for non-crystallographers (i.e., users with little or no crystallography knowledge). Using a convenient graphical user interface, it allows one to check how ligand and binding site coordinates fit to the electron density map. VHELIBS can use models from either the PDB or the PDB_REDO databank of re-refined and re-built crystallographic models. The user can specify threshold values for a series of properties related to the fit of coordinates to electron density (Real Space R, Real Space Correlation Coefficient and average occupancy are used by default). VHELIBS will automatically classify residues and ligands as Good, Dubious or Bad based on the specified limits. The user is also able to visually check the quality of the fit of residues and ligands to the electron density map and reclassify them if needed. Conclusions VHELIBS allows inexperienced users to examine the binding site and the ligand coordinates in relation to the experimental data. This is an important step to evaluate models for their fitness for drug discovery purposes such as structure-based pharmacophore development and protein-ligand docking experiments. PMID:23895374
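
    The default classification logic can be sketched as simple thresholding on the three default properties; the cut-off values below are placeholders, since VHELIBS lets the user set the thresholds and its actual rules may differ:

    ```python
    def classify(rsr, rscc, occupancy,
                 rsr_max=0.4, rscc_min=0.9, occ_min=1.0):
        """Classify a residue or ligand as Good/Dubious/Bad from its fit to
        the electron density. Threshold defaults are illustrative only."""
        checks = [rsr <= rsr_max, rscc >= rscc_min, occupancy >= occ_min]
        if all(checks):
            return "Good"
        if not any(checks):
            return "Bad"
        return "Dubious"

    print(classify(rsr=0.25, rscc=0.95, occupancy=1.0))   # -> Good
    print(classify(rsr=0.55, rscc=0.95, occupancy=1.0))   # -> Dubious
    ```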

  17. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.

  18. 76 FR 477 - Airworthiness Directives; Bombardier, Inc. Model CL-600-2A12 (CL-601) and CL-600-2B16 (CL-601-3A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-05

    ... to these aircraft if Bombardier Service Bulletin (SB) 601-0590 [Scheduled Maintenance Instructions... information: Challenger 601 Time Limits/Maintenance Checks, PSP 601-5, Revision 38, dated June 19, 2009. Challenger 601 Time Limits/Maintenance Checks, PSP 601A-5, Revision 34, dated June 19, 2009. Challenger 604...

  19. Bounded Parametric Model Checking for Elementary Net Systems

    NASA Astrophysics Data System (ADS)

    Knapik, Michał; Szreter, Maciej; Penczek, Wojciech

    Bounded Model Checking (BMC) is an efficient verification method for reactive systems. BMC has so far been applied to the verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time, that BMC can be extended to PRTECTL, a parametric extension of the existential version of CTL. To this end, we define a bounded semantics and a translation from PRTECTL to SAT. An implementation of the algorithm for Elementary Net Systems is presented, together with some experimental results.

  20. Efficient Translation of LTL Formulae into Buchi Automata

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Lerda, Flavio

    2001-01-01

    Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit-state model checker under development at the NASA Ames Research Center.

  1. Real-time supervisor system based on trinary logic to control experiments with behaving animals and humans.

    PubMed

    Kutz, D F; Marzocchi, N; Fattori, P; Cavalcanti, S; Galletti, C

    2005-06-01

    A new method based on trinary logic is presented that is able to check the state of different control variables and synchronously record the physiological and behavioral data of behaving animals and humans. The basic information structure of the method is a time interval of defined maximum duration, called a time slice, during which the supervisor system periodically checks the status of a specific subset of input channels. An experimental condition is a sequence of time slices executed one after another, each chosen according to the final status of the previous time slice. The proposed method implements in its data structure the possibility to branch like an if-else cascade and to repeat parts of itself recursively like a while-loop; its data structure therefore contains the most basic control structures of programming languages. The method was implemented using a real-time version of the LabVIEW programming environment to program and control our experimental setup. Using this supervision system, we synchronously record four analog data channels at 500 Hz (including eye movements) and the time stamps of up to six neurons at 100 kHz. The system reacts with a resolution within 1 ms to changes of state of digital input channels, and it is set to react to changes in eye position with a resolution within 4 ms. The time slices, experimental conditions, and data are handled by relational databases, which facilitates the construction of new experimental conditions and data analysis. The proposed implementation allows continuous recording without an inter-trial gap for data storage or task management, and it can be used to drive electrophysiological experiments with behaving animals and psychophysical studies with human subjects.
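
    A minimal sketch of the time-slice building block, assuming a trinary outcome encoding of -1/0/+1; the names and encoding are ours, and the real system runs under real-time LabVIEW rather than Python:

    ```python
    import time

    class TimeSlice:
        """One time slice: poll a subset of input channels for at most
        max_duration seconds; the trinary check() returns -1 (failure),
        0 (still pending), or +1 (success)."""
        def __init__(self, name, check, max_duration, on_success, on_failure):
            self.name, self.check = name, check
            self.max_duration = max_duration               # seconds
            self.on_success, self.on_failure = on_success, on_failure

        def run(self):
            """Return the name of the next slice to execute, so a condition
            can branch like an if-else cascade or loop like a while-loop."""
            start = time.monotonic()
            while time.monotonic() - start < self.max_duration:
                state = self.check()
                if state != 0:
                    return self.on_success if state == 1 else self.on_failure
                time.sleep(0.001)                          # ~1 ms polling resolution
            return self.on_failure                         # timeout counts as failure
    ```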

  2. The Worm Propagation Model with Dual Dynamic Quarantine Strategy

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Xie, Xiao-Wu; Guo, Hao; Gao, Fu-Xiang; Yu, Ge

    Internet worms are becoming more and more harmful with the rapid development of the Internet. Because of the extremely fast spread and great destructive power of network worms, strong dynamic quarantine strategies are necessary. Inspired by the real-world approach to the prevention and treatment of infectious diseases, this paper proposes a quarantine strategy based on a dynamic worm propagation model: the SIQRV dual quarantine model. This strategy dynamically quarantines both vulnerable and infected hosts and then releases them after a certain period of time, regardless of whether the security of the quarantined hosts has been checked. Through mathematical modeling, it is found that when the basic reproduction number R0 is less than a critical value, the system stabilizes in the disease-free equilibrium; that is, in theory, the infected hosts will become completely immune. Finally, comparison of the simulation results with the numerical analysis shows basic agreement between the two curves, supporting the validity of the mathematical model. Our future work will focus on taking both delay and the double-quarantine strategy into account and further expanding the scale of our simulation work.
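
    A generic quarantine-compartment simulation illustrates the threshold behaviour; this is not the paper's exact SIQRV system, whose equations are not reproduced in the abstract, and all rates below are invented:

    ```python
    # Generic worm-propagation sketch with quarantine (illustrative only).
    # S = susceptible, I = infected, Q = quarantined, R = recovered fractions.
    beta, gamma, q_rate, release = 0.5, 0.1, 0.3, 0.05
    S, I, Q, R = 0.99, 0.01, 0.0, 0.0
    dt = 0.01
    for _ in range(int(200 / dt)):                 # forward-Euler integration
        new_inf = beta * S * I
        dS = -new_inf + 0.5 * release * Q          # half of released hosts stay vulnerable
        dI = new_inf - gamma * I - q_rate * I
        dQ = q_rate * I - release * Q
        dR = gamma * I + 0.5 * release * Q
        S, I, Q, R = S + dS * dt, I + dI * dt, Q + dQ * dt, R + dR * dt

    # For this sketch, R0 = beta / (gamma + q_rate); the infection dies out
    # when R0 < 1, matching the threshold behaviour reported in the paper.
    print(f"R0 = {beta / (gamma + q_rate):.2f}, final infected fraction = {I:.4f}")
    ```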

  3. V/STOL propulsion control analysis: Phase 2, task 5-9

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Typical V/STOL propulsion control requirements were derived for transition between vertical and horizontal flight using the General Electric RALS (Remote Augmented Lift System) concept. Steady-state operating requirements were defined for a typical Vertical-to-Horizontal transition and for a typical Horizontal-to-Vertical transition. Control mode requirements were established and multi-variable regulators developed for individual operating conditions. Proportional/Integral gain schedules were developed and were incorporated into a transition controller with capabilities for mode switching and manipulated variable reassignment. A non-linear component-level transient model of the engine was developed and utilized to provide a preliminary check-out of the controller logic. An inlet and nozzle effects model was developed for subsequent incorporation into the engine model and an aircraft model was developed for preliminary flight transition simulations. A condition monitoring development plan was developed and preliminary design requirements established. The Phase 1 long-range technology plan was refined and restructured toward the development of a real-time high fidelity transient model of a supersonic V/STOL propulsion system and controller for use in a piloted simulation program at NASA-Ames.

  4. A Mobile, Collaborative, Real Time Task List for Inpatient Environments

    PubMed Central

    Ho, T.; Pelletier, A.; Al Ayubi, S.; Bourgeois, F.

    2015-01-01

    Summary Background Inpatient teams commonly track their tasks using paper checklists that are not shared between team members. Team members frequently communicate redundantly in order to prevent errors. Methods We created a mobile, collaborative, real-time task list application on the iOS platform. The application listed tasks for each patient, allowed users to check them off as completed, and transmitted that information to all other team members. In this report, we qualitatively describe our experience designing and piloting the application with an inpatient pediatric ward team at an academic pediatric hospital. Results We successfully created the tasklist application, however team members showed limited usage. Conclusion Physicians described that they preferred the immediacy and familiarity of paper, and did not experience an efficiency benefit when using the electronic tasklist. PMID:26767063

  5. Research on the adaptive optical control technology based on DSP

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaolu; Xue, Qiao; Zeng, Fa; Zhao, Junpu; Zheng, Kuixing; Su, Jingqin; Dai, Wanjun

    2018-02-01

    Adaptive optics is a real-time compensation technique that uses a high-speed control system to correct wavefront errors caused by atmospheric turbulence. However, the randomness and rapidity of atmospheric change introduce great difficulties into the design of adaptive optical systems: a large number of complex real-time operations lead to large delays, which is a hard problem to overcome. To solve this problem, a hardware-computation and parallel-processing strategy is proposed, and a high-speed adaptive optical control system based on a DSP is developed. A hardware counter is used to check the system timing. The results show that the system can complete one closed-loop control cycle in 7.1 ms, improving the control bandwidth of the adaptive optical system. Using this system, wavefront measurement and closed-loop experiments were carried out, with good results.

  6. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation

    PubMed Central

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses, since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why, in this study, we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results validate the simulation results well, which fully supports the proposed framework. PMID:26713449

  7. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation.

    PubMed

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses, since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why, in this study, we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results validate the simulation results well, which fully supports the proposed framework.

  8. Temporal Precedence Checking for Switched Models and its Application to a Parallel Landing Protocol

    NASA Technical Reports Server (NTRS)

    Duggirala, Parasara Sridhar; Wang, Le; Mitra, Sayan; Viswanathan, Mahesh; Munoz, Cesar A.

    2014-01-01

    This paper presents an algorithm for checking temporal precedence properties of nonlinear switched systems. This class of properties subsumes bounded safety and captures requirements about visiting a sequence of predicates within given time intervals. The algorithm handles nonlinear predicates that arise from dynamics-based predictions used in alerting protocols for state-of-the-art transportation systems. It is sound and complete for nonlinear switched systems that robustly satisfy the given property. The algorithm is implemented in the Compare Execute Check Engine (C2E2) using validated simulations. As a case study, a simplified model of an alerting system for closely spaced parallel runways is considered. The proposed approach is applied to this model to check safety properties of the alerting logic for different operating conditions such as initial velocities, bank angles, aircraft longitudinal separation, and runway separation.

  9. A dynamic motion simulator for future European docking systems

    NASA Technical Reports Server (NTRS)

    Brondino, G.; Marchal, PH.; Grimbert, D.; Noirault, P.

    1990-01-01

    Europe's first confrontation with docking in space will require extensive testing to verify design and performance and to qualify hardware. For this purpose, a Docking Dynamics Test Facility (DDTF) was developed. It allows reproduction on the ground of the same impact loads and relative motion dynamics that would occur in space during docking. It uses a 9-degree-of-freedom servo-motion system, controlled by a real-time computer, which simulates the docking spacecraft in a zero-g environment. The test technique involves an active loop based on six-axis force and torque detection, a mathematical simulation of individual spacecraft dynamics, and a 9-degree-of-freedom servo motion of which 3 DOFs allow extension of the kinematic range to 5 m. The configuration was checked out by closed-loop tests involving spacecraft control models and real sensor hardware. The test facility at present has an extensive configuration that allows evaluation of both proximity control and docking systems. It provides a versatile tool to verify system design, hardware items, and performance capabilities in the ongoing HERMES and COLUMBUS programs. The test system is described and its capabilities are summarized.

  10. System identification of a small low-cost unmanned aerial vehicle using flight data from low-cost sensors

    NASA Astrophysics Data System (ADS)

    Hoffer, Nathan Von

    Remote sensing has traditionally been done with satellites and manned aircraft. While these methods can yield useful scientific data, satellites and manned aircraft have limitations in data frequency, processing time, and real-time re-tasking. Small low-cost unmanned aerial vehicles (UAVs) provide greater possibilities for personal scientific research than traditional remote sensing platforms. Precision aerial data requires an accurate vehicle dynamics model for controller development, robust flight characteristics, and fault tolerance. One method of developing a model is system identification (system ID). In this thesis, system ID of a small low-cost fixed-wing T-tail UAV is conducted. The linearized longitudinal equations of motion are derived from first principles. Foundations of Recursive Least Squares (RLS) are presented, along with RLS with an Error Filtering Online Learning scheme (EFOL). Sensors, data collection, data consistency checking, and data processing are described. Batch least squares (BLS) and BLS with EFOL are used to identify aerodynamic coefficients of the UAV. Results of these two methods with flight data are discussed.
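
    The core recursive least squares update used in this kind of system identification can be sketched as follows; the regressor layout, forgetting factor, and toy example are generic, not the thesis's exact formulation:

    ```python
    import numpy as np

    class RecursiveLeastSquares:
        """Standard exponentially weighted RLS: theta minimizes the
        forgetting-factor-weighted squared prediction error."""
        def __init__(self, n_params, lam=0.99, p0=1e3):
            self.theta = np.zeros(n_params)     # parameter estimate
            self.P = np.eye(n_params) * p0      # inverse information matrix
            self.lam = lam                      # forgetting factor

        def update(self, phi, y):
            """phi: regressor vector, y: measured output."""
            Pphi = self.P @ phi
            k = Pphi / (self.lam + phi @ Pphi)          # gain vector
            err = y - phi @ self.theta                  # prediction error
            self.theta = self.theta + k * err
            self.P = (self.P - np.outer(k, Pphi)) / self.lam
            return self.theta

    # Toy example: identify y = 2*x1 - 0.5*x2 from noisy samples.
    rls = RecursiveLeastSquares(2)
    rng = np.random.default_rng(0)
    for _ in range(500):
        phi = rng.normal(size=2)
        y = 2.0 * phi[0] - 0.5 * phi[1] + 0.01 * rng.normal()
        rls.update(phi, y)
    print(rls.theta)    # ~ [2.0, -0.5]
    ```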

  11. Big data prediction of durations for online collective actions based on peak's timing

    NASA Astrophysics Data System (ADS)

    Nie, Shizhao; Wang, Zheng; Pujia, Wangmo; Nie, Yuan; Lu, Peng

    2018-02-01

    The Peak Model states that each collective action has a life cycle comprising four periods, "prepare", "outbreak", "peak", and "vanish", and that the peak determines the maximum energy and shapes the whole process. Re-simulation of the Peak Model indicates that there seems to be a stable ratio between the peak's timing (TP) and the total span (T), or duration, of collective actions, which needs further validation through empirical data. Therefore, daily big data on online collective actions is applied to validate the model; the key is to check the ratio between the peak's timing and the total span. The big data is obtained from online data recording and mining of websites. The empirical big data verifies that there is a stable ratio between TP and T; furthermore, it appears to be normally distributed. This rule holds both for the general case and for sub-types of collective actions. Given the distribution of the ratio, an estimated probability density function can be obtained, and therefore the span can be predicted via the peak's timing. Under a big-data scenario, the instant span (how long the collective action lasts, or when it ends) can be monitored and predicted in real time. With denser data, the estimate of the ratio's distribution becomes more robust, and the prediction of collective actions' spans or durations becomes more accurate.
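
    The prediction rule can be sketched directly: estimate the distribution of r = TP/T from historical events, then predict T = TP / r for a new event once its peak is observed. The sample ratios below are invented:

    ```python
    import numpy as np

    # Historical events: observed ratios r = TP / T (invented data).
    ratios = np.array([0.31, 0.28, 0.35, 0.30, 0.33, 0.29, 0.32])

    # Fit a normal distribution to the ratio, as the paper suggests.
    mu, sigma = ratios.mean(), ratios.std(ddof=1)

    def predict_span(tp_days):
        """Point prediction and a rough 95% interval for the total span T."""
        t_hat = tp_days / mu
        lo, hi = tp_days / (mu + 2 * sigma), tp_days / (mu - 2 * sigma)
        return t_hat, (lo, hi)

    print(predict_span(tp_days=3.0))    # e.g., peak observed on day 3
    ```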

  12. Safety analytics for integrating crash frequency and real-time risk modeling for expressways.

    PubMed

    Wang, Ling; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2017-07-01

    To find crash contributing factors, there have been numerous crash frequency and real-time safety studies, but such studies have been conducted independently. Until this point, no researcher has simultaneously analyzed crash frequency and real-time crash risk to test whether integrating them could better explain crash occurrence. Therefore, this study aims at integrating crash frequency and real-time safety analyses using expressway data. A Bayesian integrated model and a non-integrated model were built: the integrated model linked the crash frequency and real-time models by adding the logarithm of the estimated expected crash frequency to the real-time model, while the non-integrated model estimated the crash frequency and the real-time crash risk independently. The results showed that the integrated model outperformed the non-integrated model, as it provided much better model results for both the crash frequency and real-time models. This result indicated that the added component, the logarithm of the expected crash frequency, successfully linked the two models and provided useful information to them. This study also uncovered a few variables that are not typically included in crash frequency analysis. For example, the average daily standard deviation of speed, aggregated from speeds at 1-min intervals, had a positive effect on crash frequency. In conclusion, this study proposes a methodology to improve the crash frequency and real-time models by integrating them, and it might inspire future researchers to understand crash mechanisms better. Copyright © 2017 Elsevier Ltd. All rights reserved.
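
    The linkage can be illustrated with a logistic real-time model that carries the log of the expected crash frequency as an offset; this is a simplified sketch with invented numbers, not the paper's Bayesian formulation:

    ```python
    import numpy as np

    def real_time_crash_risk(x, beta, expected_freq):
        """Real-time model sketch: covariates x (e.g., the 1-min speed
        standard deviation) plus log(expected crash frequency) from the
        crash-frequency model, entering as an offset with coefficient 1."""
        eta = beta[0] + x @ beta[1:] + np.log(expected_freq)
        return 1.0 / (1.0 + np.exp(-eta))    # crash probability

    # Illustrative numbers only:
    beta = np.array([-6.0, 0.8])             # intercept, speed-s.d. effect
    p = real_time_crash_risk(np.array([1.5]), beta, expected_freq=2.3)
    print(f"real-time crash risk ~ {p:.4f}")
    ```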

  13. Real-time line matching from stereo images using a nonparametric transform of spatial relations and texture information

    NASA Astrophysics Data System (ADS)

    Park, Jonghee; Yoon, Kuk-Jin

    2015-02-01

    We propose a real-time line matching method for stereo systems. To achieve real-time performance while retaining a high level of matching precision, we first propose a nonparametric transform to represent the spatial relations between neighboring lines and nearby textures as a binary stream. Since the length of a line can vary across images, the matching costs between lines are computed within an overlap area (OA) based on the binary stream. The OA is determined for each line pair by employing the properties of a rectified image pair. Finally, the line correspondence is determined using a winner-takes-all method with a left-right consistency check. To reduce the computational time requirements further, we filter out unreliable matching candidates in advance based on their rectification properties. The performance of the proposed method was compared with state-of-the-art methods in terms of the computational time, matching precision, and recall. The proposed method required 47 ms to match lines from an image pair in the KITTI dataset with an average precision of 95%. We also verified the proposed method under image blur, illumination variation, and viewpoint changes.
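
    The nonparametric transform is in the spirit of a census transform: encode local intensity relations as a binary stream and match by Hamming distance. A minimal point-descriptor sketch (the paper's line-wise binary streams and overlap-area handling are omitted):

    ```python
    import numpy as np

    def census_descriptor(img, r, c, win=3):
        """Nonparametric transform: encode whether each neighbour is brighter
        than the centre pixel as one bit of a binary stream. Assumes the
        window stays inside the image bounds."""
        centre = img[r, c]
        bits = []
        for dr in range(-win, win + 1):
            for dc in range(-win, win + 1):
                if dr == 0 and dc == 0:
                    continue
                bits.append(1 if img[r + dr, c + dc] > centre else 0)
        return np.array(bits, dtype=np.uint8)

    def hamming_cost(d1, d2):
        """Matching cost between two binary descriptors (lower is better);
        winner-takes-all plus a left-right check would select the match."""
        return int(np.count_nonzero(d1 != d2))
    ```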

  14. Truth of Varying Shades: Analyzing Language in Fake News and Political Fact-Checking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rashkin, Hannah J.; Choi, Eunsol; Jang, Jin Yea

    We present an analytic study on the language of news media in the context of political fact-checking and fake news detection. We compare the language of real news with that of satire, hoaxes, and propaganda to find linguistic cues for untruthful text. To probe the feasibility of automatic political fact-checking, we present a case study based on PolitiFact.com using their factuality judgments on a 6-point scale. Experimental results show that while media fact-checking remains an open research question, stylistic cues can help determine the truthfulness of text.

  15. The TRIDEC Virtual Tsunami Atlas - customized value-added simulation data products for Tsunami Early Warning generated on compute clusters

    NASA Astrophysics Data System (ADS)

    Löwe, P.; Hammitzsch, M.; Babeyko, A.; Wächter, J.

    2012-04-01

    The development of new Tsunami Early Warning Systems (TEWS) requires the modelling of the spatio-temporal spreading of tsunami waves, both for recorded past events and for hypothetical future cases. The model results are maintained in digital repositories for use in TEWS command and control units for situation assessment once a real tsunami occurs. The simulation results must therefore be absolutely trustworthy, in the sense that the quality of these datasets is assured; this is a prerequisite, as solid decision making during a crisis event and the dissemination of dependable warning messages to communities at risk will be based on them. This requires data-format validity, but even more the integrity and information value of the content, which is a value-added product derived from raw tsunami model output. Quality checking of simulation result products can be done in multiple ways, yet the visual verification of both the temporal and the spatial spreading characteristics of each simulation remains important: the eye of the human observer is still an unmatched tool for the detection of irregularities. This requires the availability of convenient, human-accessible mappings of each simulation. The improvement of tsunami models necessitates changes in many variables, including simulation end-parameters. Whenever new, improved iterations of the general models or of the underlying spatial data are evaluated, hundreds to thousands of tsunami model results must be generated for each model iteration, each with distinct initial parameter settings. The use of a Compute Cluster Environment (CCE) of sufficient size allows the automated generation of all tsunami results within a model iteration in little time, a significant improvement over linear processing on dedicated desktop machines or servers. This allows for accelerated and improved visual quality-checking iterations, which in turn feed back positively into the overall model improvement. An approach to set up and utilize the CCE has been implemented by the project Collaborative, Complex, and Critical Decision Processes in Evolving Crises (TRIDEC), funded under the European Union's FP7. TRIDEC focuses on real-time intelligent information management in Earth management. The challenges addressed include the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulations, and data fusion tools. Additionally, TRIDEC adopts enhancements of Service Oriented Architecture (SOA) principles in terms of Event Driven Architecture (EDA) design. As a next step, the implemented CCE's services for generating derived and customized simulation products are foreseen to be provided via an EDA service for on-demand processing for specific threat parameters and to accommodate model improvements.

  16. Applying MDA to SDR for Space to Model Real-time Issues

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2007-01-01

    NASA space communications systems face the challenge of designing SDRs with highly constrained Size, Weight and Power (SWaP) resources. A study is being conducted to assess the effectiveness of applying the MDA Platform-Independent Model (PIM) and one or more Platform-Specific Models (PSMs) specifically to address NASA space-domain real-time issues. This paper summarizes our experiences with applying MDA to SDR for Space to model real-time issues. The real-time issues to be examined, measured, and analyzed are: meeting waveform timing requirements and efficiently applying Real-time Operating System (RTOS) scheduling algorithms, applying safety control measures, and SWaP verification. Real-time waveform algorithms benchmarked under worst-case environmental conditions and the heaviest workload will drive the SDR for Space real-time PSM design.

  17. Using State Merging and State Pruning to Address the Path Explosion Problem Faced by Symbolic Execution

    DTIC Science & Technology

    2014-06-19

    urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5–7]. These...bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based...assume there are n different paths through the function foo . The program could potentially call the function foo a bound number of times, resulting in n

  18. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this approach often lacks real-time data retrieval and sharing/interoperation capabilities. With the development of information technology, a Geospatial Service Web method is proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. To this end, a real-time GIS (Geographic Information System) data model and a Sensor Web service platform are proposed: the real-time GIS data model manages real-time data, and the Sensor Web service platform, built on Sensor Web technologies, is implemented to support the realization of the data model. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases, real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform, are realized and demonstrated. The total processing times for the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that integrating the real-time GIS data model with the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.

  19. An Accurate GPS-IMU/DR Data Fusion Method for Driverless Car Based on a Set of Predictive Models and Grid Constraints

    PubMed Central

    Wang, Shiyao; Deng, Zhidong; Yin, Gang

    2016-01-01

    A high-performance differential global positioning system (GPS) receiver with real-time kinematics provides absolute localization for driverless cars. However, it is not only susceptible to multipath effects but also unable to effectively fulfill precise error correction over a wide range of driving areas. This paper proposes an accurate GPS–inertial measurement unit (IMU)/dead reckoning (DR) data fusion method based on a set of predictive models and occupancy grid constraints. First, we employ a set of autoregressive and moving average (ARMA) equations that have different structural parameters to build maximum likelihood models of raw navigation. Second, both grid constraints and spatial consensus checks on all predictive results and current measurements are required for the removal of outliers. Navigation data that satisfy a stationary stochastic process are further fused to achieve accurate localization results. Third, the standard deviation of multimodal data fusion can be pre-specified by the grid size. Finally, we performed extensive field tests on a variety of real urban scenarios. The experimental results demonstrate that the method can significantly smooth small jumps in bias and considerably reduce accumulated position errors due to DR. With low computational complexity, the position accuracy of our method surpasses existing state-of-the-art methods on the same dataset, and the new data fusion method is practically applied in our driverless car. PMID:26927108
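
    The outlier-rejection step can be sketched as: predict the next navigation sample with an autoregressive model, then reject measurements that disagree with the prediction beyond a grid-sized bound. The model order, one-axis simplification, and bound below are illustrative, not the paper's exact formulation:

    ```python
    import numpy as np

    def ar_predict(history, order=3):
        """One-step-ahead prediction from an AR(order) model fitted by
        least squares to a short history of positions (one axis)."""
        h = np.asarray(history, dtype=float)
        X = np.column_stack([h[i:len(h) - order + i] for i in range(order)])
        y = h[order:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return h[-order:] @ coef

    def consensus_check(history, measurement, grid_size=0.5):
        """Accept the new GPS measurement only if it agrees with the AR
        prediction to within the occupancy-grid size (metres, assumed);
        otherwise fall back to the prediction."""
        pred = ar_predict(history)
        return measurement if abs(measurement - pred) <= grid_size else pred
    ```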

  20. An Accurate GPS-IMU/DR Data Fusion Method for Driverless Car Based on a Set of Predictive Models and Grid Constraints.

    PubMed

    Wang, Shiyao; Deng, Zhidong; Yin, Gang

    2016-02-24

    A high-performance differential global positioning system (GPS) receiver with real-time kinematics provides absolute localization for driverless cars. However, it is not only susceptible to multipath effects but also unable to effectively fulfill precise error correction over a wide range of driving areas. This paper proposes an accurate GPS-inertial measurement unit (IMU)/dead reckoning (DR) data fusion method based on a set of predictive models and occupancy grid constraints. First, we employ a set of autoregressive and moving average (ARMA) equations that have different structural parameters to build maximum likelihood models of raw navigation. Second, both grid constraints and spatial consensus checks on all predictive results and current measurements are required for the removal of outliers. Navigation data that satisfy a stationary stochastic process are further fused to achieve accurate localization results. Third, the standard deviation of multimodal data fusion can be pre-specified by the grid size. Finally, we performed extensive field tests on a variety of real urban scenarios. The experimental results demonstrate that the method can significantly smooth small jumps in bias and considerably reduce accumulated position errors due to DR. With low computational complexity, the position accuracy of our method surpasses existing state-of-the-art methods on the same dataset, and the new data fusion method is practically applied in our driverless car.

  1. Model Checking - My 27-Year Quest to Overcome the State Explosion Problem

    NASA Technical Reports Server (NTRS)

    Clarke, Ed

    2009-01-01

    Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s, in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine if the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state-explosion problem.

  2. Prospective validation of a near real-time EHR-integrated automated SOFA score calculator.

    PubMed

    Aakre, Christopher; Franco, Pablo Moreno; Ferreyra, Micaela; Kitson, Jaben; Li, Man; Herasevich, Vitaly

    2017-07-01

    We created an algorithm for automated Sequential Organ Failure Assessment (SOFA) score calculation within the Electronic Health Record (EHR) to facilitate detection of sepsis based on the Third International Consensus Definitions for Sepsis and Septic Shock (SEPSIS-3) clinical definition, and we evaluated the accuracy of near real-time and daily automated SOFA score calculation compared with manual score calculation. Automated SOFA scoring computer programs were developed using available EHR data sources and integrated into a critical-care-focused patient care dashboard at Mayo Clinic in Rochester, Minnesota. We prospectively compared the accuracy of automated versus manual calculation for a sample of patients admitted to the medical intensive care units at Mayo Clinic Hospitals in Rochester, Minnesota and Jacksonville, Florida. Agreement was calculated with Cohen's kappa statistic, and the reason for each discrepancy was tabulated during manual review. Random spot-check comparisons were performed 134 times on 27 unique patients, and daily SOFA score comparisons were performed for 215 patients over a total of 1206 patient-days. Agreement between automatically scored and manually scored SOFA components for both random spot checks (696 pairs, κ=0.89) and daily calculation (5972 pairs, κ=0.89) was high. The most common discrepancies were in the respiratory component (inaccurate fraction of inspired oxygen retrieval; 200/1206) and in creatinine (normal creatinine in patients with no urine output on dialysis; 128/1094). 147 patients were at risk of developing sepsis after intensive care unit admission, and 10 later developed sepsis confirmed by chart review. All were identified before the onset of sepsis by the ΔSOFA ≥ 2 criterion, with 46 false-positive patients. Near real-time automated SOFA scoring was found to have strong agreement with manual score calculation and may be useful for the detection of sepsis under the new SEPSIS-3 definition. Copyright © 2017 Elsevier B.V. All rights reserved.
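
    As an illustration of one automated sub-score, here is a sketch of the respiratory component following the standard published SOFA cut-offs; the hard part the paper addresses, reliably retrieving PaO2 and FiO2 from the EHR, is assumed away:

    ```python
    def sofa_respiratory(pao2_fio2, ventilated):
        """Respiratory SOFA sub-score from the PaO2/FiO2 ratio (mmHg).
        Scores 3-4 require respiratory support (standard SOFA convention)."""
        if pao2_fio2 < 100 and ventilated:
            return 4
        if pao2_fio2 < 200 and ventilated:
            return 3
        if pao2_fio2 < 300:
            return 2
        if pao2_fio2 < 400:
            return 1
        return 0

    # The paper's most common discrepancy was inaccurate FiO2 retrieval,
    # which shifts pao2_fio2 and hence this sub-score.
    print(sofa_respiratory(180, ventilated=True))    # -> 3
    ```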

  3. An Integrated Modeling and Simulation Methodology for Intelligent Systems Design and Testing

    DTIC Science & Technology

    2002-08-01

    simulation and actual execution. KEYWORDS: Model Continuity, Modeling, Simulation, Experimental Frame, Real Time Systems , Intelligent Systems...the methodology for a stand-alone real time system. Then it will scale up to distributed real time systems . For both systems, step-wise simulation...MODEL CONTINUITY Intelligent real time systems monitor, respond to, or control, an external environment. This environment is connected to the digital

  4. F100 Multivariable Control Synthesis Program. Computer Implementation of the F100 Multivariable Control Algorithm

    NASA Technical Reports Server (NTRS)

    Soeder, J. F.

    1983-01-01

    As turbofan engines become more complex, the development of controls necessitates the use of multivariable control techniques. A control developed for the F100-PW-100(3) turbofan engine using linear quadratic regulator theory and other modern multivariable control synthesis techniques is described, along with the assembly-language implementation of this control on an SEL 810B minicomputer. This implementation was then evaluated using a real-time hybrid simulation of the engine. The control software was modified to run with a real engine; these modifications, in the form of sensor and actuator failure checks and control executive sequencing, are discussed. Finally, recommendations for control software implementations are presented.

  5. Estimating Real-Time Zenith Tropospheric Delay over Africa Using IGS-RTS Products

    NASA Astrophysics Data System (ADS)

    Abdelazeem, M.

    2017-12-01

    Zenith Tropospheric Delay (ZTD) is a crucial parameter for atmospheric modeling, severe weather monitoring, and forecasting applications. Currently, the International GNSS Service real-time service (IGS-RTS) products are used extensively in real-time atmospheric modeling applications. The objective of this study is to develop a real-time zenith tropospheric delay estimation model over Africa using the IGS-RTS products. The real-time ZTDs are estimated from a real-time precise point positioning (PPP) solution. GNSS observations from a number of reference stations are processed over a period of 7 days, and the estimated real-time ZTDs are compared with their IGS tropospheric product counterparts. The findings indicate that the estimated real-time ZTDs have millimeter-level accuracy in comparison with the IGS counterparts.

  6. Model Checking with Edge-Valued Decision Diagrams

    NASA Technical Reports Server (NTRS)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library. We provide efficient algorithms for manipulating EVMDDs and review the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention of representing in one formalism the best techniques available across a spectrum of existing tools. Compared to the CUDD package, our tool is several orders of magnitude faster.

  7. MPST Software: grl_pef_check

    NASA Technical Reports Server (NTRS)

    Call, Jared A.; Kwok, John H.; Fisher, Forest W.

    2013-01-01

    This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process, to reduce the possibility of errors. The tool catches sequence-related errors or issues immediately after seqgen modeling, streamlining downstream processes. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground-commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on giving users the information they need to evaluate a sequence quickly and efficiently, while keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content before any further effort is invested in the build. There are dozens of items in a modeling run that need to be checked, which is a time-consuming and error-prone task, and no other software provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.
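
    The checklist-as-code structure can be sketched generically; the check names, the event representation, and the parsing are hypothetical, since the actual grl_pef_check checks are mission-specific:

    ```python
    # Hypothetical sketch of a PEF check runner: each check is a named
    # predicate over a parsed list of predicted events, so the same table
    # doubles as a human-readable checklist.
    CHECKS = [
        ("commands are time-ordered",
         lambda pef: all(a["time"] <= b["time"] for a, b in zip(pef, pef[1:]))),
        ("no command issued during a blackout window",
         lambda pef: all(not ev.get("in_blackout", False) for ev in pef)),
    ]

    def run_checks(pef_events):
        """Run every check, reporting pass/fail so the output can be read
        as a manual checklist as well as consumed by automation."""
        failures = []
        for name, predicate in CHECKS:
            ok = predicate(pef_events)
            print(f"[{'PASS' if ok else 'FAIL'}] {name}")
            if not ok:
                failures.append(name)
        return failures
    ```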

  8. Malthus on long swings: the general case.

    PubMed

    Dooley, P C

    1988-02-01

    Three major assumptions provided the basis of Malthus' theory of population: food is necessary to human existence; passion between man and woman is necessary and will continue nearly in its present state; and the power of population is indefinitely greater than the earth's power to produce subsistence for humans. With this as his base, Malthus proposed the thesis that strong and constant forces must hold the superior power of population over subsistence in check. These forces include both positive checks, e.g., infant mortality, and preventive checks, e.g., foregoing early marriage. Malthus evidently had a theory of long swings in mind, because he began his essay by asking whether humankind will experience unlimited improvement or a state oscillating between happiness and misery. Waterman (1987) offers a new interpretation of Malthus' theory of long swings, concluding that "the Malthusian theory of oscillations" as sketched in the "Essay on Population" may justly be represented by a zig-zag path of real wages. Two questions arise: does the text literally mean what Waterman suggests, and is the text consistent with Malthus' general position? The quotation offered by Waterman focuses on a special case that illustrates how oscillations might take place but fails to represent Malthus' general position. In any society, the population's response to wages determines the "level" of subsistence. Because living habits differ from state to state, the subsistence level varies accordingly, and Malthus devotes much of the first "Essay" to discussing what determines the living habits and the subsistence level in different countries. In Malthus' theory of long swings, real wages do not follow a "zig-zag" path, because neither the accumulation of capital nor the growth of population behaves as he proposes. Whenever the rate of profit is sufficiently attractive, capital accumulates, and the response of population to a change in wages depends on a complex of forces, termed by Malthus positive and preventive checks. Generally, the path of wages over time depends on the conditions prevailing at a particular time and place.

  9. Expression of genes responsible for cell morphogenesis involved in differentiation in porcine buccal pouch mucosal cells during long-term primary culture and real-time proliferation in vitro.

    PubMed

    Dyszkiewicz-Konwińska, M; Bryja, A; Jopek, K; Budna, J; Khozmi, R; Jeseta, M; Bukowska, D; Antosik, P; Bruska, M; Nowicki, M; Zabel, M; Kempisty, B

    2017-01-01

    Recently, using an experimental animal model, we demonstrated that porcine buccal pouch mucosal cells display increased proliferation capability during primary cultivation in vitro. Although the histological structure and morphogenesis of the oral cavity are well recognized, the molecular mechanisms that regulate this process still require further investigation. This study aimed to analyze the expression profile of molecular markers involved in the morphogenesis and differentiation capacity of porcine buccal pouch mucosal cells during long-term primary cultivation in vitro. The experiment was performed on buccal pouch mucosal cells isolated from 80 pubertal crossbred Landrace gilts. After collection, the cells were treated enzymatically, transferred into a primary in vitro culture (IVC) system, and cultured for 30 days. The cells were collected for RNA isolation after 7, 15 and 30 days of IVC, and their real-time proliferative status was checked using the RTCA system. We found increased expression of the FN1 and SOX9 genes, calculated against ACTB, after 7 and 30 days of IVC (P < 0.01 and P < 0.001, respectively). CXCL12 mRNA was down-regulated after 7, 15 and 30 days of IVC, but not statistically significantly. A similar expression profile was observed when calculated against HPRT; however, DAB2 was expressed at higher levels at day 15 of IVC (P < 0.05). The cell index measured during real-time cell proliferation increased substantially between 96 h and 147 h of IVC and reached the log phase. Since FN1 and SOX9 revealed a significant increase in expression after long-term culture in vitro, it is suggested that expression of these differentiation and stemness genes accompanies cell proliferation. Moreover, FN1 and SOX9 might be recognized as new markers of buccal pouch mucosal cell proliferation and differentiation in pigs in an in vitro primary culture model.

  10. Hybrid monitoring scheme for end-to-end performance enhancement of multicast-based real-time media

    NASA Astrophysics Data System (ADS)

    Park, Ju-Won; Kim, JongWon

    2004-10-01

    As real-time media applications based on IP multicast networks spread widely, end-to-end QoS (quality of service) provisioning for these applications has become very important. To guarantee the end-to-end QoS of multi-party media applications, it is essential to monitor the time-varying status of both network metrics (i.e., delay, jitter and loss) and system metrics (i.e., CPU and memory utilization). In this paper, targeting the multicast-enabled AG (Access Grid), a next-generation group collaboration tool based on multi-party media services, the applicability of a hybrid monitoring scheme that combines active and passive monitoring is investigated. The active monitoring measures network-layer metrics (i.e., network condition) with probe packets, while the passive monitoring checks both application-layer metrics (i.e., user traffic condition, by analyzing RTCP packets) and system metrics. By comparing these hybrid results, we attempt to pinpoint the causes of performance degradation and explore corresponding reactions to improve the end-to-end performance. The experimental results show that the proposed hybrid monitoring can provide useful information to coordinate the performance improvement of multi-party real-time media applications.
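
    The passive side of such a scheme typically derives its application-layer metrics from standard RTP statistics. As a small, generic illustration (not code from the paper), the RFC 3550 interarrival jitter estimator that RTCP reports are built on updates as follows:

        def update_jitter(jitter, prev_transit, cur_transit):
            """RTP interarrival jitter (RFC 3550, Section 6.4.1): exponentially
            smooth the absolute change in one-way transit time with gain 1/16."""
            d = abs(cur_transit - prev_transit)
            return jitter + (d - jitter) / 16.0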

  11. Detecting past changes of effective population size

    PubMed Central

    Nikolic, Natacha; Chevalet, Claude

    2014-01-01

    Understanding and predicting population abundance is a major challenge confronting scientists. Several genetic models have been developed using microsatellite markers to estimate present and ancestral effective population sizes. However, obtaining an overview of the evolution of a population requires that past fluctuations in population size be traceable. To address this question, we developed a new model that estimates past changes of effective population size from microsatellite data by resolving coalescence theory and using approximate likelihoods in a Markov chain Monte Carlo approach. The efficiency of the model and its sensitivity to gene flow and to assumptions on the mutational process were checked using simulated data and analysis. The model was found especially useful for providing evidence of transient changes of population size in the past. The times at which past demographic events become undetectable because they are too ancient, and the risk that gene flow may suggest the false detection of a bottleneck, are discussed in light of the distribution of coalescence times. The method was applied to real data sets from several Atlantic salmon populations. The method, called VarEff (Variation of Effective size), was implemented in the R package VarEff and is made available at https://qgsp.jouy.inra.fr and at http://cran.r-project.org/web/packages/VarEff. PMID:25067949

  12. Mathematical modeling, analysis and Markov Chain Monte Carlo simulation of Ebola epidemics

    NASA Astrophysics Data System (ADS)

    Tulu, Thomas Wetere; Tian, Boping; Wu, Zunyou

    Ebola virus infection is a severe infectious disease with a very high case fatality rate that has become a global public health threat. What makes the disease especially difficult is that no specific effective treatment is available, and its dynamics are not well researched or understood. In this article, a new mathematical model incorporating both vaccination and quarantine to study the dynamics of the Ebola epidemic has been developed and comprehensively analyzed. The existence and uniqueness of the solution to the model are verified, and the basic reproduction number is calculated. Stability conditions are also checked, and finally, simulation is done using both the Euler method and the Markov chain Monte Carlo (MCMC) method, one of the most influential algorithms in scientific computing. Different rates of vaccination, used to predict the effect of vaccination on infected individuals over time, and of quarantine are discussed. The results show that quarantine and vaccination are very effective ways to control the Ebola epidemic. Our study also suggests that an individual who survives a first infection is unlikely to contract the Ebola virus a second time. Last but not least, real data have been fitted to the model, showing that it can be used to predict the dynamics of the Ebola epidemic.
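
    The abstract does not spell out the compartmental structure, so the following is only a hedged sketch: a minimal SIR-type model extended with vaccination (rate nu) and quarantine (rate q), integrated with the Euler method as the paper describes. For this simplified model, the basic reproduction number works out to R0 = beta / (gamma + q) in the absence of vaccination.

        def simulate_ebola(beta=0.3, gamma=0.1, nu=0.05, q=0.08, days=365.0, dt=0.1):
            """Euler integration of a hypothetical S-V-I-Q-R Ebola model with
            vaccination (S -> V at rate nu) and quarantine (I -> Q at rate q).
            All parameter values are illustrative, not fitted."""
            S, V, I, Q, R = 0.99, 0.0, 0.01, 0.0, 0.0  # fractions of the population
            history = []
            for step in range(int(days / dt)):
                new_infections = beta * S * I
                dS = -new_infections - nu * S
                dV = nu * S
                dI = new_infections - (gamma + q) * I
                dQ = q * I - gamma * Q          # quarantined no longer transmit
                dR = gamma * (I + Q)
                S, V, I, Q, R = S + dt*dS, V + dt*dV, I + dt*dI, Q + dt*dQ, R + dt*dR
                history.append((step * dt, S, V, I, Q, R))
            return history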

  13. Model-Checking with Edge-Valued Decision Diagrams

    NASA Technical Reports Server (NTRS)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library, along with state-of-the-art algorithms for building the transition relation and the state space of discrete state systems. We provide efficient algorithms for manipulating EVMDDs and give upper bounds on the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention of representing in one formalism the best techniques available across a spectrum of existing tools: EVMDDs for encoding arithmetic expressions, identity-reduced MDDs for representing the transition relation, and the saturation algorithm for reachability analysis. We compare our new symbolic model checking EVMDD library with the widely used CUDD package and show that, in many cases, our tool is several orders of magnitude faster than CUDD.
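
    The key idea, pushing additive offsets onto edges so that structurally identical subfunctions can be shared, is compact enough to sketch. The following Python fragment builds a min-normalized, EV+MDD-like representation of an arithmetic function over binary variables by recursion with hash-consing; it illustrates the encoding only and is unrelated to the library's actual implementation (real EVMDDs also handle multi-valued variables).

        def build(f, nvars, prefix=(), cache=None):
            """Return (rho, node) with f(prefix + suffix) == rho + walk(node, suffix).
            Nodes are min-normalized tuples, so equal subfunctions are shared."""
            if cache is None:
                cache = {}
            if nvars == 0:
                return f(prefix), 'T'  # terminal node
            edges = [build(f, nvars - 1, prefix + (v,), cache) for v in (0, 1)]
            rho = min(r for r, _ in edges)  # normalize: smallest edge value is 0
            node = tuple((r - rho, child) for r, child in edges)
            return rho, cache.setdefault(node, node)  # hash-consing

        def walk(node, assignment):
            total = 0
            for v in assignment:
                offset, node = node[v]
                total += offset
            return total

        # Example: f(x0, x1, x2) = 2*x0 + x1 + x2; subgraphs for x1 + x2 are shared.
        rho, root = build(lambda xs: 2 * xs[0] + xs[1] + xs[2], 3)
        assert rho + walk(root, (1, 0, 1)) == 3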

  14. A Model-Driven Approach for Telecommunications Network Services Definition

    NASA Astrophysics Data System (ADS)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  15. Model analysis of check dam impacts on long-term sediment and water budgets in southeast Arizona, USA

    USGS Publications Warehouse

    Norman, Laura M.; Niraula, Rewati

    2016-01-01

    The objective of this study was to evaluate the effect of check dam infrastructure on soil and water conservation at the catchment scale using the Soil and Water Assessment Tool (SWAT). This paired watershed study includes a watershed treated with over 2000 check dams and a Control watershed that has none, in the West Turkey Creek watershed, southeast Arizona, USA. SWAT was calibrated for streamflow using discharge documented during the summer of 2013 at the Control site. Model results highlight the necessity of eliminating lateral flow from SWAT models of aridland environments, the urgency of standardizing geospatial soils data, and the care with which modelers must document parameter alterations when presenting findings. Performance was assessed using the percent bias (PBIAS), with values of ±2.34%. The calibrated model was then used to examine the impacts of check dams at the Treated watershed. Approximately 630 tons of sediment are estimated to be stored behind check dams in the Treated watershed over the 3-year simulation, improving water quality for fish habitat. A minimum precipitation event of 15 mm was necessary to instigate the detachment of soil, sediments, or rock from the study area, which occurred 2% of the time. The resulting watershed model is useful as a predictive framework and decision-support tool for considering the long-term impacts of restoration and the potential for future restoration.
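
    Percent bias, the performance measure used here, has a simple closed form (the generic definition, not the authors' code): positive values indicate model underestimation and negative values overestimation.

        def pbias(observed, simulated):
            """Percent bias: 100 * sum(obs - sim) / sum(obs); 0.0 is optimal."""
            return 100.0 * sum(o - s for o, s in zip(observed, simulated)) / sum(observed)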

  16. Formal Verification of Safety Properties for Aerospace Systems Through Algorithms Based on Exhaustive State-Space Exploration

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco

    2004-01-01

    The Runway Safety Monitor (RSM) designed by Lockheed Martin is part of NASA's effort to reduce aviation accidents. We developed a Petri net model of the RSM protocol and used the model checking functions of our tool SMART to investigate a number of safety properties in RSM. To mitigate the impact of state-space explosion, we built a highly discretized model of the system, obtained by partitioning the monitored runway zone into a grid of smaller volumes and by considering scenarios involving only two aircraft. The model also assumes that there are no communication failures, such as bad input from radar or lack of incoming data, thus it relies on a consistent view of reality by all participants. In spite of these simplifications, we were able to expose potential problems in the RSM conceptual design. Our findings were forwarded to the design engineers, who undertook corrective action. Additionally, the results stress the efficiency attained by the new model checking algorithms implemented in SMART, and demonstrate their applicability to real-world systems. Attempts to verify RSM with NuSMV and SPIN have failed due to excessive memory consumption.

  17. Evaluation of properties over phylogenetic trees using stochastic logics.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2016-06-14

    Model checking has recently been introduced as an integrated framework for extracting information from phylogenetic trees using temporal logics as a querying language; temporal logics are an extension of modal logics that impose restrictions on a boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as the different ways DNA can be changed), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we wish to inspect over the phylogeny, the verifier returns true if the specification is satisfied, or a counterexample that falsifies it. However, this approach has previously been applied only to qualitative aspects of the phylogeny. In this paper, we remove the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that could not be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software to optimize the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties over phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, sparing biologists the extension, upgrade, debugging and maintenance of a software tool. A set of benchmarks justifies the feasibility of our approach.
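
    The tree likelihood the authors obtain via PRISM corresponds to the classical pruning recursion, which is small enough to show directly. The sketch below computes the likelihood of one site on a rooted binary tree under the Jukes-Cantor mutation model (states 0-3, uniform root prior); it is a generic illustration, not the paper's PRISM encoding.

        import numpy as np

        def jc_transition(t, mu=1.0):
            """Jukes-Cantor transition matrix for a branch of length t."""
            same = 0.25 + 0.75 * np.exp(-4.0 * mu * t / 3.0)
            diff = 0.25 - 0.25 * np.exp(-4.0 * mu * t / 3.0)
            P = np.full((4, 4), diff)
            np.fill_diagonal(P, same)
            return P

        def prune(tree):
            """Felsenstein pruning. tree is ('leaf', state) or
            ('node', (subtree, branch_len), (subtree, branch_len))."""
            if tree[0] == 'leaf':
                L = np.zeros(4)
                L[tree[1]] = 1.0
                return L
            _, (left, tl), (right, tr) = tree
            return (jc_transition(tl) @ prune(left)) * (jc_transition(tr) @ prune(right))

        # One site on the topology (A, (A, C)), with a uniform 1/4 root prior.
        tree = ('node', (('leaf', 0), 0.1),
                        (('node', (('leaf', 0), 0.05), (('leaf', 1), 0.05)), 0.1))
        likelihood = 0.25 * prune(tree).sum()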

  18. Dielectric function of a model insulator

    NASA Astrophysics Data System (ADS)

    Rezvani, G. A.; Friauf, Robert J.

    1993-04-01

    We have calculated a dielectric response function ε(q,ω) using the random-phase approximation for a model insulator originally proposed by Fry [Phys. Rev. 179, 892 (1969)]. We treat narrow and wide band-gap insulators with the aim of using the results in the simulation of secondary-electron emission from insulators; it is therefore important to take into account the contribution of the first and second conduction bands. For the real part of the dielectric function we perform a numerical principal value integration over the first and second Brillouin zones. For the imaginary part we perform a numerical integration involving the δ function that results from the conservation of energy. To check the validity of our numerical integration methods, we perform a Kramers-Kronig transform of the real part and compare it with the directly calculated imaginary part, and vice versa. We discuss fitting the model to the static dielectric constant and the f-sum rule. We then display the wave number and frequency dependence for solid argon, KCl, and model Si.
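
    The consistency check described here, transforming one part of ε and comparing with the other, rests on the standard Kramers-Kronig dispersion relations for a causal response, which in the abstract's notation read:

        \mathrm{Re}\,\varepsilon(q,\omega) = 1 + \frac{2}{\pi}\,\mathcal{P}\!\int_0^{\infty}
            \frac{\omega'\,\mathrm{Im}\,\varepsilon(q,\omega')}{\omega'^{2}-\omega^{2}}\,d\omega',
        \qquad
        \mathrm{Im}\,\varepsilon(q,\omega) = -\frac{2\omega}{\pi}\,\mathcal{P}\!\int_0^{\infty}
            \frac{\mathrm{Re}\,\varepsilon(q,\omega')-1}{\omega'^{2}-\omega^{2}}\,d\omega',

    while the f-sum rule used in the fitting is \int_0^{\infty}\omega\,\mathrm{Im}\,\varepsilon(q,\omega)\,d\omega = \pi\omega_p^{2}/2, with \omega_p the plasma frequency.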

  19. Web-based video monitoring of CT and MRI procedures

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Dahlbom, Magdalena; Kho, Hwa T.; Valentino, Daniel J.; McCoy, J. Michael

    2000-05-01

    A web-based video transmission of images from CT and MRI consoles was implemented in an Intranet environment for real-time monitoring of ongoing procedures. Images captured from the consoles are compressed to video resolution and broadcast through a web server. When called upon, the attending radiologists can view these live images on any computer within the secured Intranet network. With adequate compression, the images can be displayed simultaneously in different locations at a rate of 2 to 5 images/sec over a standard LAN. Although the image quality is insufficient for diagnostic purposes, our user survey showed that the images were suitable for supervising a procedure, positioning the imaging slices, and routine quality checking before completion of a study. The system was implemented at UCLA to monitor 9 CTs and 6 MRIs distributed across 4 buildings. The system significantly improved the radiologists' productivity by saving precious time previously spent on trips between reading rooms and examination rooms. It also improved patient throughput by reducing the time spent waiting for a radiologist to come and check a study before the patient is moved off the scanner.

  20. An experimental manipulation of responsibility in children: a test of the inflated responsibility model of obsessive-compulsive disorder.

    PubMed

    Reeves, J; Reynolds, S; Coker, S; Wilson, C

    2010-09-01

    The objective of this study was to investigate whether Salkovskis' (1985) inflated responsibility model of obsessive-compulsive disorder (OCD) applies to children. In an experimental design, 81 children aged 9-12 years were randomly allocated to three conditions: an inflated responsibility group, a moderate responsibility group, and a reduced responsibility group. In all groups children were asked to sort sweets according to whether or not they contained nuts. At baseline the groups did not differ in children's self-reported anxiety, depression, obsessive-compulsive symptoms, or inflated responsibility beliefs. The experimental manipulation successfully changed children's perceptions of responsibility. During the sorting task, time taken to complete the task, checking behaviours, hesitations, and anxiety were recorded. There was a significant effect of responsibility level on the behavioural variables of time taken, hesitations, and checks; as perceived responsibility increased, children took longer to complete the task and checked and hesitated more often. There was no between-group difference in children's self-reported state anxiety. The results offer preliminary support for the link between inflated responsibility and increased checking behaviours in children and add to the small but growing literature suggesting that cognitive models of OCD may apply to children. (c) 2010 Elsevier Ltd. All rights reserved.

  1. The Infobiotics Workbench: an integrated in silico modelling platform for Systems and Synthetic Biology.

    PubMed

    Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio

    2011-12-01

    The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.

  2. A tool for modeling concurrent real-time computation

    NASA Technical Reports Server (NTRS)

    Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.

    1990-01-01

    Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands the use of knowledge-based problem solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment), powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.

  3. Real-Time Dynamic Modeling - Data Information Requirements and Flight Test Results

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Smith, Mark S.

    2008-01-01

    Practical aspects of identifying dynamic models for aircraft in real time were studied. Topics include formulation of an equation-error method in the frequency domain to estimate non-dimensional stability and control derivatives in real time, data information content for accurate modeling results, and data information management techniques such as data forgetting, incorporating prior information, and optimized excitation. Real-time dynamic modeling was applied to simulation data and flight test data from a modified F-15B fighter aircraft, and to operational flight data from a subscale jet transport aircraft. Estimated parameter standard errors and comparisons with results from a batch output-error method in the time domain were used to demonstrate the accuracy of the identified real-time models.

  4. Real-Time Dynamic Modeling - Data Information Requirements and Flight Test Results

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Smith, Mark S.

    2010-01-01

    Practical aspects of identifying dynamic models for aircraft in real time were studied. Topics include formulation of an equation-error method in the frequency domain to estimate non-dimensional stability and control derivatives in real time, data information content for accurate modeling results, and data information management techniques such as data forgetting, incorporating prior information, and optimized excitation. Real-time dynamic modeling was applied to simulation data and flight test data from a modified F-15B fighter aircraft, and to operational flight data from a subscale jet transport aircraft. Estimated parameter standard errors, prediction cases, and comparisons with results from a batch output-error method in the time domain were used to demonstrate the accuracy of the identified real-time models.
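
    Both of these records describe the same frequency-domain equation-error method, whose core step is compact enough to sketch. The fragment below computes finite Fourier transforms of the output and regressors at selected frequencies and solves a linear least-squares problem for real-valued parameters; it is a schematic reconstruction under simple assumptions (uniform sampling, regressors supplied as time histories, hypothetical names), not the flight software.

        import numpy as np

        def freq_domain_equation_error(t, y, regressors, freqs):
            """Equation-error parameter estimation in the frequency domain.
            y: modeled output; regressors: ordered dict of name -> time history;
            freqs: analysis frequencies (Hz). Returns name -> estimate."""
            dt = t[1] - t[0]
            def finite_ft(x):  # finite Fourier transform approximated by a DFT sum
                return np.array([np.sum(x * np.exp(-2j * np.pi * f * t)) * dt for f in freqs])
            Y = finite_ft(y)
            X = np.column_stack([finite_ft(r) for r in regressors.values()])
            # stack real and imaginary parts so the estimated derivatives are real
            A = np.vstack([X.real, X.imag])
            b = np.concatenate([Y.real, Y.imag])
            theta, *_ = np.linalg.lstsq(A, b, rcond=None)
            return dict(zip(regressors.keys(), theta))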

  5. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737

  6. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections.

    PubMed

    Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D

    2018-01-01

    Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.
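
    The parallel-versions comparison described in these two records reduces to a cell-by-cell diff with a materiality threshold. A minimal sketch follows (hypothetical cell naming; the study's models live in spreadsheets, not Python):

        def material_errors(reference, candidate, threshold=0.05):
            """Flag cells whose outputs differ between two parallel model versions
            by more than the +/-5% material-error threshold used in the study."""
            flagged = {}
            for cell, ref in reference.items():
                cand = candidate.get(cell)
                if cand is None or ref == 0:
                    continue  # this sketch skips missing cells and zero baselines
                pct = 100.0 * (cand - ref) / ref
                if abs(pct) > 100.0 * threshold:
                    flagged[cell] = round(pct, 1)  # percent difference
            return flagged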

  7. Transient Turbine Engine Modeling with Hardware-in-the-Loop Power Extraction (PREPRINT)

    DTIC Science & Technology

    2008-07-01

    Furthermore, it must be compatible with a real-time operating system that is capable of running the simulation. For some models, especially those that use... problem of interfacing the engine/control model to a real-time operating system and associated lab hardware becomes a problem of interfacing these... model in real-time. This requires the use of a real-time operating system and a compatible I/O (input/output) board. Figure 1 illustrates the HIL

  8. Network Reduction Algorithm for Developing Distribution Feeders for Real-Time Simulators: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagarajan, Adarsh; Nelson, Austin; Prabakar, Kumaraguru

    As advanced grid-support functions (AGFs) become more widely used in grid-connected photovoltaic (PV) inverters, utilities are increasingly interested in their impacts when implemented in the field. These effects can be understood by modeling feeders in real-time systems and testing PV inverters using power hardware-in-the-loop (PHIL) techniques. This paper presents a novel feeder model reduction algorithm using a Monte Carlo method that enables large feeders to be solved and operated on real-time computing platforms. Two Hawaiian Electric feeder models in Synergi Electric's load flow software were converted to reduced-order models in OpenDSS and subsequently implemented on the OPAL-RT real-time digital testing platform. Smart PV inverters were added to the real-time model, with AGF responses modeled after characterizing commercially available hardware inverters. Finally, hardware inverters were tested in conjunction with the real-time model using PHIL techniques so that the effects of AGFs on the chosen feeders could be analyzed.

  9. Improved Short-Term Clock Prediction Method for Real-Time Positioning.

    PubMed

    Lv, Yifei; Dai, Zhiqiang; Zhao, Qile; Yang, Sheng; Zhou, Jinning; Liu, Jingnan

    2017-06-06

    The application of real-time precise point positioning (PPP) requires real-time precise orbit and clock products, which must be predicted over a short horizon to compensate for communication delays or data gaps. Unlike orbit corrections, clock corrections are difficult to model and predict. The widely used linear model hardly fits long periodic trends with a small data set and exhibits significant accuracy degradation in real-time prediction when a large data set is used. This study proposes a new prediction model for maintaining short-term satellite clocks to meet the high-precision requirements of real-time clocks and provide clock extrapolation without interrupting the real-time data stream. Fast Fourier transform (FFT) is used to analyze the linear prediction residuals of real-time clocks. The periodic terms obtained through FFT are adopted in a sliding-window prediction to achieve a significant improvement in short-term prediction accuracy. This study also analyzes and compares the accuracy of short-term forecasts (less than 3 h) using observations of different lengths. Experimental results obtained from International GNSS Service (IGS) final products and our own real-time clocks show that the 3-h prediction accuracy is better than 0.85 ns. The new model can replace IGS ultra-rapid products in real-time PPP applications. A positive correlation is also found between the prediction accuracy and the short-term stability of on-board clocks. Compared with the traditional linear model, the accuracy of static PPP using the new model's 2-h prediction clocks in the N, E, and U directions improves by about 50%, and the static PPP accuracy with 2-h clock products is better than 0.1 m. When an interruption occurs in the real-time stream, the accuracy of the kinematic PPP solution using the 1-h clock prediction product remains better than 0.2 m, without significant accuracy degradation. This model is of practical significance because it solves the problems of interruption and delay in data broadcast in real-time clock estimation and can meet the requirements of real-time PPP.
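
    The prediction recipe, a linear fit followed by an FFT of the residuals and re-use of the strongest periodic terms, can be sketched in a few lines. The following is a schematic reconstruction under simple assumptions (uniform sampling, two retained terms), not the authors' implementation:

        import numpy as np

        def predict_clock(t, bias, t_future, n_terms=2):
            """Linear trend plus the n_terms strongest FFT components of the
            residuals, extrapolated to the epochs in t_future (float array)."""
            a, b = np.polyfit(t, bias, 1)
            resid = bias - (a * t + b)
            spec = np.fft.rfft(resid)
            freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
            # strongest non-DC bins (Nyquist-bin scaling ignored in this sketch)
            top = np.argsort(np.abs(spec[1:]))[::-1][:n_terms] + 1
            periodic = np.zeros_like(t_future, dtype=float)
            for k in top:
                amp = 2.0 * np.abs(spec[k]) / len(t)
                periodic += amp * np.cos(2 * np.pi * freqs[k] * t_future + np.angle(spec[k]))
            return a * t_future + b + periodic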

  10. Real-time simulation of a spiking neural network model of the basal ganglia circuitry using general purpose computing on graphics processing units.

    PubMed

    Igarashi, Jun; Shouno, Osamu; Fukai, Tomoki; Tsujino, Hiroshi

    2011-11-01

    Real-time simulation of a biologically realistic spiking neural network is necessary for evaluation of its capacity to interact with real environments. However, the real-time simulation of such a neural network is difficult due to its high computational costs that arise from two factors: (1) vast network size and (2) the complicated dynamics of biologically realistic neurons. In order to address these problems, mainly the latter, we chose to use general purpose computing on graphics processing units (GPGPUs) for simulation of such a neural network, taking advantage of the powerful computational capability of a graphics processing unit (GPU). As a target for real-time simulation, we used a model of the basal ganglia that has been developed according to electrophysiological and anatomical knowledge. The model consists of heterogeneous populations of 370 spiking model neurons, including computationally heavy conductance-based models, connected by 11,002 synapses. Simulation of the model has not yet been performed in real-time using a general computing server. By parallelization of the model on the NVIDIA Geforce GTX 280 GPU in data-parallel and task-parallel fashion, faster-than-real-time simulation was robustly realized with only one-third of the GPU's total computational resources. Furthermore, we used the GPU's full computational resources to perform faster-than-real-time simulation of three instances of the basal ganglia model; these instances consisted of 1100 neurons and 33,006 synapses and were synchronized at each calculation step. Finally, we developed software for simultaneous visualization of faster-than-real-time simulation output. These results suggest the potential power of GPGPU techniques in real-time simulation of realistic neural networks. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Design of an autonomous exterior security robot

    NASA Technical Reports Server (NTRS)

    Myers, Scott D.

    1994-01-01

    This paper discusses the requirements and preliminary design of a robotic vehicle intended to perform autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six-kilometer by six-kilometer zone with autonomous navigation and obstacle avoidance. In addition to detecting crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.

  12. Incorporating Digisonde Traces into the Ionospheric Data Assimilation Three Dimensional (IDA3D) Algorithm

    DTIC Science & Technology

    2006-05-11

    examined. These data were processed by the Automatic Real Time Ionogram Scaler with True Height (ARTIST) [Reinisch and Huang, 1983] program into electron... IDA3D. The data are locally available and have previously been quality checked. In addition, IDA3D maps using ARTIST-calculated profiles from hand-scaled... ionograms are available for comparison. The first test run of IDA3D used only O-mode autoscaled virtual height profiles from five different digisondes

  13. HAL/S - The programming language for Shuttle

    NASA Technical Reports Server (NTRS)

    Martin, F. H.

    1974-01-01

    HAL/S is a higher order language and system, now operational, adopted by NASA for programming Space Shuttle on-board software. Program reliability is enhanced through language clarity and readability, modularity through program structure, and protection of code and data. Salient features of HAL/S include output orientation, automatic checking (with strictly enforced compiler rules), the availability of linear algebra, real-time control, a statement-level simulator, and compiler transferability (for applying HAL/S to additional object and host computers). The compiler is described briefly.

  14. Crew procedures development techniques

    NASA Technical Reports Server (NTRS)

    Arbet, J. D.; Benbow, R. L.; Hawk, M. L.; Mangiaracina, A. A.; Mcgavern, J. L.; Spangler, M. C.

    1975-01-01

    The study developed requirements, designed, developed, checked out and demonstrated the Procedures Generation Program (PGP). The PGP is a digital computer program which provides a computerized means of developing flight crew procedures based on crew action in the shuttle procedures simulator. In addition, it provides a real time display of procedures, difference procedures, performance data and performance evaluation data. Reconstruction of displays is possible post-run. Data may be copied, stored on magnetic tape and transferred to the document processor for editing and documentation distribution.

  15. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Topics include the overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data. Vehicle subsystem modules, module integration, special test requirements, and reference data formats are also described.

  16. Active structural acoustic control of helicopter interior multifrequency noise using input-output-based hybrid control

    NASA Astrophysics Data System (ADS)

    Ma, Xunjun; Lu, Yang; Wang, Fengjiao

    2017-09-01

    This paper presents recent advances in the reduction of multifrequency noise inside a helicopter cabin using an active structural acoustic control system based on an active gearbox strut approach. To attenuate the multifrequency gearbox vibrations and the resulting noise, a new scheme of discrete model predictive sliding mode control is proposed based on a controlled auto-regressive moving average model. Its implementation needs only input/output data, so a broader frequency range of the controlled system is modelled and the burden of state observer design is removed. Furthermore, a new iterative form of the algorithm is designed, improving development efficiency and run speed. To verify the algorithm's effectiveness and self-adaptability, real-time active control experiments are performed on a newly developed helicopter model system. The helicopter model can generate gear-meshing vibration/noise similar to a real helicopter, with a specially designed gearbox and active struts. The algorithm's control abilities are thoroughly checked in single-input single-output and multiple-input multiple-output experiments via progressively different feedback strategies: (1) controlling gear-meshing noise by attenuating vibrations at key points on the transmission path, and (2) directly controlling the gear-meshing noise in the cabin using the actuators. Results confirm that the active control system is practical for cancelling multifrequency helicopter interior noise, and it also weakens the frequency modulation of the tones. In many cases, the attenuation of the measured noise exceeds 15 dB, with the maximum reduction reaching 31 dB. The control process is also demonstrated to be smoother and faster.

  17. Segment Fixed Priority Scheduling for Self Suspending Real Time Tasks

    DTIC Science & Technology

    2016-08-11

    Segment-Fixed Priority Scheduling for Self-Suspending Real-Time Tasks. Junsung Kim, Department of Electrical and Computer Engineering, Carnegie... [The remainder of the indexed snippet is table-of-contents and figure-list residue: Section 2.1, "Application of a Multi-Segment Self-Suspending Real-Time Task Model"; Section 3, "Fixed Priority Scheduling"; Figure 2, "A multi-segment self-suspending real-time task model."]

  18. Real Time Navigation-Assisted Orbital Wall Reconstruction in Blowout Fractures.

    PubMed

    Shin, Ho Seong; Kim, Se Young; Cha, Han Gyu; Han, Ba Leun; Nam, Seung Min

    2016-03-01

    A limitation in performing restoration of orbital structures is the narrow, deep, and dark surgical field, which makes it difficult to view the operative site directly. To avoid perioperative complications arising from this limitation, the authors evaluated the usefulness of computer-assisted navigation techniques in the surgical treatment of blowout fractures. A total of 37 patients (14 medial orbital wall fractures and 23 inferior orbital wall fractures) with facial deformities received surgical treatment under the guidance of a navigation system between September 2012 and January 2015. All 37 patients were treated successfully and safely with navigation-assisted surgery, without any complications, including diplopia, retrobulbar hematoma, globe injury, implant migration, and blindness. Blowout fractures can be treated safely under the guidance of a surgical navigation system. In orbital surgery, navigation-assisted technology can improve functional and aesthetic outcomes by allowing the position of the instruments at the surgical site to be checked in real time, without injuring important anatomic structures.

  19. Topological properties of the limited penetrable horizontal visibility graph family

    NASA Astrophysics Data System (ADS)

    Wang, Minggang; Vilela, André L. M.; Du, Ruijin; Zhao, Longfeng; Dong, Gaogao; Tian, Lixin; Stanley, H. Eugene

    2018-05-01

    The limited penetrable horizontal visibility graph algorithm was recently introduced to map time series into complex networks. In this work, we extend this algorithm to create a directed limited penetrable horizontal visibility graph and an image limited penetrable horizontal visibility graph. We define the two algorithms and provide theoretical results on the topological properties of these graphs for different types of real-valued series. We perform several numerical simulations to check the accuracy of our theoretical results. Finally, we present an application of the directed limited penetrable horizontal visibility graph to measuring the irreversibility of real-valued time series, and an application of the image limited penetrable horizontal visibility graph to discriminating noise from chaos. We also propose a method to measure systematic risk using the image limited penetrable horizontal visibility graph, and the empirical results show the effectiveness of our proposed algorithms.
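
    The underlying construction is simple to state: in a horizontal visibility graph, two points are linked when no intermediate point reaches the horizontal line of sight between them, and the limited penetrable variant tolerates up to rho such blockers. A brute-force sketch (cubic time, for illustration only):

        def lphvg_edges(series, rho=1):
            """Edges of the limited penetrable horizontal visibility graph:
            link (i, j) if at most rho intermediate points block the view."""
            n = len(series)
            edges = []
            for i in range(n):
                for j in range(i + 1, n):
                    blockers = sum(1 for k in range(i + 1, j)
                                   if series[k] >= min(series[i], series[j]))
                    if blockers <= rho:
                        edges.append((i, j))
            return edges

        # rho = 0 recovers the ordinary horizontal visibility graph.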

  20. Bayesian structural equation modeling: a more flexible representation of substantive theory.

    PubMed

    Muthén, Bengt; Asparouhov, Tihomir

    2012-09-01

    This article proposes a new approach to factor analysis and structural equation modeling using Bayesian analysis. The new approach replaces parameter specifications of exact zeros with approximate zeros based on informative, small-variance priors. It is argued that this produces an analysis that better reflects substantive theories. The proposed Bayesian approach is particularly beneficial in applications where parameters are added to a conventional model such that a nonidentified model would be obtained if maximum-likelihood estimation were applied. This approach is useful for measurement aspects of latent variable modeling, such as confirmatory factor analysis and the measurement part of structural equation modeling. Two application areas are studied: cross-loadings and residual correlations in confirmatory factor analysis. An example using a full structural equation model is also presented, showing an efficient way to find model misspecification. The approach encompasses 3 elements: model testing using posterior predictive checking, model estimation, and model modification. Monte Carlo simulations and real data are analyzed using Mplus. The real-data analyses use data from Holzinger and Swineford's (1939) classic mental abilities study, Big Five personality factor data from a British survey, and science achievement data from the National Educational Longitudinal Study of 1988.

  1. Virtual reality, augmented reality, and robotics applied to digestive operative procedures: from in vivo animal preclinical studies to clinical use

    NASA Astrophysics Data System (ADS)

    Soler, Luc; Marescaux, Jacques

    2006-04-01

    Technological innovations of the 20th century provided medicine and surgery with new tools, among which virtual reality and robotics are among the most revolutionary. Our work aims at establishing new techniques for the detection, 3D delineation, and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems that make tumor resection or treatment easier through the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time long-distance connection between practitioners so they can share the same 3D reconstructed patient and interact on the same patient, virtually before the intervention and in reality during the surgical procedure, thanks to a telesurgical robot. In preclinical studies, our first results, obtained with a micro-CT scanner, show that these technologies provide efficient and precise 3D modeling of the anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility of improving the therapeutic choice thanks to better detection and representation of the patient before performing the surgical gesture. They also show the efficiency of augmented reality, which provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and check on the virtual patient clone an optimal procedure without errors, which will be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.

  2. Development of an Operational TS Dataset Production System for the Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Kim, Sung Dae; Park, Hyuk Min; Kim, Young Ho; Park, Kwang Soon

    2017-04-01

    An operational TS (temperature and salinity) dataset production system was developed to provide near-real-time data to the data assimilation system periodically. It collects the latest 15 days' TS data for the northwestern Pacific area (20°N - 55°N, 110°E - 150°E), applies QC tests to the archived data, and supplies them to the numerical prediction models of KIOST (Korea Institute of Ocean Science and Technology). The latest real-time TS data are collected from the Argo GDAC and the GTSPP data server every week. Argo data are downloaded from the /latest_data directory of the Argo GDAC. Because many duplicates exist when all profile data are extracted from all Argo netCDF files, a DB system is used to avoid duplication. All metadata (float ID, location, observation date and time, etc.) of all Argo floats are stored in the database system, and a Matlab program was developed to manipulate the DB data, check for duplication, and exclude duplicated data. GTSPP data are downloaded from the /realtime directory of the GTSPP data service, and the latest data except Argo data are extracted from the original data. Another Matlab program was coded to inspect all collected data using 10 QC tests and produce the final dataset to be used by the assimilation system. Three regional range tests, inspecting annual, seasonal, and monthly variations, are included in the QC procedures. A C program was developed to provide the regional ranges to data managers; it calculates the upper and lower limits of temperature and salinity at depths from 0 to 1550 m. The final TS dataset contains the latest 15 days' TS data in netCDF format. It is updated every week and transmitted to the numerical modelers at KIOST for operational use.
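
    The regional range tests reduce to simple depth-dependent bounds checking. A hedged Python sketch follows (the field names and nearest-depth lookup are illustrative assumptions; the operational programs are written in Matlab and C):

        def regional_range_check(profiles, limits):
            """Flag TS observations outside regional climatological limits.
            limits: depth (m) -> (t_min, t_max, s_min, s_max), over 0-1550 m."""
            depths = sorted(limits)
            flagged = []
            for p in profiles:  # each p: {'depth': m, 'temp': degC, 'sal': psu}
                d = min(depths, key=lambda z: abs(z - p['depth']))  # nearest level
                t_lo, t_hi, s_lo, s_hi = limits[d]
                if not (t_lo <= p['temp'] <= t_hi and s_lo <= p['sal'] <= s_hi):
                    flagged.append(p)
            return flagged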

  3. Passivity-based Robust Control of Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul G.; Joshi, Suresh M. (Technical Monitor)

    2000-01-01

    This report provides a brief summary of the research performed over the duration of the cooperative research agreement between NASA Langley Research Center and Kansas State University. The cooperative agreement, originally planned for three years, was extended by another year through a no-cost extension in order to accomplish the goals of the project. The main objective of the research was to develop passivity-based robust control methodology for passive and non-passive aerospace systems. The first year's research was limited to the investigation of passivity-based methods for the robust control of linear time-invariant (LTI), single-input single-output (SISO), open-loop stable, minimum-phase non-passive systems. The second year's focus was mainly on extending the passivity-based methodology to a larger class of non-passive LTI systems, including unstable and nonminimum-phase SISO systems. For LTI non-passive systems, five different passification methods were developed. The primary effort during years three and four was on the development of passification methodology for MIMO systems, the development of methods for checking the robustness of passification, and the development of synthesis techniques for passifying compensators. For passive LTI systems, an optimal synthesis procedure was also developed for the design of constant-gain positive-real controllers. For nonlinear passive systems, a numerical optimization-based technique was developed for the synthesis of constant-gain as well as time-varying-gain positive-real controllers. The passivity-based control design methodology developed during this project was demonstrated through application to various benchmark examples. These example systems included a longitudinal model of an F-18 High Alpha Research Vehicle (HARV) for pitch-axis control, NASA's supersonic transport wind tunnel model, the ACC benchmark model, a 1-D acoustic duct model, a piezo-actuated flexible link model, and NASA's Benchmark Active Controls Technology (BACT) wing model. Some of the stability results for linear passive systems were also extended to nonlinear passive systems. Several publications and conference presentations resulted from this research.
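
    The positive-real property at the center of this work has a compact standard statement (quoted here as textbook background, not from the report): a rational transfer function G(s) is positive real if

        G(s) \ \text{is analytic in}\ \operatorname{Re}(s) > 0
        \quad\text{and}\quad
        \operatorname{Re}\,G(s) \ge 0 \ \ \text{for all}\ \operatorname{Re}(s) > 0,

    which, for a stable SISO system with no poles on the imaginary axis, reduces to the frequency-domain test \operatorname{Re}\,G(j\omega) \ge 0 for all \omega. Passification then means designing a compensator that renders a non-passive plant positive real in this sense.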

  4. Wearable, Flexible, and Multifunctional Healthcare Device with an ISFET Chemical Sensor for Simultaneous Sweat pH and Skin Temperature Monitoring.

    PubMed

    Nakata, Shogo; Arie, Takayuki; Akita, Seiji; Takei, Kuniharu

    2017-03-24

    Real-time daily healthcare monitoring may increase the chances of predicting and diagnosing diseases in their early stages; currently, such detection occurs most frequently during medical check-ups. Next-generation noninvasive healthcare devices, such as flexible multifunctional sensor sheets designed to be worn on the skin, are considered highly suitable candidates for continuous real-time health monitoring. For healthcare applications, acquiring data on the chemical state of the body, alongside physical characteristics such as body temperature and activity, is extremely important for predicting and identifying potential health conditions. To record these data, in this study we developed a wearable, flexible sweat chemical sensor sheet for pH measurement, consisting of an ion-sensitive field-effect transistor (ISFET) integrated with a flexible temperature sensor; we intend to use this device as the foundation of a fully integrated, wearable healthcare patch in the future. After characterizing the performance, mechanical flexibility, and stability of the sensor, real-time measurements of sweat pH and skin temperature were successfully conducted through skin contact. This flexible integrated device has the potential to be developed into a sweat chemical sensor for applications in healthcare and sports.

  5. Detection of medically important Candida species by absolute quantitation real-time polymerase chain reaction.

    PubMed

    Than, Leslie Thian Lung; Chong, Pei Pei; Ng, Kee Peng; Seow, Heng Fong

    2015-01-01

    The number of invasive candidiasis cases has risen, especially with the increase in the number of immunosuppressed and immunocompromised patients. Early detection of Candida species that is both specific and sensitive is important for determining the correct administration of antifungal drugs to patients. This study aims to develop a method for the detection, identification, and quantitation of medically important Candida species through quantitative polymerase chain reaction (qPCR). The isocitrate lyase (ICL) gene, which is not found in mammals, was chosen as the target gene of the real-time PCR. Absolute quantitation of the gene copy number was achieved by constructing a plasmid containing the ICL gene, which was used to generate the standard curve. Twenty fungal species, two bacterial species, and human DNA were tested to check the specificity of the detection method. All eight Candida species were successfully detected, identified, and quantitated based on the ICL gene. A seven-log range of gene copy numbers and a minimum detection limit of 10^3 copies were achieved. A one-tube absolute-quantitation real-time PCR that differentiates medically important Candida species via individual unique melting temperatures was achieved. Analytical sensitivity and specificity were not compromised.
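
    Building a standard curve for absolute quantitation starts from the copy number of the plasmid standard, computed with the usual dsDNA formula (a generic calculation, not the authors' code; 650 g/mol per base pair is the conventional average molecular weight):

        AVOGADRO = 6.022e23  # molecules per mole

        def plasmid_copies_per_ul(conc_ng_per_ul, plasmid_bp, mw_per_bp=650.0):
            """Copies/uL of a dsDNA plasmid standard for a qPCR standard curve."""
            grams_per_ul = conc_ng_per_ul * 1e-9
            mol_per_ul = grams_per_ul / (plasmid_bp * mw_per_bp)
            return mol_per_ul * AVOGADRO

        # Serial 10-fold dilutions of such a standard span a seven-log range.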

  6. Real-time radon monitoring at Stromboli volcano: influence of environmental parameters on 222Rn degassing

    NASA Astrophysics Data System (ADS)

    Cigolini, C.; Ripepe, M.; Poggi, P.; Laiolo, M.

    2008-12-01

    Two real-time stations for radon monitoring are currently operative at Stromboli volcano. The 222Rn electronic dosimeters are interfaced with an electronic board connected to a radiomodem for wireless data transfer (through a directional antenna) to a receiving station at the volcano observatory (COA). Radon activity data and environmental parameters (soil temperature and atmospheric pressure) are sampled every 15 minutes and are instantaneously processed and transferred via the web so that they can be checked remotely. The collected time series show an overall inverse correlation between radon emissions and seasonal temperature variations. Signal-processing analyses show that radon emissions in sectors of diffuse degassing are modulated by tidal forces as well. In addition, radon activities recorded at the summit station, located along the summit fracture zone where the gas flux is concentrated, are positively correlated with changes in atmospheric pressure and confirm the occurrence of the 'atmospheric stack effect'. It cannot be excluded that this process plays an active role in modulating Stromboli's explosivity.

  7. Discordance between 'actual' and 'scheduled' check-in times at a heart failure clinic

    PubMed Central

    Joyce, Emer; Gandesbery, Benjamin T.; Blackstone, Eugene H.; Taylor, David O.; Tang, W. H. Wilson; Starling, Randall C.; Hachamovitch, Rory

    2017-01-01

    Introduction A 2015 Institute of Medicine statement, “Transforming Health Care Scheduling and Access: Getting to Now”, has increased concerns regarding patient wait times. Although waiting times have been widely studied, little attention has been paid to the role of patient arrival times as a component of this phenomenon. To this end, we investigated patterns of patient arrival at scheduled ambulatory heart failure (HF) clinic appointments and studied their predictors. We hypothesized that patients are more likely to arrive later than scheduled, with progressively later arrivals later in the day. Methods and results Using a business intelligence database we identified 6,194 unique patients who visited the Cleveland Clinic Main Campus HF clinic between January 2015 and January 2017. This clinic served both as a tertiary referral center and as a community HF clinic. Transplant and left ventricular assist device (LVAD) visits were excluded. Punctuality was defined as the difference between ‘actual’ and ‘scheduled’ check-in times, whereby negative values (i.e., early punctuality) indicated patients who checked in early. Contrary to our hypothesis, we found that patients checked in late only a minority of the time (38% of visits). Additionally, examining punctuality by appointment hour slot, we found that patients scheduled after 8 AM had progressively earlier check-in times as the day progressed (P < .001 for trend). In both a Random Forest regression framework and linear regression models, the most important risk-adjusted predictors of early punctuality were: a later appointment hour slot, the patient having previously been to the hospital, age in the early 70s, and white race. Conclusions Patients attending a mixed-population ambulatory HF clinic check in earlier than their scheduled times, with progressively discrepant intervals throughout the day. This finding may have significant implications for provider utilization and resource planning in order to maximize clinic efficiency. The impact of elective early arrival on patients’ perceived wait times requires further study. PMID:29136649

  8. Discordance between 'actual' and 'scheduled' check-in times at a heart failure clinic.

    PubMed

    Gorodeski, Eiran Z; Joyce, Emer; Gandesbery, Benjamin T; Blackstone, Eugene H; Taylor, David O; Tang, W H Wilson; Starling, Randall C; Hachamovitch, Rory

    2017-01-01

    A 2015 Institute of Medicine statement, "Transforming Health Care Scheduling and Access: Getting to Now", has increased concerns regarding patient wait times. Although waiting times have been widely studied, little attention has been paid to the role of patient arrival times as a component of this phenomenon. To this end, we investigated patterns of patient arrival at scheduled ambulatory heart failure (HF) clinic appointments and studied its predictors. We hypothesized that patients are more likely to arrive later than scheduled, with progressively later arrivals as the day progresses. Using a business intelligence database we identified 6,194 unique patients who visited the Cleveland Clinic Main Campus HF clinic between January, 2015 and January, 2017. This clinic served both as a tertiary referral center and a community HF clinic. Transplant and left ventricular assist device (LVAD) visits were excluded. Punctuality was defined as the difference between 'actual' and 'scheduled' check-in times, whereby negative values (i.e., early punctuality) indicate patients who checked in early. Contrary to our hypothesis, we found that patients checked in late only a minority of the time (38% of visits). Additionally, examining punctuality by appointment hour slot, we found that patients scheduled after 8AM had progressively earlier check-in times as the day progressed (P < .001 for trend). In both a random forest regression framework and linear regression models, the most important risk-adjusted predictors of early punctuality were: a later appointment hour slot, the patient having previously been to the hospital, age in the early 70s, and white race. Patients attending a mixed-population ambulatory HF clinic check in earlier than scheduled, with progressively discrepant intervals throughout the day. This finding may have significant implications for provider utilization and resource planning in order to maximize clinic efficiency. The impact of elective early arrival on patients' perceived wait times requires further study.

  9. Real-time simulation of large-scale floods

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.

  10. Real-time inversions for finite fault slip models and rupture geometry based on high-rate GPS data

    USGS Publications Warehouse

    Minson, Sarah E.; Murray, Jessica R.; Langbein, John O.; Gomberg, Joan S.

    2015-01-01

    We present an inversion strategy capable of using real-time high-rate GPS data to simultaneously solve for a distributed slip model and fault geometry in real time as a rupture unfolds. We employ Bayesian inference to find the optimal fault geometry and the distribution of possible slip models for that geometry using a simple analytical solution. By adopting an analytical Bayesian approach, we can solve this complex inversion problem (including calculating the uncertainties on our results) in real time. Furthermore, since the joint inversion for distributed slip and fault geometry can be computed in real time, the time required to obtain a source model of the earthquake does not depend on the computational cost. Instead, the time required is controlled by the duration of the rupture and the time required for information to propagate from the source to the receivers. We apply our modeling approach, called Bayesian Evidence-based Fault Orientation and Real-time Earthquake Slip, to the 2011 Tohoku-oki earthquake, 2003 Tokachi-oki earthquake, and a simulated Hayward fault earthquake. In all three cases, the inversion recovers the magnitude, spatial distribution of slip, and fault geometry in real time. Since our inversion relies on static offsets estimated from real-time high-rate GPS data, we also present performance tests of various approaches to estimating quasi-static offsets in real time. We find that the raw high-rate time series are the best data to use for determining the moment magnitude of the event, but slightly smoothing the raw time series helps stabilize the inversion for fault geometry.
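
    Because the forward model is linear in slip for a fixed fault geometry, the Gaussian posterior used in this kind of analytical Bayesian inversion has a closed form. A minimal sketch, with hypothetical Green's function matrix G, offset data d, and prior/observation covariances (all names are illustrative, not the authors' code):

        import numpy as np

        def gaussian_slip_posterior(G, d, Cd, Cm, m0):
            """Closed-form posterior for a linear-Gaussian slip inversion.

            G  : (n_data, n_slip) Green's functions for a fixed fault geometry
            d  : (n_data,) observed static offsets
            Cd : (n_data, n_data) data covariance
            Cm : (n_slip, n_slip) prior slip covariance
            m0 : (n_slip,) prior mean slip
            """
            Cd_inv = np.linalg.inv(Cd)
            Cm_inv = np.linalg.inv(Cm)
            # Posterior covariance and mean (standard Gaussian conjugacy).
            C_post = np.linalg.inv(G.T @ Cd_inv @ G + Cm_inv)
            m_post = C_post @ (G.T @ Cd_inv @ d + Cm_inv @ m0)
            return m_post, C_post

    Candidate fault geometries can then be ranked by their Bayesian evidence, computed from the same quantities, which is what makes the joint geometry-and-slip inversion tractable in real time.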

  11. Real-Time Global Nonlinear Aerodynamic Modeling for Learn-To-Fly

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2016-01-01

    Flight testing and modeling techniques were developed to accurately identify global nonlinear aerodynamic models for aircraft in real time. The techniques were developed and demonstrated during flight testing of a remotely-piloted subscale propeller-driven fixed-wing aircraft using flight test maneuvers designed to simulate a Learn-To-Fly scenario. Prediction testing was used to evaluate the quality of the global models identified in real time. The real-time global nonlinear aerodynamic modeling algorithm will be integrated and further tested with learning adaptive control and guidance for NASA Learn-To-Fly concept flight demonstrations.
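
    Real-time model identification of this kind is often built on recursive least squares, which updates parameter estimates as each new measurement arrives. A minimal generic sketch under that assumption (not the specific algorithm flown by NASA):

        import numpy as np

        class RecursiveLeastSquares:
            """Generic recursive least-squares estimator for y = x @ theta + noise."""

            def __init__(self, n_params, forgetting=1.0):
                self.theta = np.zeros(n_params)          # parameter estimates
                self.P = np.eye(n_params) * 1e6          # large initial covariance
                self.lam = forgetting                    # forgetting factor in (0, 1]

            def update(self, x, y):
                x = np.asarray(x, dtype=float)
                # Gain, parameter update, and covariance update.
                k = self.P @ x / (self.lam + x @ self.P @ x)
                self.theta += k * (y - x @ self.theta)
                self.P = (self.P - np.outer(k, x @ self.P)) / self.lam
                return self.theta

        # Example: identify a three-term aerodynamic coefficient model online.
        rls = RecursiveLeastSquares(n_params=3)
        rng = np.random.default_rng(0)
        true_theta = np.array([0.1, -1.2, 0.5])
        for _ in range(500):
            x = rng.normal(size=3)                       # regressors (e.g., alpha, q, elevator)
            y = x @ true_theta + 0.01 * rng.normal()     # measured coefficient
            rls.update(x, y)
        print(rls.theta)  # converges toward true_theta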

  12. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Urnes, James M., Sr. (Inventor); Smith, Timothy A. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time, model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of that error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
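
    The persistence logic described above can be sketched in a few lines; the significance test, threshold values, and window length below are illustrative, not taken from the patent:

        import numpy as np

        def detect_anomaly(errors, sigma, z_thresh=3.0, persistence_thresh=5):
            """Flag an anomaly when statistically significant model errors persist.

            errors : sequence of (measured - expected) modeling errors
            sigma  : expected standard deviation of the error in normal operation
            """
            persistence = 0
            for e in errors:
                significant = abs(e) / sigma > z_thresh   # crude significance test
                persistence = persistence + 1 if significant else 0
                if persistence > persistence_thresh:
                    return True                            # structural anomaly indicated
            return False

        # Example: a sustained offset appears at the end of the record.
        rng = np.random.default_rng(1)
        normal = rng.normal(0.0, 1.0, 100)
        faulty = rng.normal(5.0, 1.0, 20)
        print(detect_anomaly(np.concatenate([normal, faulty]), sigma=1.0))  # True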

  13. An Analysis of Input/Output Paradigms for Real-Time Systems

    DTIC Science & Technology

    1990-07-01

    timing and concurrency aspects of real-time systems. This paper illustrates how to build a mathematical model of the schedulability of a real-time...various design alternatives. The primary characteristic that distinguishes real-time systems from non-real-time systems is the importance of time.

  14. Application of RT-PCR in formalin-fixed and paraffin-embedded lung cancer tissues.

    PubMed

    Zhang, Fan; Wang, Zhuo-min; Liu, Hong-yu; Bai, Yun; Wei, Sen; Li, Ying; Wang, Min; Chen, Jun; Zhou, Qing-hua

    2010-01-01

    To analyze gene expression in formalin-fixed, paraffin-embedded lung cancer tissues using a modified method. Total RNA from frozen tissues was extracted using TRIZOL reagent. RNA was extracted from formalin-fixed, paraffin-embedded tissues by digestion with proteinase K before acid-phenol:chloroform extraction and carrier precipitation. We modified this method by using a higher concentration of proteinase K and a longer digestion time, optimized to 16 hours. RT-PCR and real-time RT-PCR were used to check reproducibility and the concordance between frozen and paraffin-embedded samples. The results showed that the RNA extracted from the paraffin-embedded lung tissues was of high quality, with most fragment lengths between the 28S and 18S bands (about 1000 to 2000 bases). The housekeeping gene GUSB exhibited low variation of expression in frozen and paraffin-embedded lung tissues, whereas PGK1 had the lowest variation in lymphoma tissues. Furthermore, real-time PCR analysis of the expression of known prognostic genes in non-small cell lung carcinoma (NSCLC) demonstrated an extremely high correlation (r > 0.880) between the paired frozen and formalin-fixed, paraffin-embedded specimens. This improved method of RNA extraction is suitable for real-time quantitative RT-PCR, and may be used for global gene expression profiling of paraffin-embedded tissues.
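
    Relative expression in real-time RT-PCR is commonly quantified with the 2^-ddCt method against a housekeeping gene such as GUSB; the abstract does not state which quantification was used, so the standard calculation is shown purely for illustration:

        def fold_change_ddct(ct_target_sample, ct_ref_sample,
                             ct_target_control, ct_ref_control):
            """Relative expression by the 2^-ddCt method.

            ct_* are cycle-threshold values; 'ref' is a housekeeping gene.
            """
            d_ct_sample = ct_target_sample - ct_ref_sample
            d_ct_control = ct_target_control - ct_ref_control
            dd_ct = d_ct_sample - d_ct_control
            return 2.0 ** (-dd_ct)

        # Example: target gene appears ~4-fold up-regulated in the sample.
        print(fold_change_ddct(22.0, 18.0, 24.0, 18.0))  # 4.0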

  15. Research in Distributed Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.

    1997-01-01

    This document summarizes the progress we have made in our study of issues concerning the schedulability of real-time systems. Our study has produced several results on scalability issues in distributed real-time systems. In particular, we have used our techniques to resolve schedulability issues in distributed systems with end-to-end requirements. During the next year (1997-98), we propose to extend the current work to address modeling and workload characterization issues in distributed real-time systems. In particular, we propose to investigate the effect of different workload models and component models on the design and the subsequent performance of distributed real-time systems.

  16. An optimized work-flow to reduce time-to-detection of carbapenemase-producing Enterobacteriaceae (CPE) using direct testing from rectal swabs.

    PubMed

    O'Connor, C; Kiernan, M G; Finnegan, C; O'Hara, M; Power, L; O'Connell, N H; Dunne, C P

    2017-05-04

    Rapid detection of patients with carbapenemase-producing Enterobacteriaceae (CPE) is essential for the prevention of nosocomial cross-transmission, allocation of isolation facilities and protection of patient safety. Here, we aimed to design a new laboratory work-flow, utilizing existing laboratory resources, to reduce time-to-diagnosis of CPE. A review of the current CPE testing processes and of the literature was performed to identify a real-time commercial polymerase chain reaction (PCR) assay that could facilitate batch testing of CPE clinical specimens with adequate CPE gene coverage. Stool specimens (210) were collected; CPE-positive inpatients (n = 10) and anonymized community stool specimens (n = 200). Rectal swabs (eSwab™) were inoculated from collected stool specimens and a manual DNA extraction method (QIAamp® DNA Stool Mini Kit) was employed. Extracted DNA was then processed on the Check-Direct CPE® assay. The three steps of making the eSwab™, extracting DNA manually and running the Check-Direct CPE® assay took <5 min, 1 h 30 min and 1 h 50 min, respectively. The work-flow was time efficient, with a result available in under 4 h, comparing favourably with the existing method of CPE screening and its average time-to-diagnosis of 48/72 h. Utilizing this CPE work-flow would allow a 'same-day' result. Antimicrobial susceptibility testing results, as is current practice, would remain a 'next-day' result. In conclusion, the Check-Direct CPE® assay was easily integrated into a local laboratory work-flow and could facilitate a large volume of CPE screening specimens in a single batch, making it cost-effective and convenient for daily CPE testing.

  17. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    NASA Technical Reports Server (NTRS)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  18. Modelling spoilage of fresh turbot and evaluation of a time-temperature integrator (TTI) label under fluctuating temperature.

    PubMed

    Nuin, Maider; Alfaro, Begoña; Cruz, Ziortza; Argarate, Nerea; George, Susie; Le Marc, Yvan; Olley, June; Pin, Carmen

    2008-10-31

    Kinetic models were developed to predict the microbial spoilage and sensory quality of fresh fish and to evaluate the efficiency of a commercial time-temperature integrator (TTI) label, Fresh Check®, in monitoring shelf life. Farmed turbot (Psetta maxima) samples were packaged in PVC film and stored at 0, 5, 10 and 15 degrees C. Microbial growth and sensory attributes were monitored at regular time intervals. The response of the Fresh Check device was measured at the same temperatures during the storage period. Sensory perception was quantified according to a global sensory indicator obtained by principal component analysis as well as to the Quality Index Method, QIM, as described by Rahman and Olley [Rahman, H.A., Olley, J., 1984. Assessment of sensory techniques for quality assessment of Australian fish. CSIRO Tasmanian Regional Laboratory. Occasional paper no. 8. Available from the Australian Maritime College library, Newnham, Tasmania]. Both methods were found equally valid for monitoring the loss of sensory quality. The maximum specific growth rate of spoilage bacteria, the rate of change of the sensory indicators and the rate of change of the colour measurements of the TTI label were modelled as functions of temperature. Temperature had a similar effect on the bacterial, sensory and Fresh Check kinetics. At the time of sensory rejection, the bacterial load was ca. 10^5-10^6 cfu/g. The end of shelf life indicated by the Fresh Check label was close to the sensory rejection time. The performance of the models was validated under fluctuating temperature conditions by comparing the predicted and measured values for all microbial, sensory and TTI responses. The models have been implemented in a Visual Basic add-in for Excel called "Fish Shelf Life Prediction (FSLP)". This program predicts sensory acceptability and growth of spoilage bacteria in fish and the response of the TTI at constant and fluctuating temperature conditions. The program is freely available at http://www.azti.es/muestracontenido.asp?idcontenido=980&content=15&nodo1=30&nodo2=0.
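
    Temperature dependence of microbial growth rates in spoilage models of this kind is often described by the Ratkowsky square-root model; a minimal sketch with invented parameter values (not those fitted in the paper):

        import numpy as np

        def ratkowsky_growth_rate(T, b=0.03, T_min=-10.0):
            """Square-root (Ratkowsky) model: sqrt(mu_max) = b * (T - T_min).

            T     : storage temperature (deg C)
            b     : regression coefficient (illustrative value)
            T_min : notional minimum growth temperature (illustrative value)
            """
            return (b * (np.asarray(T) - T_min)) ** 2

        # Predicted maximum specific growth rates at the storage temperatures studied.
        for T in [0, 5, 10, 15]:
            print(T, "degC ->", round(float(ratkowsky_growth_rate(T)), 4), "1/h")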

  19. Sub-micron accurate track navigation method ``Navi'' for the analysis of Nuclear Emulsion

    NASA Astrophysics Data System (ADS)

    Yoshioka, T.; Yoshida, J.; Kodama, K.

    2011-03-01

    Sub-micron accurate track navigation in Nuclear Emulsion is realized by using low-energy signals detected by automated Nuclear Emulsion read-out systems. By using this much denser ``noise'', about 10^4 times more abundant than the real tracks, the accuracy of the track position navigation reaches the sub-micron level using only the information within a single microscope field of view (200 micron by 200 micron). This method is applied to OPERA analysis in Japan, i.e. support of human eye checks of candidate tracks, confirmation of neutrino interaction vertexes, and embedding of missing track segments into the track data read out by automated systems.

  20. Real-time simulation of three-dimensional shoulder girdle and arm dynamics.

    PubMed

    Chadwick, Edward K; Blana, Dimitra; Kirsch, Robert F; van den Bogert, Antonie J

    2014-07-01

    Electrical stimulation is a promising technology for the restoration of arm function in paralyzed individuals. Control of the paralyzed arm under electrical stimulation, however, is a challenging problem that requires advanced controllers and command interfaces for the user. A real-time model describing the complex dynamics of the arm would allow user-in-the-loop type experiments where the command interface and controller could be assessed. Real-time models of the arm previously described have not included the ability to model the independently controlled scapula and clavicle, limiting their utility for clinical applications of this nature. The goal of this study therefore was to evaluate the performance and mechanical behavior of a real-time, dynamic model of the arm and shoulder girdle. The model comprises seven segments linked by eleven degrees of freedom and actuated by 138 muscle elements. Polynomials were generated to describe the muscle lines of action to reduce computation time, and an implicit, first-order Rosenbrock formulation of the equations of motion was used to increase simulation step-size. The model simulated flexion of the arm faster than real time, simulation time being 92% of actual movement time on standard desktop hardware. Modeled maximum isometric torque values agreed well with values from the literature, showing that the model simulates the moment-generating behavior of a real human arm. The speed of the model enables experiments where the user controls the virtual arm and receives visual feedback in real time. The ability to optimize potential solutions in simulation greatly reduces the burden on the user during development.
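
    The implicit, first-order Rosenbrock step mentioned above advances the state with a single linear solve per step, which is what allows the large step sizes needed for real-time rates. A generic sketch for an ODE x' = f(x), under the assumption of a linearly implicit (Rosenbrock-Euler) scheme rather than the authors' full musculoskeletal formulation:

        import numpy as np

        def rosenbrock1_step(f, jac, x, h):
            """One first-order Rosenbrock (linearly implicit Euler) step for x' = f(x).

            Solves (I - h*J) dx = h*f(x) with J = df/dx, then returns x + dx.
            """
            n = x.size
            J = jac(x)
            dx = np.linalg.solve(np.eye(n) - h * J, h * f(x))
            return x + dx

        # Example: stiff linear test system x' = A x, stable even with a large step.
        A = np.array([[-1000.0, 0.0], [0.0, -1.0]])
        f = lambda x: A @ x
        jac = lambda x: A
        x = np.array([1.0, 1.0])
        for _ in range(10):
            x = rosenbrock1_step(f, jac, x, h=0.1)
        print(x)  # the fast mode decays without the blow-up explicit Euler would show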

  1. Review of Real-Time Simulator and the Steps Involved for Implementation of a Model from MATLAB/SIMULINK to Real-Time

    NASA Astrophysics Data System (ADS)

    Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi

    2015-06-01

    Nowadays, researchers want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid-twentieth century, and their evolution has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost. Consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high-performance simulation tools that were previously too cost-prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed" and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGA. In this article, the steps involved in implementing a model from MATLAB to real time are described in detail.

  2. Development of a real-time PCR for detection of Staphylococcus pseudintermedius using a novel automated comparison of whole-genome sequences.

    PubMed

    Verstappen, Koen M; Huijbregts, Loes; Spaninks, Mirlin; Wagenaar, Jaap A; Fluit, Ad C; Duim, Birgitta

    2017-01-01

    Staphylococcus pseudintermedius is an opportunistic pathogen in dogs and cats and occasionally causes infections in humans. S. pseudintermedius is often resistant to multiple classes of antimicrobials. Reliable detection is required so that it is not misidentified as S. aureus. Phenotypic and currently used molecular diagnostic assays lack specificity or are labour-intensive, relying on multiplex PCR or nucleic acid sequencing. The aim of this study was to identify a specific target for real-time PCR by comparing whole-genome sequences of S. pseudintermedius and non-pseudintermedius strains. Genome sequences were downloaded from public repositories and supplemented by isolates that were sequenced in this study. A Perl script was written that analysed 300-nt fragments from a reference genome sequence of S. pseudintermedius and checked whether each fragment was present in other S. pseudintermedius genomes (n = 74) and non-pseudintermedius genomes (n = 138). Six sequences specific for S. pseudintermedius were identified (sequence lengths between 300-500 nt). One sequence, located in the spsJ gene, was used to develop primers and a probe. The real-time PCR showed 100% specificity when testing S. pseudintermedius isolates (n = 54) and eight other staphylococcal species (n = 43). In conclusion, a novel approach comparing whole-genome sequences identified a sequence specific for S. pseudintermedius and provided a real-time PCR target for rapid and reliable detection of S. pseudintermedius.
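
    The genome comparison described above amounts to screening fixed-length windows of a reference genome for presence in every target genome and absence from every off-target genome. A Python re-implementation of the idea (the original was a Perl script; the naive substring search is for illustration only):

        def specific_fragments(reference, targets, off_targets, size=300, step=300):
            """Yield windows of `reference` present in all target genomes and
            absent from all off-target genomes (sequences as plain strings)."""
            for i in range(0, len(reference) - size + 1, step):
                frag = reference[i:i + size]
                if all(frag in t for t in targets) and not any(frag in o for o in off_targets):
                    yield i, frag

        # Toy example with made-up sequences (real genomes are ~2.6 Mb).
        ref = "ACGT" * 200
        targets = [ref, "TTTT" + ref]
        off_targets = ["GGGGCCCC" * 100]
        for pos, frag in specific_fragments(ref, targets, off_targets, size=100, step=100):
            print("candidate at", pos)
            break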

  3. MOM: A meteorological data checking expert system in CLIPS

    NASA Technical Reports Server (NTRS)

    Odonnell, Richard

    1990-01-01

    Meteorologists have long faced the problem of verifying the data they use. Experience shows that there is a sizable number of errors in the data reported by meteorological observers. This is unacceptable for computer forecast models, which depend on accurate data for accurate results. Most errors that occur in meteorological data are obvious to the meteorologist, but time constraints prevent hand-checking. For this reason, it is necessary to have a 'front end' to the computer model to ensure the accuracy of input. Various approaches to automatic data quality control have been developed by several groups. MOM is a rule-based system implemented in CLIPS and utilizing 'consistency checks' and 'range checks'. The system is generic in the sense that it knows some meteorological principles, regardless of specific station characteristics. Specific constraints kept as CLIPS facts in a separate file provide for system flexibility. Preliminary results show that the expert system has detected some inconsistencies not noticed by a local expert.
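
    The two kinds of rules MOM relies on are easy to illustrate. A minimal Python sketch (MOM itself is written in CLIPS; the limits below are invented for illustration, standing in for the constraint facts kept in a separate file):

        # Station-specific limits, analogous to MOM's constraint facts.
        RANGE_LIMITS = {"temperature_c": (-60.0, 55.0), "dewpoint_c": (-70.0, 40.0)}

        def range_checks(obs):
            """Flag values outside climatologically plausible bounds."""
            return [k for k, (lo, hi) in RANGE_LIMITS.items()
                    if not lo <= obs.get(k, lo) <= hi]

        def consistency_checks(obs):
            """Flag physically inconsistent combinations of values."""
            errors = []
            if obs["dewpoint_c"] > obs["temperature_c"]:
                errors.append("dewpoint exceeds temperature")
            return errors

        obs = {"temperature_c": 21.0, "dewpoint_c": 25.0}
        print(range_checks(obs) + consistency_checks(obs))  # ['dewpoint exceeds temperature']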

  4. Tobacco outlet density, retailer cigarette sales without ID checks and enforcement of underage tobacco laws: associations with youths' cigarette smoking and beliefs.

    PubMed

    Lipperman-Kreda, Sharon; Grube, Joel W; Friend, Karen B; Mair, Christina

    2016-03-01

    To estimate the relationships of tobacco outlet density, cigarette sales without ID checks and local enforcement of underage tobacco laws with youths' lifetime cigarette smoking, perceived availability of tobacco and perceived enforcement of underage tobacco laws, and to assess changes in these relationships over time. The study involved: (a) three annual telephone surveys, (b) two annual purchase surveys in 2000 tobacco outlets and (c) interviews with key informants from local law enforcement agencies. Analyses were multi-level models (city, individual, time). A sample of 50 mid-sized non-contiguous cities in California, USA. A total of 1478 youths (aged 13-16 at wave 1, 52.2% male); 1061 participated in all waves. Measures at the individual level included lifetime cigarette smoking, perceived availability and perceived enforcement. City-level measures included tobacco outlet density, cigarette sales without ID checks and compliance checks. Outlet density was associated positively with lifetime smoking [OR = 1.12, P < 0.01]. An interaction between outlet density and wave (OR = 0.96, P < 0.05) suggested that higher density was associated more closely with lifetime smoking at the earlier waves, when respondents were younger. Greater density was associated positively with perceived availability (β = 0.02, P < 0.05) and negatively with perceived enforcement (β = -0.02, P < 0.01). The rate of sales without ID checks was related to greater perceived availability (β = 0.01, P < 0.01) and less perceived enforcement (β = -0.01, P < 0.01). Enforcement of underage tobacco laws was related positively to perceived enforcement (β = 0.06, P < 0.05). Higher tobacco outlet density may contribute to lifetime smoking among youths. Density, sales without ID checks and enforcement levels may influence beliefs about access to cigarettes and enforcement of underage tobacco sales laws. © 2015 Society for the Study of Addiction.

  5. Possibilities and limits of Internet-based registers.

    PubMed

    Wild, Michael; Candrian, Aron; Wenda, Klaus

    2009-03-01

    The Internet is an inexpensive platform for investigating medical questions involving conditions of low prevalence. By accessing www.ao-nailregister.org, every interested participant may take part in the English-language survey of complications specific to the femoral nail. The address data of the participant, anonymised key data of the patients and the medical parameters are entered. In real time, these data are checked for plausibility, evaluated and published on the Internet, where they are immediately and freely accessible. Because of national differences, data acquisition caused considerable difficulties at the beginning. In addition, wrong data were entered because of linguistic or contextual misunderstandings. After the questionnaire was completely reworked, data input simplified and an automated plausibility check implemented, these difficulties were resolved. In a next step, automatic evaluation of the data was implemented. Only very few data still have to be checked manually for plausibility, to exclude wrong entries that cannot be verified by the computer. The effort required for data acquisition and evaluation of the Internet-based femoral nail register was reduced considerably. The possibility of free international participation, as well as the freely accessible presentation of the results, offers transparency.

  6. Optimization of the Timepix chip to measurement of radon, thoron and their progenies.

    PubMed

    Janik, Miroslaw; Ploc, Ondrej; Fiederle, Michael; Procz, Simon; Kavasi, Norbert

    2016-01-01

    Radon and thoron, as well as their short-lived progenies, are decay products of the radium and thorium decay series. They are the most important radionuclides with respect to public exposure. To utilize the semiconductor pixel Timepix chip for active, real-time measurement of alpha particles from radon, thoron and their progenies, it is necessary to check the registration and visualization capabilities of the chip. An energy check for radon, thoron and their progenies, as well as for (241)Am and (210)Po sources, was performed using the radon and thoron chambers at NIRS (National Institute of Radiological Sciences). The check found an energy resolution of 200 keV with a 14% efficiency, as well as a linear dependency between the channel number (cluster volume) and the energy; a coefficient of determination r^2 of 0.99 was calculated for the range of 5 to 9 MeV. In addition, an offset was detected for specific Timepix configurations between the pre-calibration for low energies from 6 to 60 keV and the actual calibration for alpha particles with energies from 4000 to 9000 keV. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Data quality assessment for comparative effectiveness research in distributed data networks

    PubMed Central

    Brown, Jeffrey; Kahn, Michael; Toh, Sengwee

    2015-01-01

    Background Electronic health information routinely collected during healthcare delivery and reimbursement can help address the need for evidence about the real-world effectiveness, safety, and quality of medical care. Often, distributed networks that combine information from multiple sources are needed to generate this real-world evidence. Objective We provide a set of field-tested best practices and a set of recommendations for data quality checking for comparative effectiveness research (CER) in distributed data networks. Methods We explore the requirements for data quality checking and describe data quality approaches undertaken by several existing multi-site networks. Results There are no established standards regarding how to evaluate the quality of electronic health data for CER within distributed networks. Data checks of increasing complexity are often employed, ranging from consistency with syntactic rules to evaluation of semantics and consistency within and across sites. Temporal trends within and across sites are widely used, as are checks of each data refresh or update. Rates of specific events and exposures by age group, sex, and month are also common. Discussion Secondary use of electronic health data for CER holds promise but is complex, especially in distributed data networks that incorporate periodic data refreshes. The viability of a learning health system is dependent on a robust understanding of the quality, validity, and optimal secondary uses of routinely collected electronic health data within distributed health data networks. Robust data quality checking can strengthen confidence in findings based on distributed data networks. PMID:23793049
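
    Checks of the kinds listed above (syntactic rules, semantic consistency, temporal trends, event rates by stratum) are straightforward to express in code. A minimal pandas sketch with an entirely hypothetical schema, not any network's actual data model:

        import pandas as pd

        def quality_report(df):
            """A few illustrative distributed-network data checks (hypothetical schema)."""
            report = {}
            # Syntactic check: sex must come from a fixed code set.
            report["bad_sex_codes"] = int((~df["sex"].isin(["F", "M", "U"])).sum())
            # Semantic check: dispensing date must not precede birth date.
            report["dispensed_before_birth"] = int((df["rx_date"] < df["birth_date"]).sum())
            # Temporal trend: monthly record counts, compared across refreshes/sites.
            report["records_per_month"] = df["rx_date"].dt.to_period("M").value_counts().sort_index()
            return report

        df = pd.DataFrame({
            "sex": ["F", "M", "X"],
            "birth_date": pd.to_datetime(["1950-01-01", "1960-06-01", "1970-09-15"]),
            "rx_date": pd.to_datetime(["2014-02-01", "1959-01-01", "2014-02-20"]),
        })
        for name, value in quality_report(df).items():
            print(name, value, sep="\n")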

  8. Collaborative Resource Allocation

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Wax, Allan; Lam, Raymond; Baldwin, John; Borden, Chester

    2007-01-01

    Collaborative Resource Allocation Networking Environment (CRANE) Version 0.5 is a prototype created to prove the newest concept of using a distributed environment to schedule Deep Space Network (DSN) antenna times in a collaborative fashion. This program is for all space-flight and terrestrial science project users and DSN schedulers to perform scheduling activities and conflict resolution, both synchronously and asynchronously. Project schedulers can, for the first time, participate directly in scheduling their tracking times into the official DSN schedule, and negotiate directly with other projects in an integrated scheduling system. A master schedule covers long-range, mid-range, near-real-time, and real-time scheduling time frames all in one, rather than the current method of separate functions that are supported by different processes and tools. CRANE also provides private workspaces (both dynamic and static), data sharing, scenario management, user control, rapid messaging (based on Java Message Service), data/time synchronization, workflow management, notification (including emails), conflict checking, and a linkage to a schedule generation engine. The data structure with corresponding database design combines object trees with multiple associated mortal instances and relational database to provide unprecedented traceability and simplify the existing DSN XML schedule representation. These technologies are used to provide traceability, schedule negotiation, conflict resolution, and load forecasting from real-time operations to long-range loading analysis up to 20 years in the future. CRANE includes a database, a stored procedure layer, an agent-based middle tier, a Web service wrapper, a Windows Integrated Analysis Environment (IAE), a Java application, and a Web page interface.

  9. Recent advances in the Lesser Antilles observatories Part 2 : WebObs - an integrated web-based system for monitoring and networks management

    NASA Astrophysics Data System (ADS)

    Beauducel, François; Bosson, Alexis; Randriamora, Frédéric; Anténor-Habazac, Christian; Lemarchand, Arnaud; Saurel, Jean-Marie; Nercessian, Alexandre; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie

    2010-05-01

    Seismological and volcanological observatories have common needs, and often common practical problems, in multi-disciplinary data monitoring applications. Access to integrated data in real time and estimation of measurement uncertainties are key to efficient interpretation, but the variety of instruments and the heterogeneity of data sampling and acquisition systems lead to difficulties that may hinder crisis management. At the Guadeloupe observatory, we have developed over recent years an operational system that attempts to answer these questions in the context of a multi-instrumental observatory. Based on a single computer server, open source scripts (Matlab, Perl, Bash, Nagios) and a Web interface, the system offers: an extended database for the management of networks, stations and sensors (maps, station files with log history, technical characteristics, metadata, photos and associated documents); web-form interfaces for manual data input/editing and export (such as geochemical analyses and some of the deformation measurements); routine data processing with dedicated automatic scripts for each technique, production of validated data outputs, static graphs on preset moving time intervals, and optional e-mail alarms; and automatic status checks of computers, acquisition processes, stations and individual sensors against simple criteria (file updates and signal quality), displayed as synthetic pages for technical control. In the special case of seismology, WebObs includes a digital stripchart multichannel continuous seismogram associated with the EarthWorm acquisition chain (see companion paper Part 1), an event classification database, location scripts, automatic shakemaps and a regional catalog with associated hypocenter maps accessed through a user request form. This system provides real-time Internet access for integrated monitoring, has become a strong support for exchange between scientists and technicians, and is widely open to interdisciplinary real-time modeling. It has been set up at the Martinique observatory, and installation is planned this year at the Montserrat Volcano Observatory. It is also in production at the geomagnetic observatory of Addis Abeba in Ethiopia.

  10. Real-Time Precise Point Positioning (RTPPP) with raw observations and its application in real-time regional ionospheric VTEC modeling

    NASA Astrophysics Data System (ADS)

    Liu, Teng; Zhang, Baocheng; Yuan, Yunbin; Li, Min

    2018-01-01

    Precise Point Positioning (PPP) is an absolute positioning technology mainly used in post-processing. With the continuously increasing demand for real-time high-precision applications in positioning, timing, retrieval of atmospheric parameters, etc., Real-Time PPP (RTPPP) and its applications have drawn increasing research attention in recent years. This study focuses on the models, algorithms and ionospheric applications of RTPPP on the basis of raw observations, in which high-precision slant ionospheric delays are estimated, among other parameters, in real time. For this purpose, a robust processing strategy for multi-station RTPPP with raw observations has been proposed and realized, in which real-time data streams and State Space Representation (SSR) satellite orbit and clock corrections are used. With the RTPPP-derived slant ionospheric delays from a regional network, a real-time regional ionospheric Vertical Total Electron Content (VTEC) modeling method is proposed based on Adjusted Spherical Harmonic Functions and a Moving-Window Filter. SSR satellite orbit and clock corrections from different IGS analysis centers are evaluated. Ten globally distributed real-time stations are used to evaluate the positioning performance of the proposed RTPPP algorithms in both static and kinematic modes. RMS values of positioning errors in static/kinematic mode are 5.2/15.5, 4.7/17.4 and 12.8/46.6 mm for the north, east and up components, respectively. Real-time slant ionospheric delays from RTPPP are compared with those from the traditional Carrier-to-Code Leveling (CCL) method, in terms of function model, formal precision and between-receiver differences over a short baseline. Results show that slant ionospheric delays from RTPPP are more precise and have a much better convergence performance than those from the CCL method in real-time processing. 30 real-time stations from the Asia-Pacific Reference Frame network are used to model the ionospheric VTECs over Australia in real time, with slant ionospheric delays from both the RTPPP and CCL methods for comparison. The RMS of the VTEC differences between the RTPPP/CCL method and CODE final products is 0.91/1.09 TECU, and the RMS of the VTEC differences between the RTPPP and CCL methods is 0.67 TECU. Slant Total Electron Contents retrieved from different VTEC models are also validated with epoch-differenced Geometry-Free combinations of dual-frequency phase observations; mean RMS values are 2.14, 2.33 and 2.07 TECU for the RTPPP method, the CCL method and CODE final products, respectively. This shows the superiority of RTPPP-derived slant ionospheric delays in real-time ionospheric VTEC modeling.

  11. Development of a Model of Soldier Effectiveness: Retranslation Materials and Results

    DTIC Science & Technology

    1987-05-01

    covering financial responsibility, particularly the family checking account. Consequently, the bad check rate for the unit dropped from 70 a month...Alcohol, and Aggressive Acts" Showing prudence in financial management and responsibility in personal/family matters; avoiding alcohol and other drugs or...threatening others, etc. versus "Acting irresponsibly in financial or personal/family affairs such that command time is required to counsel or otherwise

  12. Do alcohol compliance checks decrease underage sales at neighboring establishments?

    PubMed

    Erickson, Darin J; Smolenski, Derek J; Toomey, Traci L; Carlin, Bradley P; Wagenaar, Alexander C

    2013-11-01

    Underage alcohol compliance checks conducted by law enforcement agencies can reduce the likelihood of illegal alcohol sales at checked alcohol establishments, and theory suggests that an alcohol establishment that is checked may warn nearby establishments that compliance checks are being conducted in the area. In this study, we examined whether the effects of compliance checks diffuse to neighboring establishments. We used data from the Complying with the Minimum Drinking Age trial, which included more than 2,000 compliance checks conducted at more than 900 alcohol establishments. The primary outcome was the sale of alcohol to a pseudo-underage buyer without the need for age identification. A multilevel logistic regression was used to model the effect of a compliance check at each establishment as well as the effect of compliance checks at neighboring establishments within 500 m (stratified into four equal-radius concentric rings), after buyer, license, establishment, and community-level variables were controlled for. We observed a decrease in the likelihood of establishments selling alcohol to underage youth after they had been checked by law enforcement, but these effects quickly decayed over time. Establishments that had a close neighbor (within 125 m) checked in the past 90 days were also less likely to sell alcohol to young-appearing buyers. The spatial effect of compliance checks on other establishments decayed rapidly with increasing distance. Results confirm the hypothesis that the effects of police compliance checks do spill over to neighboring establishments. These findings have implications for the development of an optimal schedule of police compliance checks.

  13. Equilibrium pricing in an order book environment: Case study for a spin model

    NASA Astrophysics Data System (ADS)

    Meudt, Frederik; Schmitt, Thilo A.; Schäfer, Rudi; Guhr, Thomas

    2016-07-01

    When modeling stock market dynamics, the price formation is often based on an equilibrium mechanism. In real stock exchanges, however, the price formation is governed by the order book. It is thus interesting to check whether the resulting stylized facts of a model with equilibrium pricing change, remain the same or, more generally, are compatible with the order book environment. We tackle this issue in the framework of a case study by embedding the Bornholdt-Kaizoji-Fujiwara spin model into the order book dynamics. To this end, we use a recently developed agent based model that realistically incorporates the order book. We find realistic stylized facts. We conclude for the studied case that equilibrium pricing is not needed and that the corresponding assumption of a "fundamental" price may be abandoned.
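
    Bornholdt-type spin dynamics couple a local Ising interaction with a global coupling to the magnetization, which serves as a proxy for price imbalance. A minimal sketch of a variant of such an update rule, with illustrative parameters (the paper embeds the model in order book dynamics rather than running it stand-alone like this):

        import numpy as np

        rng = np.random.default_rng(42)
        N, J, alpha, beta = 32, 1.0, 4.0, 1.5   # lattice size, couplings, inverse temperature
        spins = rng.choice([-1, 1], size=(N, N))      # buy/sell orientation
        strategy = rng.choice([-1, 1], size=(N, N))   # chartist/fundamentalist attitude

        for _ in range(10_000):
            i, j = rng.integers(N, size=2)
            m = spins.mean()
            # Local field: nearest neighbours plus global coupling to |magnetization|.
            neighbours = (spins[(i + 1) % N, j] + spins[i - 1, j]
                          + spins[i, (j + 1) % N] + spins[i, j - 1])
            h = J * neighbours - alpha * strategy[i, j] * abs(m)
            # Heat-bath update of the trading spin.
            spins[i, j] = 1 if rng.random() < 1 / (1 + np.exp(-2 * beta * h)) else -1
            # Strategy spin flips when the trader is misaligned with the minority.
            if strategy[i, j] * spins[i, j] * m < 0:
                strategy[i, j] *= -1

        print("magnetization (proxy for price imbalance):", spins.mean())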

  14. Statistical mechanics of broadcast channels using low-density parity-check codes.

    PubMed

    Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David

    2003-03-01

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.

  15. Object oriented fault diagnosis system for space shuttle main engine redlines

    NASA Technical Reports Server (NTRS)

    Rogers, John S.; Mohapatra, Saroj Kumar

    1990-01-01

    A great deal of attention has recently been given to Artificial Intelligence research in the area of computer-aided diagnostics. Due to the dynamic and complex nature of space shuttle red-line parameters, a research effort is under way to develop a real-time diagnostic tool that will employ historical and engineering rulebases as well as sensor validity checking. The capability of AI software development tools (KEE and G2) will be explored by applying object-oriented programming techniques to the diagnostic evaluation.

  16. The VA Computerized Patient Record — A First Look

    PubMed Central

    Anderson, Curtis L.; Meldrum, Kevin C.

    1994-01-01

    In support of its in-house DHCP Physician Order Entry/Results Reporting application, the VA is developing the first edition of a Computerized Patient Record. The system will feature a physician-oriented interface with real time, expert system-based order checking, a controlled vocabulary, a longitudinal repository of patient data, HL7 messaging support, a clinical reminder and warning system, and full integration with existing VA applications including lab, pharmacy, A/D/T, radiology, dietetics, surgery, vitals, allergy tracking, discharge summary, problem list, progress notes, consults, and online physician order entry. PMID:7949886

  17. Real-time inverse kinematics and inverse dynamics for lower limb applications using OpenSim

    PubMed Central

    Pizzolato, C.; Reggiani, M.; Modenese, L.; Lloyd, D.G.

    2017-01-01

    Real-time estimation of joint angles and moments can be used for rapid evaluation in clinical, sport, and rehabilitation contexts. However, real-time calculation of kinematics and kinetics is currently based on approximate solutions or generic anatomical models. We present a real-time system based on OpenSim solving inverse kinematics and dynamics without simplifications at 2000 frames per second with less than 31.5 ms of delay. We describe the software architecture, sensitivity analyses to minimise delays and errors, and compare offline and real-time results. This system has the potential to strongly impact current rehabilitation practices by enabling the use of personalised musculoskeletal models in real time. PMID:27723992

  18. Real-time inverse kinematics and inverse dynamics for lower limb applications using OpenSim.

    PubMed

    Pizzolato, C; Reggiani, M; Modenese, L; Lloyd, D G

    2017-03-01

    Real-time estimation of joint angles and moments can be used for rapid evaluation in clinical, sport, and rehabilitation contexts. However, real-time calculation of kinematics and kinetics is currently based on approximate solutions or generic anatomical models. We present a real-time system based on OpenSim solving inverse kinematics and dynamics without simplifications at 2000 frames per second with less than 31.5 ms of delay. We describe the software architecture, sensitivity analyses to minimise delays and errors, and compare offline and real-time results. This system has the potential to strongly impact current rehabilitation practices by enabling the use of personalised musculoskeletal models in real time.

  19. Thermal evaluation of laser exposures in an in vitro retinal model by microthermal sensing

    NASA Astrophysics Data System (ADS)

    Choi, Tae Y.; Denton, Michael L.; Noojin, Gary D.; Estlack, Larry E.; Shrestha, Ramesh; Rockwell, Benjamin A.; Thomas, Robert; Kim, Dongsik

    2014-09-01

    A temperature detection system using a micropipette thermocouple sensor was developed for use within mammalian cells during laser exposure with an 8.6-μm beam at 532 nm. We have demonstrated the capability of measuring temperatures at the single-cell level in the microscale range by inserting micropipette-based thermal sensors, 2 to 4 μm in size, into the membrane of a live retinal pigment epithelium (RPE) cell subjected to a laser beam. We set up treatment groups of single RPE cells irradiated with the 532-nm laser, and in situ temperature recordings were made over time. Thermal profiles are given for representative cells experiencing damage resulting from exposures of 0.2 to 2 s. The measured maximum temperature rise for each cell ranges from 39 to 73°C; the RPE cells showed a signature of death in all the cases reported herein. To check cell viability, real-time fluorescence microscopy was used to identify the transition of pigmented RPE cells between viable and damaged states due to laser exposure.

  20. CrowdMag - Crowdsourcing magnetic data

    NASA Astrophysics Data System (ADS)

    Nair, M. C.; Boneh, N.; Chulliat, A.

    2014-12-01

    In the CrowdMag project, we explore whether the digital magnetometers built into modern mobile phones can be used as scientific instruments to measure Earth's magnetic field. Most modern mobile phones have digital magnetometers to orient themselves. A phone's magnetometer measures three components of the local magnetic field with a typical sensitivity of about 150 to 600 nanotesla (nT). By combining data from the vector magnetometer and accelerometers, the phone's orientation is determined. Using the phone's Internet connection, magnetic data and location are sent to a central server. At the server, we check the quality of the magnetic data from all users and make the data available to the public as aggregate maps. We have two long-term goals: 1) develop near-real-time models of Earth's time-changing magnetic field by reducing man-made noise in the crowdsourced data and combining it with geomagnetic data from other sources; 2) improve the accuracy of magnetic navigation by mapping magnetic noise sources (e.g., power transformers and iron pipes). Key challenges to this endeavor are the low sensitivity of the phone's magnetometer and the noisy environment within and surrounding the phone. URL: http://www.ngdc.noaa.gov/geomag/crowdmag.shtml

  1. GNSS global real-time augmentation positioning: Real-time precise satellite clock estimation, prototype system construction and performance analysis

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Zhao, Qile; Hu, Zhigang; Jiang, Xinyuan; Geng, Changjiang; Ge, Maorong; Shi, Chuang

    2018-01-01

    The large number of ambiguities in the un-differenced (UD) model lowers calculation efficiency, which is not appropriate for high-frequency real-time GNSS clock estimation, such as at 1 Hz. A mixed differenced model fusing UD pseudo-range and epoch-differenced (ED) phase observations has been introduced into real-time clock estimation. In this contribution, we extend the mixed differenced model to realize multi-GNSS real-time high-frequency clock updating, and a rigorous comparison and analysis under the same conditions is performed to achieve the best real-time clock estimation performance, taking efficiency, accuracy, consistency and reliability into consideration. Based on the multi-GNSS real-time data streams provided by the multi-GNSS Experiment (MGEX) and Wuhan University, a GPS + BeiDou + Galileo global real-time augmentation positioning prototype system is designed and constructed, including real-time precise orbit determination, real-time precise clock estimation, real-time Precise Point Positioning (RT-PPP) and real-time Standard Point Positioning (RT-SPP). Statistical analysis of the 6 h-predicted real-time orbits shows that the root mean square (RMS) error in the radial direction is about 1-5 cm for GPS, BeiDou MEO and Galileo satellites and about 10 cm for BeiDou GEO and IGSO satellites. Using the mixed differenced estimation model, the prototype system can realize highly efficient real-time satellite absolute clock estimation with no constant clock bias and can be used for high-frequency augmentation message updating (such as 1 Hz). The real-time augmentation message signal-in-space ranging error (SISRE), a comprehensive measure of orbit and clock accuracy that affects users' actual positioning performance, is introduced to evaluate and analyze the performance of the GPS + BeiDou + Galileo global real-time augmentation positioning system. The real-time augmentation message SISRE is about 4-7 cm for GPS, while about 10 cm for BeiDou IGSO/MEO and Galileo and about 30 cm for BeiDou GEO satellites. The real-time positioning results prove that GPS + BeiDou + Galileo RT-PPP, compared to GPS-only, can effectively accelerate convergence time by about 60%, improve positioning accuracy by about 30% and obtain an average RMS of 4 cm in the horizontal and 6 cm in the vertical; additionally, RT-SPP in the prototype system can achieve a positioning accuracy of about 1 m RMS in the horizontal and 1.5-2 m in the vertical, improvements of 60% and 70%, respectively, over SPP based on the broadcast ephemeris.

  2. Real time hardware implementation of power converters for grid integration of distributed generation and STATCOM systems

    NASA Astrophysics Data System (ADS)

    Jaithwa, Ishan

    Deployment of smart grid technologies is accelerating. The smart grid enables bidirectional flows of energy and energy-related communications. The future electricity grid will look very different from today's power system. Large variable renewable energy sources will provide a greater portion of electricity, small DERs and energy storage systems will become more common, and utilities will operate many different kinds of energy efficiency programs. All of these changes will add complexity to the grid and require operators to respond to fast dynamic changes to maintain system stability and security. This thesis investigates advanced control technology for grid integration of renewable energy sources and STATCOM systems by verifying it in real-time hardware experiments using two different systems: dSPACE and OPAL-RT. Three controllers (conventional vector control, direct vector control and intelligent neural network control) were first simulated using Matlab to check the stability and safety of the system, and were then implemented in real-time hardware using the dSPACE and OPAL-RT systems. The thesis then shows how the dynamic-programming (DP) methods employed to train the neural networks outperform the other controllers: an optimal control strategy is developed to ensure effective power delivery and to improve system stability. Through real-time hardware implementation it is shown that the neural vector control approach produces the fastest response time, low overshoot and the best overall performance compared to the conventional standard vector control method and the DCC vector control technique. Finally, the entrepreneurial approach taken to drive the technologies from the lab to market via ORANGE ELECTRIC is discussed in brief.

  3. A fast semi-discrete Kansa method to solve the two-dimensional spatiotemporal fractional diffusion equation

    NASA Astrophysics Data System (ADS)

    Sun, HongGuang; Liu, Xiaoting; Zhang, Yong; Pang, Guofei; Garrard, Rhiannon

    2017-09-01

    Fractional-order diffusion equations (FDEs) extend classical diffusion equations by quantifying anomalous diffusion frequently observed in heterogeneous media. Real-world diffusion can be multi-dimensional, requiring efficient numerical solvers that can handle long-term memory embedded in mass transport. To address this challenge, a semi-discrete Kansa method is developed to approximate the two-dimensional spatiotemporal FDE, where the Kansa approach first discretizes the FDE, then the Gauss-Jacobi quadrature rule solves the corresponding matrix, and finally the Mittag-Leffler function provides an analytical solution for the resultant time-fractional ordinary differential equation. Numerical experiments are then conducted to check how the accuracy and convergence rate of the numerical solution are affected by the distribution mode and number of spatial discretization nodes. Applications further show that the numerical method can efficiently solve two-dimensional spatiotemporal FDE models with either a continuous or discrete mixing measure. Hence this study provides an efficient and fast computational method for modeling super-diffusive, sub-diffusive, and mixed diffusive processes in large, two-dimensional domains with irregular shapes.
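
    The final analytical step above exploits a standard closed form: after spatial discretization, each mode of the semi-discrete system satisfies a Caputo time-fractional ODE whose solution is a Mittag-Leffler function (stated here for a single mode with eigenvalue lambda > 0; this is the textbook result, not the paper's full scheme):

        \[
        {}^{C}\!D_t^{\alpha} u(t) = -\lambda\, u(t), \quad 0 < \alpha \le 1,
        \qquad\Longrightarrow\qquad
        u(t) = E_{\alpha}\!\left(-\lambda t^{\alpha}\right) u(0),
        \qquad
        E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)}.
        \]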

  4. Principles of continuous quality improvement applied to intravenous therapy.

    PubMed

    Dunavin, M K; Lane, C; Parker, P E

    1994-01-01

    Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, as well as using the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved. Eight hours per week in nursing time was saved, relationships between two work areas were improved, and $6,000 in personnel costs, storage space, and inventory were saved.

  5. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. Agent-based models suffer from high computation costs when simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants, and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
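
    Sequential Monte Carlo data assimilation of the kind described maintains a particle ensemble of simulated occupant states and reweights it with each sensor reading. A minimal sketch for estimating the occupant count in a single zone (a toy stand-in for the graph-based agent-oriented model; dynamics and sensor noise are invented):

        import numpy as np

        rng = np.random.default_rng(7)
        n_particles = 1000
        particles = rng.integers(0, 50, size=n_particles).astype(float)  # occupants in a zone

        def simulate_step(x):
            """Toy occupancy dynamics: random arrivals/departures (stand-in for the model)."""
            return np.clip(x + rng.integers(-3, 4, size=x.shape), 0, None)

        def assimilate(particles, sensor_count, sensor_sigma=3.0):
            """Reweight particles by sensor likelihood, then resample."""
            w = np.exp(-0.5 * ((particles - sensor_count) / sensor_sigma) ** 2)
            w /= w.sum()
            idx = rng.choice(len(particles), size=len(particles), p=w)
            return particles[idx]

        for sensor_count in [20, 22, 25, 24]:        # incoming real-time sensor data
            particles = simulate_step(particles)
            particles = assimilate(particles, sensor_count)
            print("estimated occupancy:", particles.mean())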

  6. Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application

    NASA Astrophysics Data System (ADS)

    Chen, Jinduan; Boccelli, Dominic L.

    2018-02-01

    Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations in both total system demands and regionally aggregated demands, at a scale that captures demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties, with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate with real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
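
    To make the double-seasonal idea concrete, here is an illustrative sketch (not the study's model): an autoregression with lag-1, lag-24, and lag-168 terms fitted by ordinary least squares to synthetic hourly demand, capturing hourly persistence plus daily and weekly cycles.

    ```python
    # Illustrative only: a double-seasonal autoregression on hourly demand.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(24 * 7 * 8)                              # 8 weeks of hourly data
    demand = (10 + 3 * np.sin(2 * np.pi * t / 24)          # daily cycle
                 + 1.5 * np.sin(2 * np.pi * t / 168)       # weekly cycle
                 + rng.normal(0, 0.3, t.size))             # synthetic noise

    lags = (1, 24, 168)
    L = max(lags)
    X = np.column_stack([np.ones(t.size - L)] +
                        [demand[L - k:-k] for k in lags])  # lagged regressors
    y = demand[L:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # One-step-ahead forecast from the most recent observations.
    x_new = np.concatenate(([1.0], [demand[-k] for k in lags]))
    print("1-hour-ahead forecast:", x_new @ beta)
    ```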

  7. Model Checking Satellite Operational Procedures

    NASA Astrophysics Data System (ADS)

    Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri

    2011-08-01

    We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a complex system such as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs observing the simulator telemetries and sending telecommands to the simulator. In order to assess the feasibility of our approach, we present experimental results on a simple but meaningful scenario. Our results show that we can save up to 90% of verification time.
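
    The core mechanism, exhaustive exploration of all reachable simulation states, can be pictured with a toy explicit-state checker. The sketch below is not CMurphi or SIMSAT; the state space, successor function, and safety property are all hypothetical.

    ```python
    # Toy explicit-state model checking: breadth-first search over every
    # reachable state, flagging any state that violates a safety property.
    from collections import deque

    def successors(state):
        """Hypothetical satellite state: (mode, battery). Telecommands either
        return to safe mode (recharging) or run the payload (draining)."""
        mode, battery = state
        nxt = [("safe", min(battery + 1, 3))]
        if battery > 0:
            nxt.append(("payload", battery - 1))
        return nxt

    def check(initial, safe):
        seen, frontier = {initial}, deque([initial])
        while frontier:
            s = frontier.popleft()
            if not safe(s):
                return s                          # counterexample found
            for s2 in successors(s):
                if s2 not in seen:
                    seen.add(s2)
                    frontier.append(s2)
        return None

    bad = check(("safe", 3), safe=lambda s: s[1] > 0 or s[0] == "safe")
    print("counterexample:" if bad else "all reachable states safe", bad or "")
    ```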

  8. Real Time Data Management for Estimating Probabilities of Incidents and Near Misses

    NASA Astrophysics Data System (ADS)

    Stanitsas, P. D.; Stephanedes, Y. J.

    2011-08-01

    Advances in real-time data collection, data storage and computational systems have led to development of algorithms for transport administrators and engineers that improve traffic safety and reduce cost of road operations. Despite these advances, problems in effectively integrating real-time data acquisition, processing, modelling and road-use strategies at complex intersections and motorways remain. These are related to increasing system performance in identification, analysis, detection and prediction of traffic state in real time. This research develops dynamic models to estimate the probability of road incidents, such as crashes and conflicts, and incident-prone conditions based on real-time data. The models support integration of anticipatory information and fee-based road use strategies in traveller information and management. Development includes macroscopic/microscopic probabilistic models, neural networks, and vector autoregressions tested via machine vision at EU and US sites.

  9. Capability of a Mobile Monitoring System to Provide Real-Time Data Broadcasting and Near Real-Time Source Attribution

    NASA Astrophysics Data System (ADS)

    Erickson, M.; Olaguer, J.; Wijesinghe, A.; Colvin, J.; Neish, B.; Williams, J.

    2014-12-01

    It is becoming increasingly important to understand the emissions and health effects of industrial facilities. Many areas have no or limited sustained monitoring capabilities, making it difficult to quantify the major pollution sources affecting human health, especially in fence line communities. Developments in real-time monitoring and micro-scale modeling offer unique ways to tackle these complex issues. This presentation will demonstrate the capability of coupling real-time observations with micro-scale modeling to provide real-time information and near real-time source attribution. The Houston Advanced Research Center constructed the Mobile Acquisition of Real-time Concentrations (MARC) laboratory. MARC consists of a Ford E-350 passenger van outfitted with a Proton Transfer Reaction Mass Spectrometer (PTR-MS) and meteorological equipment. This allows for the fast measurement of various VOCs important to air quality. The data recorded from the van is uploaded to an off-site database and the information is broadcast to a website in real-time. This provides for off-site monitoring of MARC's observations, which allows off-site personnel to provide immediate input to the MARC operators on how to best achieve project objectives. The information stored in the database can also be used to provide near real-time source attribution. An inverse model has been used to ascertain the amount, location, and timing of emissions based on MARC measurements in the vicinity of industrial sites. The inverse model is based on a 3D micro-scale Eulerian forward and adjoint air quality model known as the HARC model. The HARC model uses output from the Quick Urban and Industrial Complex (QUIC) wind model and requires a 3D digital model of the monitored facility based on lidar or industrial permit data. MARC is one of the instrument platforms deployed during the 2014 Benzene and other Toxics Exposure Study (BEE-TEX) in Houston, TX. The main goal of the study is to quantify and explain the origin of ambient exposure to hazardous air pollutants in an industrial fence line community near the Houston Ship Channel. Preliminary results derived from analysis of MARC observations during the BEE-TEX experiment will be presented.

  10. Exercise recognition for Kinect-based telerehabilitation.

    PubMed

    Antón, D; Goñi, A; Illarramendi, A

    2015-01-01

    An aging population and higher survival rates from diseases and traumas that leave physical consequences are challenging aspects of efficient health management. This is why telerehabilitation systems are being developed: to allow monitoring and support of physiotherapy sessions at home, which could reduce healthcare costs while also improving the quality of life of the users. Our goal is the development of a Kinect-based algorithm that provides very accurate real-time monitoring of physical rehabilitation exercises and that also provides a friendly interface oriented both to users and physiotherapists. The two main constituents of our algorithm are the posture classification method and the exercise recognition method. The exercises consist of series of movements. Each movement is composed of an initial posture, a final posture and the angular trajectories of the limbs involved in the movement. The algorithm was designed and tested with datasets of real movements performed by volunteers. We also explain in the paper how we obtained the optimal trade-off values for posture and trajectory recognition. Two relevant aspects of the algorithm were evaluated in our tests: classification accuracy and real-time data processing. We achieved 91.9% accuracy in posture classification and 93.75% accuracy in trajectory recognition. We also checked whether the algorithm was able to process the data in real time. We found that our algorithm could process more than 20,000 postures per second and all the required trajectory data series in real time, which in practice guarantees no perceptible delays. Later on, we carried out two clinical trials with real patients who suffered shoulder disorders. We obtained an exercise monitoring accuracy of 95.16%. We present an exercise recognition algorithm that handles the data provided by Kinect efficiently. The algorithm has been validated in a real scenario where we have verified its suitability. Moreover, we have received positive feedback from both the users and the physiotherapists who took part in the tests.
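
    As a rough illustration of the per-frame posture-classification step (the paper's own classifier is not reproduced here), a nearest-centroid check on joint-angle feature vectors might look as follows; the postures, angles, and distance threshold are all invented.

    ```python
    # Illustrative nearest-centroid posture classification on joint angles.
    import numpy as np

    centroids = {"arm_raised": np.array([170.0, 90.0]),   # hypothetical postures,
                 "arm_down":   np.array([20.0, 175.0])}   # [shoulder, elbow] degrees

    def classify(angles, max_dist=30.0):
        """Return the closest reference posture, or 'unknown' if too far."""
        name, d = min(((k, np.linalg.norm(angles - c))
                       for k, c in centroids.items()), key=lambda kv: kv[1])
        return name if d <= max_dist else "unknown"

    print(classify(np.array([165.0, 95.0])))    # -> arm_raised
    ```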

  11. A real-time ionospheric model based on GNSS Precise Point Positioning

    NASA Astrophysics Data System (ADS)

    Tu, Rui; Zhang, Hongping; Ge, Maorong; Huang, Guanwen

    2013-09-01

    This paper proposes a method of real-time monitoring and modeling of the ionospheric Total Electron Content (TEC) by Precise Point Positioning (PPP). Firstly, the ionospheric TEC and the receiver's Differential Code Biases (DCB) are estimated from the undifferenced raw observations in real time; then the ionospheric TEC model is established based on the Single Layer Model (SLM) assumption and the recovered ionospheric TEC. In this study, phase observations with high precision are used directly, instead of phase-smoothed code observations. In addition, the DCB estimation is separated from the establishment of the ionospheric model, which limits the impact of the SLM assumption. The ionospheric model is established at every epoch for real-time application. The method is validated with three different GNSS networks on a local, regional, and global basis. The results show that the method is feasible and effective; the real-time ionosphere and DCB results are very consistent with the IGS final products, with biases of 1-2 TECU and 0.4 ns, respectively.
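
    A worked sketch of the standard dual-frequency geometry-free combination that underlies TEC recovery, plus the Single Layer Model mapping to vertical TEC. The observation values are made up, and the paper's PPP filter additionally estimates DCBs and exploits precise carrier phase.

    ```python
    # Slant TEC from the dual-frequency geometry-free combination, then the
    # SLM mapping to vertical TEC (illustrative values only).
    import math

    F1, F2 = 1575.42e6, 1227.60e6          # GPS L1/L2 frequencies [Hz]

    def slant_tec(p1, p2):
        """STEC [TECU] = f1^2 f2^2 / (40.3 (f1^2 - f2^2)) * (P2 - P1)."""
        coeff = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))
        return coeff * (p2 - p1) / 1e16     # 1 TECU = 1e16 electrons/m^2

    def tec_to_vertical(stec, elev_rad, h_ion=450e3, r_earth=6371e3):
        """SLM mapping: VTEC = STEC * cos(z') at the ionospheric pierce point."""
        sin_zp = r_earth / (r_earth + h_ion) * math.cos(elev_rad)
        return stec * math.sqrt(1.0 - sin_zp**2)

    stec = slant_tec(p1=22_000_000.0, p2=22_000_004.5)   # 4.5 m delay difference
    print(f"STEC = {stec:.1f} TECU, VTEC = {tec_to_vertical(stec, 0.9):.1f} TECU")
    ```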

  12. Extensions to the visual predictive check to facilitate model performance evaluation.

    PubMed

    Post, Teun M; Freijer, Jan I; Ploeger, Bart A; Danhof, Meindert

    2008-04-01

    The Visual Predictive Check (VPC) is a valuable and supportive instrument for evaluating model performance. However, in its most commonly applied form, the method largely depends on a subjective comparison of the distribution of the simulated data with the observed data, without explicitly quantifying and relating the information in both. In recent adaptations to the VPC this drawback is taken into consideration by presenting the observed and predicted data as percentiles. In addition, in some of these adaptations the uncertainty in the predictions is represented visually. However, it is not assessed whether the expected random distribution of the observations around the predicted median trend is realised in relation to the number of observations. Moreover, the influence of, and the information residing in, missing data at each time point is not taken into consideration. Therefore, in this investigation the VPC is extended with two methods to support a less subjective and thereby more adequate evaluation of model performance: (i) the Quantified Visual Predictive Check (QVPC) and (ii) the Bootstrap Visual Predictive Check (BVPC). The QVPC presents the distribution of the observations as a percentage, regardless of the density of the data, above and below the predicted median at each time point, while also visualising the percentage of unavailable data. The BVPC weighs the predicted median against the 5th, 50th and 95th percentiles resulting from a bootstrap of the observed data median at each time point, while accounting for the number and the theoretical position of unavailable data. The proposed extensions to the VPC are illustrated by a pharmacokinetic simulation example and applied to a pharmacodynamic disease progression example.
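
    A schematic numpy rendition of the two extensions, under simplifying assumptions (complete data and a flat predicted median); the paper's exact handling of missing observations is not reproduced here.

    ```python
    # QVPC: percentage of observations above the predicted median per time
    # point. BVPC: bootstrap the observed median and locate the prediction.
    import numpy as np

    rng = np.random.default_rng(2)
    obs = rng.normal(loc=1.0, scale=0.5, size=(40, 10))   # 40 subjects, 10 times
    pred_median = np.full(10, 1.0)                        # model-predicted median

    # QVPC-style tally at each time point
    above = 100 * (obs > pred_median).mean(axis=0)
    print("percent above predicted median:", above.round(1))

    # BVPC-style bootstrap of the observed median (1000 resamples)
    boot = np.array([np.median(obs[rng.integers(0, 40, 40)], axis=0)
                     for _ in range(1000)])
    lo, mid, hi = np.percentile(boot, [5, 50, 95], axis=0)
    flagged = (pred_median < lo) | (pred_median > hi)
    print("time points outside the 5th-95th band:", np.where(flagged)[0])
    ```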

  13. Hardware-In-The-Loop Power Extraction Using Different Real-Time Platforms (Postprint)

    DTIC Science & Technology

    2008-11-01

    each real-time operating system. However, discrepancies in test results obtained from the NI system can be resolved. This paper briefly details...same model in native Simulink. These results show that each real-time operating system can be configured to accurately run transient Simulink models

  14. A novel Online-to-Offline (O2O) model for pre-exposure prophylaxis and HIV testing scale up.

    PubMed

    Anand, Tarandeep; Nitpolprasert, Chattiya; Trachunthong, Deondara; Kerr, Stephen J; Janyam, Surang; Linjongrat, Danai; Hightow-Weidman, Lisa B; Phanuphak, Praphan; Ananworanich, Jintanat; Phanuphak, Nittaya

    2017-03-13

    PrEP awareness and uptake among men who have sex with men (MSM) and transgender women (TG) in Thailand remain low. Finding ways to increase HIV testing and PrEP uptake among high-risk groups is a critical priority. This study evaluates the effect of a novel Adam's Love Online-to-Offline (O2O) model on PrEP and HIV testing uptake among Thai MSM and TG and identifies factors associated with PrEP uptake. The O2O model was piloted by the Adam's Love (www.adamslove.org) HIV educational and counselling website. MSM and TG reached online by PrEP promotions and interested in free PrEP and/or HIV testing services contacted Adam's Love online staff, received real-time PrEP eCounseling, and completed online bookings to receive services at one of four sites in Bangkok based on their preference. Auto-generated site- and service-specific e-tickets and Quick Response (QR) codes were sent to their mobile devices, enabling monitoring and check-in by offline site staff. Service uptake and participants' socio-demographic and risk behaviour characteristics were analyzed. Factors associated with PrEP uptake were assessed using multiple logistic regression. Between January 10th and April 11th, 2016, Adam's Love reached 272,568 people online via the PrEP O2O promotions. 425 MSM and TG received eCounseling and e-tickets. There were 325 (76.5%) MSM and TG who checked in at clinics and received HIV testing. Nine (2.8%) were diagnosed with HIV infection. Median (IQR) time between receiving the e-ticket and checking in was 3 (0-7) days. Of 316 HIV-negative MSM and TG, 168 (53.2%) started PrEP. In a multivariate model, higher education (OR 2.30, 95% CI 1.14-4.66; p = 0.02), seeking sex partners online (OR 2.05, 95% CI 1.19-3.54; p = 0.009), being aware of sexual partners' HIV status (OR 2.37, 95% CI 1.29-4.35; p = 0.008), previous use of post-exposure prophylaxis (PEP) (OR 2.46, 95% CI 1.19-5.09; p = 0.01), and enrolment at the Adam's Love clinic compared to the other three sites (OR 3.79, 95% CI 2.06-6.95; p < 0.001) were independently associated with PrEP uptake. The Adam's Love O2O model is highly effective in linking online at-risk MSM and TG to PrEP and HIV testing services, and has high potential to be replicated and scaled up in other settings with high Internet penetration among key populations.

  15. Red Blood Cell Agglutination for Blood Typing Within Passive Microfluidic Biochips.

    PubMed

    Huet, Maxime; Cubizolles, Myriam; Buhot, Arnaud

    2018-04-19

    The pre-transfusion bedside compatibility test is mandatory to check that the donor and the recipient present compatible groups before any transfusion is performed. Although blood typing devices are on the market, they still suffer from various drawbacks, such as results based on naked-eye observation or difficulties in blood handling and process automation. In this study, we addressed the development of a red blood cell (RBC) agglutination assay for point-of-care blood typing. An injection-molded microfluidic chip, designed to enhance capillary flow, contained dried anti-A or anti-B reagents inside its microchannel. The only blood handling step in the assay protocol was the deposit of a blood drop at the tip of the biochip, after which imaging was performed. The embedded reagents were able to trigger RBC agglutination in situ, allowing us to monitor the whole process in real time. An image processing algorithm was developed on diluted blood to compute a real-time agglutination indicator and was further validated on undiluted blood. Through this proof of concept, we achieved efficient, automated, real-time, and quantitative measurement of agglutination inside a passive biochip for blood typing, which could be further generalized to blood biomarker detection and quantification.

  16. Cross-Dependency Inference in Multi-Layered Networks: A Collaborative Filtering Perspective.

    PubMed

    Chen, Chen; Tong, Hanghang; Xie, Lei; Ying, Lei; He, Qing

    2017-08-01

    The increasingly connected world has catalyzed the fusion of networks from different domains, which facilitates the emergence of a new network model: multi-layered networks. Examples of such network systems include critical infrastructure networks, biological systems, organization-level collaborations, cross-platform e-commerce, and so forth. One crucial structure that distinguishes multi-layered networks from other network models is the cross-layer dependency, which describes the associations between the nodes from different layers. Needless to say, the cross-layer dependency in the network plays an essential role in many data mining applications like system robustness analysis and complex network control. However, it remains a daunting task to know the exact dependency relationships due to noise, limited accessibility, and so forth. In this article, we tackle the cross-layer dependency inference problem by modeling it as a collective collaborative filtering problem. Based on this idea, we propose an effective algorithm, Fascinate, that can reveal unobserved dependencies with linear complexity. Moreover, we derive Fascinate-ZERO, an online variant of Fascinate that can respond to a newly added node in a timely manner by checking its neighborhood dependencies. We perform extensive evaluations on real datasets to substantiate the superiority of our proposed approaches.
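
    Not the Fascinate algorithm itself, but a minimal collaborative-filtering analogue of its core idea: factorize a partially observed cross-layer dependency matrix into low-rank factors and score unobserved entries from the completed product. Dimensions, rank, and learning rate below are arbitrary choices.

    ```python
    # Low-rank completion of a partially observed dependency matrix D ~ F @ G.T
    import numpy as np

    rng = np.random.default_rng(3)
    n, m, r = 20, 15, 4                       # layer-1 nodes, layer-2 nodes, rank
    D_true = rng.random((n, r)) @ rng.random((r, m))
    mask = rng.random((n, m)) < 0.3           # only 30% of dependencies observed

    F, G = rng.random((n, r)), rng.random((m, r))
    lr, lam = 0.05, 0.1
    for _ in range(500):                      # gradient descent on observed cells
        E = mask * (D_true - F @ G.T)         # residual on observed entries only
        F += lr * (E @ G - lam * F)
        G += lr * (E.T @ F - lam * G)

    rmse = np.sqrt((((F @ G.T - D_true) ** 2)[~mask]).mean())
    print("RMSE on held-out dependencies:", round(float(rmse), 3))
    ```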

  17. A high fidelity real-time simulation of a small turboshaft engine

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.

    1988-01-01

    A high-fidelity component-type model and real-time digital simulation of the General Electric T700-GE-700 turboshaft engine were developed for use with current generation real-time blade-element rotor helicopter simulations. A control system model based on the specification fuel control system used in the UH-60A Black Hawk helicopter is also presented. The modeling assumptions and real-time digital implementation methods particular to the simulation of small turboshaft engines are described. The validity of the simulation is demonstrated by comparison with analysis-oriented simulations developed by the manufacturer, available test data, and flight-test time histories.

  18. Checking in: Location Services for Libraries

    ERIC Educational Resources Information Center

    Rethlefsen, Melissa L.

    2010-01-01

    As always in real estate, everything in technology these days seems to be about location. From Google's recent addition of a "nearby" location limit to its suite of search options to the spate of location-based mobile apps and social networks, physical proximity has become the linchpin of a trend to connect technology to the real world. In this…

  19. Integrated Formal Analysis of Time-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Natarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Time-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.

  20. Model Checking Abstract PLEXIL Programs with SMART

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.

    2007-01-01

    We describe a method to automatically generate discrete-state models of abstract Plan Execution Interchange Language (PLEXIL) programs that can be analyzed using model checking tools. Starting from a high-level description of a PLEXIL program, or a family of programs with common characteristics, the generator lays the framework that models the principles of program execution. The concrete parts of the program are not automatically generated but must be introduced by the modeler by hand. As a case study, we generate models to verify properties of the PLEXIL macro constructs that are introduced as shorthand notation. After an exhaustive analysis, we conclude that the macro definitions obey the intended semantics and behave as expected, contingent on a few specific requirements on the timing semantics of micro-steps in the concrete executive implementation.

  1. Verifying Multi-Agent Systems via Unbounded Model Checking

    NASA Technical Reports Server (NTRS)

    Kacprzak, M.; Lomuscio, A.; Lasica, T.; Penczek, W.; Szreter, M.

    2004-01-01

    We present an approach to the problem of verification of epistemic properties in multi-agent systems by means of symbolic model checking. In particular, it is shown how to extend the technique of unbounded model checking from a purely temporal setting to a temporal-epistemic one. In order to achieve this, we base our discussion on interpreted systems semantics, a popular semantics used in multi-agent systems literature. We give details of the technique and show how it can be applied to the well known train, gate and controller problem. Keywords: model checking, unbounded model checking, multi-agent systems

  2. Rapid Modeling of and Response to Large Earthquakes Using Real-Time GPS Networks (Invited)

    NASA Astrophysics Data System (ADS)

    Crowell, B. W.; Bock, Y.; Squibb, M. B.

    2010-12-01

    Real-time GPS networks have the advantage of capturing motions throughout the entire earthquake cycle (interseismic, seismic, coseismic, postseismic), and because of this they are ideal for real-time monitoring of fault slip in a region. Real-time GPS networks provide the perfect supplement to seismic networks, which operate with lower noise and higher sampling rates than GPS networks but only measure accelerations or velocities, putting them at a severe disadvantage for ascertaining the full extent of slip during a large earthquake in real time. Here we report on two examples of rapid modeling of recent large earthquakes near large regional real-time GPS networks. The first utilizes Japan’s GEONET, consisting of about 1200 stations, during the 2003 Mw 8.3 Tokachi-Oki earthquake about 100 km offshore Hokkaido Island; the second investigates the 2010 Mw 7.2 El Mayor-Cucapah earthquake recorded by more than 100 stations in the California Real Time Network. The principal components of strain were computed throughout the networks and utilized as a trigger to initiate earthquake modeling. Total displacement waveforms were then computed in a simulated real-time fashion using a real-time network adjustment algorithm that fixes a station far away from the rupture to obtain a stable reference frame. Initial peak ground displacement measurements can then be used to obtain an initial estimate of earthquake size through scaling relationships. Finally, a full coseismic model of the event can be run minutes after the event, given predefined fault geometries, allowing emergency first responders and researchers to pinpoint the regions of highest damage. Furthermore, we are also investigating the use of total displacement waveforms for real-time moment tensor inversions to look at spatiotemporal variations in slip.
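
    The scaling-relationship step can be sketched as inverting a log-linear peak-ground-displacement (PGD) law for magnitude; the coefficients below are purely illustrative placeholders, not the calibrated values used in operational systems.

    ```python
    # Invert a hypothetical log-linear PGD scaling law for magnitude:
    # log10(PGD) = a + b*M + c*M*log10(R), solved for M.
    import math

    def magnitude_from_pgd(pgd_cm, dist_km, a=-5.0, b=1.0, c=-0.14):
        """Initial magnitude estimate from PGD [cm] and distance R [km];
        a, b, c are placeholder coefficients, not calibrated values."""
        return (math.log10(pgd_cm) - a) / (b + c * math.log10(dist_km))

    print(f"M ~ {magnitude_from_pgd(pgd_cm=15.0, dist_km=100.0):.1f}")
    ```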

  3. Network Reduction Algorithm for Developing Distribution Feeders for Real-Time Simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagarajan, Adarsh; Nelson, Austin A; Prabakar, Kumaraguru

    As advanced grid-support functions (AGF) become more widely used in grid-connected photovoltaic (PV) inverters, utilities are increasingly interested in their impacts when implemented in the field. These effects can be understood by modeling feeders in real-time simulators and testing PV inverters using power hardware-in-the-loop (PHIL) techniques. This paper presents a novel feeder model reduction algorithm using a ruin-and-reconstruct methodology that enables large feeders to be solved and operated on real-time computing platforms. Two Hawaiian Electric feeder models in Synergi Electric's load flow software were converted to reduced-order models in OpenDSS and subsequently implemented on the OPAL-RT real-time digital testing platform. Smart PV inverters were added to the real-time model, with AGF responses modeled after characterizing commercially available hardware inverters. Finally, hardware inverters were tested in conjunction with the real-time model using PHIL techniques so that the effects of AGFs on the feeders could be analyzed.

  4. Design, implementation and evaluation of an independent real-time safety layer for medical robotic systems using a force-torque-acceleration (FTA) sensor.

    PubMed

    Richter, Lars; Bruder, Ralf

    2013-05-01

    Most medical robotic systems require direct interaction or contact with the robot. Force-torque (FT) sensors can easily be mounted to the robot to control the contact pressure. However, evaluation is often done in software, which leads to latencies. To overcome this, we developed an independent safety system, named the FTA sensor, which is based on an FT sensor and an accelerometer. An embedded system (ES) runs a real-time monitoring system that continuously checks the readings. In case of a collision or error, it instantaneously stops the robot via the robot's external emergency stop. We found that the ES implementing the FTA sensor has a maximum latency of [Formula: see text] ms to trigger the robot's emergency stop. With the standard settings for the application of robotized transcranial magnetic stimulation, the robot will stop within at most 4 mm. Therefore, the FTA sensor works as an independent safety layer preventing patient and/or operator from serious harm.
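
    A schematic of the monitoring loop (all limits and interfaces below are hypothetical): each cycle, the embedded system compares force, torque, and acceleration readings against safety limits and latches the external emergency stop on any violation.

    ```python
    # Watchdog-style safety loop over force/torque/acceleration readings.
    F_MAX_N, T_MAX_NM, A_MAX_MS2 = 25.0, 3.0, 15.0      # assumed safety limits

    def within_limits(force, torque, accel):
        return (abs(force) <= F_MAX_N and abs(torque) <= T_MAX_NM
                and abs(accel) <= A_MAX_MS2)

    def monitor(samples, trigger_estop):
        for force, torque, accel in samples:             # one reading per cycle
            if not within_limits(force, torque, accel):
                trigger_estop()                          # robot's external e-stop
                return False
        return True

    ok = monitor([(5.0, 0.5, 1.0), (30.0, 0.5, 1.0)],
                 trigger_estop=lambda: print("EMERGENCY STOP"))
    print("run completed safely:", ok)
    ```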

  5. Earth Observation Data Quality Monitoring and Control: A Case Study of STAR Central Data Repository

    NASA Astrophysics Data System (ADS)

    Han, W.; Jochum, M.

    2017-12-01

    Earth observation data quality is very important for researchers and decision makers involved in weather forecasting, severe weather warning, disaster and emergency response, environmental monitoring, etc. Monitoring and controlling earth observation data quality, especially accuracy, completeness, and timeliness, is very useful in data management and governance to optimize data flow, discover potential transmission issues, and better connect data providers and users. Taking a centralized near real-time satellite data repository, the STAR (Center for Satellite Applications and Research of NOAA) Central Data Repository (SCDR), as an example, this paper describes the development of a new mechanism to verify data integrity, check data completeness, and monitor data latency in an operational data management system. Such quality monitoring and control of large-volume satellite data helps data providers and managers improve the transmission of near real-time satellite data, enhance its acquisition and management, and overcome performance and management issues to better serve research and development activities.
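
    A bare-bones illustration of the three checks named above, with invented field names and values: integrity via checksum comparison, completeness as a received-versus-expected granule count, and latency as receipt time minus observation time.

    ```python
    # Minimal data-quality checks for an incoming satellite granule.
    import hashlib
    import datetime as dt

    def integrity_ok(payload: bytes, expected_md5: str) -> bool:
        return hashlib.md5(payload).hexdigest() == expected_md5

    def latency_s(observed: dt.datetime, received: dt.datetime) -> float:
        return (received - observed).total_seconds()

    def completeness_pct(n_received: int, n_expected: int) -> float:
        return 100.0 * n_received / n_expected

    payload = b"granule bytes..."                       # placeholder data
    print(integrity_ok(payload, hashlib.md5(payload).hexdigest()))
    print(latency_s(dt.datetime(2017, 7, 1, 12, 0),
                    dt.datetime(2017, 7, 1, 12, 40)))   # 2400.0 seconds
    print(f"{completeness_pct(94, 96):.1f}% complete")
    ```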

  6. Integrated active mixing and biosensing using low frequency vibrating mixer and Love-wave sensor for real time detection of antibody binding event

    NASA Astrophysics Data System (ADS)

    Kardous, F.; El Fissi, L.; Friedt, J.-M.; Bastien, F.; Boireau, W.; Yahiaoui, R.; Manceau, J.-F.; Ballandras, S.

    2011-05-01

    The development of lab-on-chip devices is expected to dramatically change biochemical analyses, allowing for a notable increase of processing quality and throughput, provided the induced chemical reactions are well controlled. In this work, we investigate the impact of local acoustic mixing to promote or accelerate such biochemical reactions, such as antibody grafting on activated surfaces. During microarray building, the spotting mode leads to low efficiency in ligand grafting and to heterogeneities, which limit its performance. To improve the transfer rate, we induce a hydrodynamic flow in the spotted droplet to disrupt the steady state during antibody grafting. To prove that acoustic mixing increases the antibody transfer rate to the biochip surface, we used a Love-wave sensor allowing for real-time monitoring of the biological reaction under different operating conditions (with or without mixing). An analysis of the impact of the proposed mixing on grafting kinetics is presented and finally checked in the case of antibody-antigen binding.

  7. EasyLCMS: an asynchronous web application for the automated quantification of LC-MS data

    PubMed Central

    2012-01-01

    Background: Downstream applications in metabolomics, as well as mathematical modelling, require data in a quantitative format, which may also necessitate the automated and simultaneous quantification of numerous metabolites. Although numerous applications have been previously developed for metabolomics data handling, automated calibration and calculation of concentrations in terms of μmol have not been carried out. Moreover, most metabolomics applications are designed for GC-MS and would not be suitable for LC-MS, since in LC the deviation in retention time is not linear, which these applications do not take into account. Moreover, only a few are web-based applications, which could improve on stand-alone software in terms of compatibility, sharing capabilities and hardware requirements, even though high bandwidth is required. Furthermore, none of these incorporate asynchronous communication to allow real-time interaction with pre-processed results. Findings: Here, we present EasyLCMS (http://www.easylcms.es/), a new application for automated quantification which was validated using more than 1000 concentration comparisons between real samples and manual operation. The results showed that only 1% of the quantifications presented a relative error higher than 15%. Using clustering analysis, the metabolites with the highest relative error distributions were identified and studied to solve recurrent mistakes. Conclusions: EasyLCMS is a new web application designed to quantify numerous metabolites simultaneously, integrating LC distortions and asynchronous web technology to present a visual interface with dynamic interaction which allows checking and correction of LC-MS raw data pre-processing results. Moreover, quantified data obtained with EasyLCMS are fully compatible with numerous downstream applications, as well as with mathematical modelling in the systems biology field. PMID:22884039

  8. Accelerated test system strength models based on Birnbaum-Saunders distribution: a complete Bayesian analysis and comparison.

    PubMed

    Upadhyay, S K; Mukherjee, Bhaswati; Gupta, Ashutosh

    2009-09-01

    Several models for studies related to the tensile strength of materials have been proposed in the literature, where the size or length component is taken to be an important factor in studying specimens' failure behaviour. An important model, developed on the basis of the cumulative damage approach, is the three-parameter extension of the Birnbaum-Saunders fatigue model that incorporates the size of the specimen as an additional variable. This model is a strong competitor of the commonly used Weibull model and performs better than the traditional models, which do not incorporate the size effect. The paper considers two such cumulative damage models, checks their compatibility with a real dataset, compares them with some recent toolkits, and finally recommends the model that appears most appropriate. Throughout, the study is Bayesian, based on Markov chain Monte Carlo simulation.

  9. Real-time control of “es-dawet” mixer using dashboard based on PLC and WSN

    NASA Astrophysics Data System (ADS)

    Siagian, Pandapotan; Hutauruk, Sindak; Kisno

    2017-09-01

    The aim of this study is to monitor and acquire remote parameters, such as the speed of a DC motor, IR sensor readings, and the temperature of a pasteurized ice-cream mix, and to send these values in real time over a wireless network. The proposed system is a dashboard for monitoring a PLC-based system wirelessly using the ZigBee protocol. To implement this, a ZigBee module is connected to a programmed digital signal controller, which transmits the data to a ZigBee coordinator connected to a PC through RS232 serial communication. An operator need only send a command for the process to be carried out; the PLC checks the status sent through the web interface and acts accordingly, which is valuable where wired communication is either more expensive or impossible due to physical conditions. The result is a low-cost system that measures motor parameters, such as IR sensor readings, DC motor speed controlled by PWM, and temperature, over the ZigBee link. A database is built to carry out monitoring and to save the motor parameters received by the radio frequency (RF) data acquisition system. Experimental results show that the proposed system is inexpensive, provides high accuracy, is safe, and offers a visual environment.

  10. High-precision real-time 3D shape measurement based on a quad-camera system

    NASA Astrophysics Data System (ADS)

    Tao, Tianyang; Chen, Qian; Feng, Shijie; Hu, Yan; Zhang, Minliang; Zuo, Chao

    2018-01-01

    Phase-shifting profilometry (PSP) based 3D shape measurement is well established in various applications due to its high accuracy, simple implementation, and robustness to environmental illumination and surface texture. In PSP, higher depth resolution generally requires a higher fringe density in the projected patterns, which, in turn, leads to severe phase ambiguities that must be solved with additional information from phase coding and/or geometric constraints. However, in order to guarantee the reliability of phase unwrapping, available techniques are usually accompanied by an increased number of patterns, a reduced fringe amplitude, and complicated post-processing algorithms. In this work, we demonstrate that by using a quad-camera multi-view fringe projection system and carefully arranging the relative spatial positions between the cameras and the projector, it becomes possible to completely eliminate the phase ambiguities in conventional three-step PSP patterns with high fringe density without projecting any additional patterns or embedding any auxiliary signals. Benefiting from the position-optimized quad-camera system, stereo phase unwrapping can be performed efficiently and reliably by flexible phase consistency checks. Besides, the redundant information of multiple phase consistency checks is fully used through a weighted phase difference scheme to further enhance the reliability of phase unwrapping. This paper explains the 3D measurement principle and the basic design of the quad-camera system, and finally demonstrates that in a large measurement volume of 200 mm × 200 mm × 400 mm, the resultant dynamic 3D sensing system can realize real-time 3D reconstruction at 60 frames per second with a depth precision of 50 μm.
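
    A worked sketch of the three-step phase-shifting computation the system builds on: recover the wrapped phase from intensities I1..I3 captured with phase shifts of -2π/3, 0, +2π/3. The unwrapping via quad-camera geometric consistency checks is the paper's contribution and is omitted here; the fringe images below are synthetic.

    ```python
    # Wrapped phase from three-step PSP: phi = atan2(sqrt(3)*(I1 - I3),
    # 2*I2 - I1 - I3), which follows from I_k = a + b*cos(phi + delta_k).
    import numpy as np

    def wrapped_phase(i1, i2, i3):
        return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

    # Synthetic fringes over one pixel row, true phase ramp 0..4*pi
    phi_true = np.linspace(0, 4 * np.pi, 640)
    a, b = 120.0, 100.0                                   # background, modulation
    i1, i2, i3 = (a + b * np.cos(phi_true + s)
                  for s in (-2 * np.pi / 3, 0.0, 2 * np.pi / 3))

    phi = wrapped_phase(i1, i2, i3)
    err = np.abs(np.angle(np.exp(1j * (phi - phi_true)))).max()
    print("max wrap-aware error:", err)                   # ~ numerical precision
    ```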

  11. SHARD - a SeisComP3 module for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Weber, B.; Becker, J.; Ellguth, E.; Henneberger, R.; Herrnkind, S.; Roessler, D.

    2016-12-01

    Monitoring building and structure response to strong earthquake ground shaking or human-induced vibrations in real time forms the backbone of modern structural health monitoring (SHM). Continuous data transmission, processing and analysis drastically reduces the time decision makers need to plan an appropriate response to possible damage of high-priority buildings and structures. SHARD is a web-browser-based module using the SeisComP3 framework to monitor the structural health of buildings and other structures by calculating standard engineering seismology parameters and checking their exceedance in real time. Thresholds can be defined, e.g. compliant with national building codes (IBC2000, DIN4149 or EC8), for PGA/PGV/PGD, response spectra and drift ratios. In case thresholds are exceeded, automatic or operator-driven reports are generated and sent to the decision makers. SHARD also determines waveform quality in terms of data delay and variance to report sensor status. SHARD is well suited for civil protection authorities to simultaneously monitor multiple city-wide critical infrastructures such as hospitals, schools and governmental buildings, and structures such as bridges, dams and power substations.
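
    An illustrative exceedance check in the spirit of the threshold logic described above; the thresholds here are placeholders, not values from any building code, and PGV is obtained by crude rectangular integration.

    ```python
    # Compute PGA/PGV from a synthetic acceleration trace and flag exceedances.
    import numpy as np

    THRESHOLDS = {"PGA_ms2": 1.0, "PGV_ms": 0.05}         # assumed alarm levels

    def check_exceedance(accel, dt):
        vel = np.cumsum(accel) * dt                       # crude integration
        measures = {"PGA_ms2": np.abs(accel).max(),
                    "PGV_ms": np.abs(vel).max()}
        return {k: (v, v > THRESHOLDS[k]) for k, v in measures.items()}

    t = np.arange(0, 10, 0.01)
    accel = 1.2 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-t)  # synthetic shaking
    for name, (value, exceeded) in check_exceedance(accel, 0.01).items():
        print(f"{name}: {value:.3f} exceeded={exceeded}")
    ```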

  12. Real-Time Evaluation of Breast Self-Examination Using Computer Vision

    PubMed Central

    Mohammadi, Eman; Dadios, Elmer P.; Gan Lim, Laurence A.; Cabatuan, Melvin K.; Naguib, Raouf N. G.; Avila, Jose Maria C.; Oikonomou, Andreas

    2014-01-01

    Breast cancer is the most common cancer among women worldwide, and breast self-examination (BSE) is considered the most cost-effective approach for early breast cancer detection. The general objective of this paper is to design and develop a computer vision algorithm to evaluate BSE performance in real time. The first stage of the algorithm presents a method for detecting and tracking the nipples in frames while a woman performs BSE; the second stage presents a method for localizing the breast region and blocks of pixels related to palpation of the breast, and the third stage focuses on detecting the palpated blocks in the breast region. The palpated blocks are highlighted at the time of BSE performance. In a correct BSE performance, all blocks must be palpated, checked, and highlighted, respectively. If any abnormality, such as a mass, is detected, then this must be reported to a doctor to confirm the presence of the abnormality and to proceed with other confirmatory tests. The experimental results have shown that the BSE evaluation algorithm presented in this paper provides robust performance. PMID:25435860

  13. Real-time evaluation of breast self-examination using computer vision.

    PubMed

    Mohammadi, Eman; Dadios, Elmer P; Gan Lim, Laurence A; Cabatuan, Melvin K; Naguib, Raouf N G; Avila, Jose Maria C; Oikonomou, Andreas

    2014-01-01

    Breast cancer is the most common cancer among women worldwide, and breast self-examination (BSE) is considered the most cost-effective approach for early breast cancer detection. The general objective of this paper is to design and develop a computer vision algorithm to evaluate BSE performance in real time. The first stage of the algorithm presents a method for detecting and tracking the nipples in frames while a woman performs BSE; the second stage presents a method for localizing the breast region and blocks of pixels related to palpation of the breast, and the third stage focuses on detecting the palpated blocks in the breast region. The palpated blocks are highlighted at the time of BSE performance. In a correct BSE performance, all blocks must be palpated, checked, and highlighted, respectively. If any abnormality, such as a mass, is detected, then this must be reported to a doctor to confirm the presence of the abnormality and to proceed with other confirmatory tests. The experimental results have shown that the BSE evaluation algorithm presented in this paper provides robust performance.

  14. Weekly Checks Improve Real-Time Prehospital ECG Transmission in Suspected STEMI.

    PubMed

    D'Arcy, Nicole T; Bosson, Nichole; Kaji, Amy H; Bui, Quang T; French, William J; Thomas, Joseph L; Elizarraraz, Yvonne; Gonzalez, Natalia; Garcia, Jose; Niemann, James T

    2018-06-01

    Introduction: Field identification of ST-elevation myocardial infarction (STEMI) and advanced hospital notification decreases first-medical-contact-to-balloon (FMC2B) time. A recent study in this system found that electrocardiogram (ECG) transmission following a STEMI alert was frequently unsuccessful. Hypothesis: Instituting weekly test ECG transmissions from paramedic units to the hospital would increase successful transmission of ECGs and decrease FMC2B and door-to-balloon (D2B) times. This was a natural experiment of consecutive patients with field-identified STEMI transported to a single percutaneous coronary intervention (PCI)-capable hospital in a regional STEMI system before and after implementation of scheduled test ECG transmissions. In November 2014, paramedic units began weekly test transmissions. The mobile intensive care nurse (MICN) confirmed the transmission, or if not received, contacted the paramedic unit and the department's nurse educator to identify and resolve the problem. Per system-wide protocol, paramedics transmit all ECGs with interpretation of STEMI. Receiving hospitals submit patient data to a single registry as part of ongoing system quality improvement. The frequency of successful ECG transmission and time to intervention (FMC2B and D2B times) in the 18 months following implementation was compared to the 10 months prior. Post-implementation, the time the ECG transmission was received was also collected to determine the transmission gap time (time from ECG acquisition to ECG transmission received) and the advanced notification time (time from ECG transmission received to patient arrival). There were 388 patients with field ECG interpretations of STEMI, 131 pre-intervention and 257 post-intervention. The frequency of successful transmission post-intervention was 73% compared to 64% prior; risk difference (RD)=9%; 95% CI, 1-18%. In the post-intervention period, the median FMC2B time was 79 minutes (inter-quartile range [IQR]=68-102) versus 86 minutes (IQR=71-108) pre-intervention (P=.3) and the median D2B time was 59 minutes (IQR=44-74) versus 60 minutes (IQR=53-88) pre-intervention (P=.2). The median transmission gap was three minutes (IQR=1-8) and the median advanced notification time was 16 minutes (IQR=10-25). Implementation of weekly test ECG transmissions was associated with improvement in successful real-time transmissions from field to hospital, which provided a median advanced notification time of 16 minutes, but no decrease in FMC2B or D2B times. D'Arcy NT, Bosson N, Kaji AH, Bui QT, French WJ, Thomas JL, Elizarraraz Y, Gonzalez N, Garcia J, Niemann JT. Weekly checks improve real-time prehospital ECG transmission in suspected STEMI. Prehosp Disaster Med. 2018;33(3):245-249.

  15. Sci-Fin: Visual Mining Spatial and Temporal Behavior Features from Social Media

    PubMed Central

    Pu, Jiansu; Teng, Zhiyao; Gong, Rui; Wen, Changjiang; Xu, Yang

    2016-01-01

    Check-in records are usually available in social services, which offers us the opportunity to capture and analyze users’ spatial and temporal behaviors. Mining such behavior features is essential to social analysis and business intelligence. However, the complexity and incompleteness of check-in records make such a task challenging. Departing from previous work on social behavior analysis, in this paper we present a visual analytics system, Social Check-in Fingerprinting (Sci-Fin), to facilitate the analysis and visualization of social check-in data. We focus on three major components of user check-in data: location, activity, and profile. Visual fingerprints for location, activity, and profile are designed to intuitively represent the high-dimensional attributes. To visually mine and demonstrate the behavior features, we integrate WorldMapper and Voronoi Treemap into our glyph-like designs. Such visual fingerprint designs offer us the opportunity to summarize the interesting features and patterns from different check-in locations, activities and users (groups). We demonstrate the effectiveness and usability of our system by conducting extensive case studies on real check-in data collected from a popular microblogging service. Interesting findings are reported and discussed at the end. PMID:27999398

  16. Sci-Fin: Visual Mining Spatial and Temporal Behavior Features from Social Media.

    PubMed

    Pu, Jiansu; Teng, Zhiyao; Gong, Rui; Wen, Changjiang; Xu, Yang

    2016-12-20

    Check-in records are usually available in social services, which offers us the opportunity to capture and analyze users' spatial and temporal behaviors. Mining such behavior features is essential to social analysis and business intelligence. However, the complexity and incompleteness of check-in records make such a task challenging. Departing from previous work on social behavior analysis, in this paper we present a visual analytics system, Social Check-in Fingerprinting (Sci-Fin), to facilitate the analysis and visualization of social check-in data. We focus on three major components of user check-in data: location, activity, and profile. Visual fingerprints for location, activity, and profile are designed to intuitively represent the high-dimensional attributes. To visually mine and demonstrate the behavior features, we integrate WorldMapper and Voronoi Treemap into our glyph-like designs. Such visual fingerprint designs offer us the opportunity to summarize the interesting features and patterns from different check-in locations, activities and users (groups). We demonstrate the effectiveness and usability of our system by conducting extensive case studies on real check-in data collected from a popular microblogging service. Interesting findings are reported and discussed at the end.

  17. "It's about Improving My Practice": The Learner Experience of Real-Time Coaching

    ERIC Educational Resources Information Center

    Sharplin, Erica J.; Stahl, Garth; Kehrwald, Ben

    2016-01-01

    This article reports on pre-service teachers' experience of the Real-Time Coaching model, an innovative technology-based approach to teacher training. The Real-Time Coaching model uses multiple feedback cycles via wireless technology to develop within pre-service teachers the specific skills and mindset toward continual improvement. Results of…

  18. [Model for unplanned self extubation of ICU patients using system dynamics approach].

    PubMed

    Song, Yu Gil; Yun, Eun Kyoung

    2015-04-01

    In this study a system dynamics methodology was used to identify correlations and nonlinear feedback structure among factors affecting unplanned extubation (UE) of ICU patients, and to construct and verify a simulation model. Factors affecting UE were identified through a theoretical background established by reviewing the literature and preceding studies and by referencing various statistical data. Related variables were selected through verification of content validity by an expert group. A causal loop diagram (CLD) was made based on the variables. Stock & flow modeling using Vensim PLE Plus Version 6.0b was performed to establish a model for UE. Based on the literature review and expert verification, 18 variables associated with UE were identified and a CLD was prepared. From the CLD, a model was developed by converting it to a stock & flow diagram. Results of the simulation showed that patient stress, patient agitation, restraint application, patient movability, and individual intensive nursing were the variables with the greatest effect on UE probability. To verify agreement of the UE model with real situations, simulation with 5 cases was performed. An equation check and a sensitivity analysis on TIME STEP were executed to validate model integrity. The results show that identification of a proper model enables prediction of UE probability. This prediction allows for adjustment of related factors and provides basic data to develop nursing interventions that decrease UE.
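
    A toy stock-and-flow integration (explicit Euler) echoing the structure of such a model; the variables, feedback terms, and coefficients below are illustrative inventions, not the study's calibrated equations.

    ```python
    # Euler integration of a two-stock system: agitation/stress and UE risk.
    def simulate_ue(hours=48, dt=0.1, restraint=0.5, nursing=0.6):
        stress, ue_prob = 0.2, 0.05            # stocks, both kept in [0, 1]
        t = 0.0
        while t < hours:
            # Feedback: restraints raise stress; intensive nursing lowers it.
            d_stress = 0.08 * restraint - 0.10 * nursing * stress
            # UE risk rises with stress and falls with nursing attention.
            d_ue = 0.05 * stress * (1 - ue_prob) - 0.04 * nursing * ue_prob
            stress = min(max(stress + d_stress * dt, 0.0), 1.0)
            ue_prob = min(max(ue_prob + d_ue * dt, 0.0), 1.0)
            t += dt
        return ue_prob

    print("48 h UE probability, high nursing:", round(simulate_ue(), 3))
    print("48 h UE probability, low nursing:",
          round(simulate_ue(nursing=0.2), 3))
    ```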

  19. Asymptotic Stability of Interconnected Passive Non-Linear Systems

    NASA Technical Reports Server (NTRS)

    Isidori, A.; Joshi, S. M.; Kelkar, A. G.

    1999-01-01

    This paper addresses the problem of stabilization of a class of internally passive non-linear time-invariant dynamic systems. A class of non-linear marginally strictly passive (MSP) systems is defined, which is less restrictive than input-strictly passive systems. It is shown that the interconnection of a non-linear passive system and a non-linear MSP system is globally asymptotically stable. The result generalizes and weakens the conditions of the passivity theorem, which requires one of the systems to be input-strictly passive. In the case of linear time-invariant systems, it is shown that the MSP property is equivalent to the marginally strictly positive real (MSPR) property, which is much simpler to check.
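
    The positive-real property mentioned at the end can be probed numerically for a stable LTI transfer function by sampling Re[G(jw)] over a frequency grid. This is a rough sufficiency check for illustration only; the marginally strict variant (MSPR) adds conditions not captured here.

    ```python
    # Sample Re[G(jw)] on a log-spaced grid to probe positive realness.
    import numpy as np

    def looks_positive_real(num, den, w=np.logspace(-3, 3, 2000)):
        """num, den: polynomial coefficients of G(s), highest order first."""
        s = 1j * w
        G = np.polyval(num, s) / np.polyval(den, s)
        return bool((G.real >= -1e-9).all())

    # G(s) = (s + 1) / (s^2 + 2s + 1) = 1 / (s + 1): positive real
    print(looks_positive_real([1.0, 1.0], [1.0, 2.0, 1.0]))   # -> True
    ```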

  20. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e., PPoF-based) modeling. This concept originated in artificial intelligence (AI) as a leading intelligent computational inference in the modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and modeling of interacting failure mechanisms within a dynamic engineering system. The autonomous property of intelligent agents is defined as the agents' ability to self-activate, deactivate or completely redefine their role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  1. Scaling in the Donangelo-Sneppen model for evolution of money

    NASA Astrophysics Data System (ADS)

    Stauffer, Dietrich; Radomski, Jan P.

    2001-03-01

    The evolution of money from unsuccessful barter attempts, as modeled by Donangelo and Sneppen, is modified by a deterministic instead of a probabilistic selection of the most desired product as money. We check in particular the characteristic times of the model as a function of system size.
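
    A compact toy re-implementation of the flavor of this modification (not the model's full dynamics): unmet desires are tallied during barter attempts, and the most desired product is promoted to money deterministically via an argmax rather than by a random draw. Agent counts, product counts, and update rules are invented.

    ```python
    # Toy barter dynamics with deterministic selection of the money good.
    import numpy as np

    rng = np.random.default_rng(4)
    n_agents, n_products, steps = 200, 10, 2000
    have = rng.integers(0, n_products, n_agents)
    want = rng.integers(0, n_products, n_agents)
    demand = np.zeros(n_products)

    for _ in range(steps):
        a, b = rng.integers(0, n_agents, 2)
        demand[want[a]] += 1                            # register unmet desire
        if have[b] == want[a] and have[a] == want[b]:   # double coincidence
            have[a], have[b] = have[b], have[a]
            want[a], want[b] = rng.integers(0, n_products, 2)

    money = int(np.argmax(demand))                      # deterministic selection
    print("emergent money candidate: product", money)
    ```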

  2. Application of real rock pore-throat statistics to a regular pore network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rakibul, M.; Sarker, H.; McIntyre, D.

    2011-01-01

    This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high-resolution micro-CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area, or the throat radius, was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail, and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, and groundwater flow. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug by both the pore network model and the experimental procedure. The shapes and sizes of the relative permeability curves were compared and analyzed; a good match was observed for the wetting-phase relative permeability, but for the non-wetting phase the simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the need for routine core analysis, which can be time consuming and expensive, so a numerical technique is expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to a more time-consuming one. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.
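
    A cartoon of the model-construction step, with invented parameters: draw throat radii from a fitted distribution onto a fixed regular lattice, so that only the radii vary between realizations, and summarize a Poiseuille-like conductance per realization.

    ```python
    # Sample throat radii onto a regular lattice for several realizations.
    import numpy as np

    rng = np.random.default_rng(5)
    nx, ny = 20, 20                              # hypothetical lattice size
    for realization in range(3):                 # the paper used 10 realizations
        # Lognormal radii around 10 um; two throats per lattice site.
        radii = rng.lognormal(mean=np.log(10e-6), sigma=0.4, size=(nx, ny, 2))
        g = radii**4                             # Poiseuille-like conductance ~ r^4
        print(f"realization {realization}: mean radius {radii.mean()*1e6:.1f} um, "
              f"relative conductance spread {g.std() / g.mean():.2f}")
    ```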

  3. Application of real rock pore-throat statistics to a regular pore network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, M.R.; McIntyre, D.; Ferer, M.

    2011-01-01

    This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high-resolution micro-CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area, or the throat radius, was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail, and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, and groundwater flow. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug by both the pore network model and the experimental procedure. The shapes and sizes of the relative permeability curves were compared and analyzed; a good match was observed for the wetting-phase relative permeability, but for the non-wetting phase the simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the need for routine core analysis, which can be time consuming and expensive, so a numerical technique is expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to a more time-consuming one. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.

  4. Implementing Model-Check for Employee and Management Satisfaction

    NASA Technical Reports Server (NTRS)

    Jones, Corey; LaPha, Steven

    2013-01-01

    This presentation will discuss methods to which ModelCheck can be implemented to not only improve model quality, but also satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.

  5. Real-Time Simulation of the X-33 Aerospace Engine

    NASA Technical Reports Server (NTRS)

    Aguilar, Robert

    1999-01-01

    This paper discusses the development and performance of the X-33 Aerospike Engine Real-Time Model. This model was developed for the purposes of control law development, six degree-of-freedom trajectory analysis, vehicle system integration testing, and hardware-in-the-loop controller verification. The Real-Time Model uses a time-step marching solution of non-linear differential equations representing the physical processes involved in the operation of a liquid propellant rocket engine, albeit in a simplified form. These processes include heat transfer, fluid dynamics, combustion, and turbomachine performance. Two engine models are typically employed in order to accurately model maneuvering and the powerpack-out condition, where the power section of one engine is used to supply propellants to both engines if one engine malfunctions. The X-33 Real-Time Model is compared to actual hot fire test data and has been found to be in good agreement.

  6. Application of troposphere model from NWP and GNSS data into real-time precise positioning

    NASA Astrophysics Data System (ADS)

    Wilgan, Karina; Hadas, Tomasz; Kazmierski, Kamil; Rohm, Witold; Bosy, Jaroslaw

    2016-04-01

    The tropospheric delay empirical models are usually functions of meteorological parameters (temperature, pressure and humidity). The application of standard atmosphere parameters or global models, such as the GPT (global pressure/temperature) model or the UNB3 (University of New Brunswick, version 3) model, may not be sufficient, especially for positioning in non-standard weather conditions. A possible solution is to use regional troposphere models based on real-time or near-real-time measurements. We implement a regional troposphere model into the PPP (Precise Point Positioning) software GNSS-WARP (Wroclaw Algorithms for Real-time Positioning) developed at Wroclaw University of Environmental and Life Sciences. The software is capable of processing static and kinematic multi-GNSS data in real-time and post-processing modes and takes advantage of final IGS (International GNSS Service) products as well as IGS RTS (Real-Time Service) products. A shortcoming of the PPP technique is the time required for the solution to converge. One of the reasons is the high correlation among the estimated parameters: troposphere delay, receiver clock offset and receiver height. To efficiently decorrelate these parameters, a significant change in satellite geometry is required. An alternative solution is to introduce an external high-quality regional troposphere delay model to constrain the troposphere estimates. The proposed model consists of zenith total delays (ZTD) and mapping functions calculated from meteorological parameters from the Numerical Weather Prediction model WRF (Weather Research and Forecasting) and from ZTDs from ground-based GNSS stations, using the least-squares collocation software COMEDIE (Collocation of Meteorological Data for Interpretation and Estimation of Tropospheric Pathdelays) developed at ETH Zurich.
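
    A common way to apply such an external product in PPP is to add the modelled ZTD as a weighted pseudo-observation on the troposphere parameter. The sketch below shows the idea on a toy two-parameter least-squares problem; the design matrix, sigmas and ZTD values are invented numbers, not GNSS-WARP internals.

```python
import numpy as np

# Toy least-squares: unknowns x = [height_correction, ztd].
# A, y: hypothetical GNSS design matrix and observed-minus-computed vector.
A = np.array([[1.0, 2.5],
              [1.0, 2.6],
              [1.0, 2.4]])
y = np.array([0.012, 0.015, 0.011])
W = np.eye(3) / 0.003**2              # observation weights (sigma = 3 mm)

# External regional model gives ztd_model with standard deviation sig_ztd;
# append it as a pseudo-observation to constrain the troposphere estimate.
ztd_model, sig_ztd = 0.0048, 0.001    # hypothetical values, metres
A_c = np.vstack([A, [0.0, 1.0]])
y_c = np.append(y, ztd_model)
W_c = np.zeros((4, 4))
W_c[:3, :3] = W
W_c[3, 3] = 1.0 / sig_ztd**2

x_hat = np.linalg.solve(A_c.T @ W_c @ A_c, A_c.T @ W_c @ y_c)
print("height correction, ztd:", x_hat)
```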

  7. A chest drainage system with a real-time pressure monitoring device

    PubMed Central

    Liu, Tsang-Pai; Huang, Tung-Sung; Liu, Hung-Chang; Chen, Chao-Hung

    2015-01-01

    Background: Tube thoracostomy is a common procedure. A chest bottle may be used to both collect fluids and monitor the recovery of the chest condition. The presence of the “tidaling phenomenon” in the bottle can be reflective of the extent of the patient’s recovery. Objectives: However, current practice essentially depends on gross observation of the bottle. The device used here is designed for real-time monitoring of changes in pleural pressure, to allow clinicians to objectively determine when the lung has recovered, which is crucially important in order to judge when to remove the chest tube. Methods: The device is made of a pressure sensor with an operating range between −100 and +100 cmH2O and an amplifier based on the “Wheatstone bridge” concept. Recording and analysis were performed with LabVIEW software. The data can be shown in real time on screen and also checked retrospectively. The device was connected to the second part of a three-bottle drain system by a three-way connector. Results: The test animals were two 40-kg pigs. We used a thoracoscopic procedure to create an artificial lung laceration with endoscopic scissors. Active air leaks could result in a vigorous tidaling phenomenon of up to 20 cmH2O. In the absence of a gross tidaling phenomenon, the pressure changes were around 0.25 cmH2O. Conclusions: This real-time pleural pressure monitoring device can help clinicians objectively judge the extent of recovery of the chest condition. It can be used as an effective adjunct to the current chest drain system. PMID:26380726
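
    As a rough illustration of the signal chain described above, the sketch below converts an amplified bridge output to pressure and estimates the peak-to-peak "tidaling" amplitude over sliding windows. The calibration constant, excitation voltage and window length are hypothetical placeholders, not values from the paper.

```python
import numpy as np

def bridge_to_pressure(v_out, v_exc=5.0, sensitivity=0.002):
    """Convert amplified Wheatstone-bridge output (V) to pressure (cmH2O).

    sensitivity is a hypothetical calibration constant in (V/V)/cmH2O; a
    real value comes from calibrating the sensor over -100..+100 cmH2O.
    """
    return v_out / (v_exc * sensitivity)

def tidaling_amplitude(pressure, window=100):
    """Largest peak-to-peak pressure over consecutive sample windows."""
    p = np.asarray(pressure)
    return max(np.ptp(p[i:i + window])
               for i in range(0, len(p) - window + 1, window))

# Simulated amplified bridge voltages: an active air leak (~10 cmH2O
# amplitude) versus a recovered, sealed lung (~0.12 cmH2O amplitude).
t = np.linspace(0.0, 10.0, 1000)
v_leak = 10.0 * np.sin(2 * np.pi * 0.3 * t) * 5.0 * 0.002
v_sealed = 0.12 * np.sin(2 * np.pi * 0.3 * t) * 5.0 * 0.002
print(tidaling_amplitude(bridge_to_pressure(v_leak)),
      tidaling_amplitude(bridge_to_pressure(v_sealed)))
```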

  8. The new version 2.12 of BKG Ntrip Client (BNC)

    NASA Astrophysics Data System (ADS)

    Stürze, Andrea; Mervart, Leos; Weber, Georg; Rülke, Axel; Wiesensarter, Erwin; Neumaier, Peter

    2016-04-01

    A new version of the BKG Ntrip Client (BNC) has been released. Originally developed in cooperation between the Federal Agency for Cartography and Geodesy (BKG) and the Czech Technical University (CTU) with a focus on multi-stream real-time access to GPS observations, the software has once again been substantially extended. Promoting Open Standards as recommended by the Radio Technical Commission for Maritime Services (RTCM) remains the prime subject. Beside its Graphical User Interface (GUI), the real-time software for Windows, Linux, and Mac platforms now comes with a complete Command Line Interface (CLI) and considerable post-processing functionality. RINEX Version 3 file editing & Quality Check (QC) with full support of Galileo, BeiDou, and SBAS - besides GPS and GLONASS - is part of the new features. Comparison of satellite orbit/clock files in SP3 format is another fresh ability of BNC. Simultaneous multi-station Precise Point Positioning (PPP) for real-time displacement-monitoring of entire reference station networks is one more recent addition to BNC. Implemented RTCM messages for PPP (under development) comprise satellite orbit and clock corrections, code and phase observation biases, and the Vertical Total Electron Content (VTEC) of the ionosphere. The well-established, mature codebase is mostly written in the C++ language. Its publication under the GNU GPL is thought to be well-suited for test, validation and demonstration of new approaches in precise real-time satellite navigation when IP streaming is involved. The poster highlights BNC features which are new in version 2.12 and beneficial to IAG institutions and services such as IGS/RT-IGS and to the interested public in general.

  9. A chest drainage system with a real-time pressure monitoring device.

    PubMed

    Chen, Chih-Hao; Liu, Tsang-Pai; Chang, Ho; Huang, Tung-Sung; Liu, Hung-Chang; Chen, Chao-Hung

    2015-07-01

    Tube thoracostomy is a common procedure. A chest bottle may be used to both collect fluids and monitor the recovery of the chest condition. The presence of the "tidaling phenomenon" in the bottle can be reflective of the extent of the patient's recovery. However, current practice essentially depends on gross observation of the bottle. The device used here is designed for real-time monitoring of changes in pleural pressure, to allow clinicians to objectively determine when the lung has recovered, which is crucially important in order to judge when to remove the chest tube. The device is made of a pressure sensor with an operating range between -100 and +100 cmH2O and an amplifier based on the "Wheatstone bridge" concept. Recording and analysis were performed with LabVIEW software. The data can be shown in real time on screen and also checked retrospectively. The device was connected to the second part of a three-bottle drain system by a three-way connector. The test animals were two 40-kg pigs. We used a thoracoscopic procedure to create an artificial lung laceration with endoscopic scissors. Active air leaks could result in a vigorous tidaling phenomenon of up to 20 cmH2O. In the absence of a gross tidaling phenomenon, the pressure changes were around 0.25 cmH2O. This real-time pleural pressure monitoring device can help clinicians objectively judge the extent of recovery of the chest condition. It can be used as an effective adjunct to the current chest drain system.

  10. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
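
    The core idea — fit a regression model to each check-standard run, then track its coefficients on a control chart — can be sketched in a few lines. The simulated slopes and the plain Shewhart 3-sigma limits below are illustrative stand-ins for the facility's actual models and charting rules.

```python
import numpy as np

def control_limits(coefs):
    """Shewhart-style limits (mean +/- 3 sigma) for a tracked coefficient."""
    c = np.asarray(coefs)
    s = c.std(ddof=1)
    return c.mean(), c.mean() - 3 * s, c.mean() + 3 * s

rng = np.random.default_rng(1)
# Hypothetical check-standard runs: each run yields a fitted force-coefficient
# slope; here 30 runs are simulated around a nominal value of 0.05.
slopes = 0.05 + rng.normal(0, 0.002, size=30)
center, lcl, ucl = control_limits(slopes)
out_of_control = [i for i, s in enumerate(slopes) if not lcl <= s <= ucl]
print(f"center={center:.4f}  LCL={lcl:.4f}  UCL={ucl:.4f}  flags={out_of_control}")
```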

  11. High-resolution urban observation network for user-specific meteorological information service in the Seoul Metropolitan Area, South Korea

    NASA Astrophysics Data System (ADS)

    Park, Moon-Soo; Park, Sung-Hwa; Chae, Jung-Hoon; Choi, Min-Hyeok; Song, Yunyoung; Kang, Minsoo; Roh, Joon-Woo

    2017-04-01

    To improve our knowledge of urban meteorology, including those processes applicable to high-resolution meteorological models in the Seoul Metropolitan Area (SMA), the Weather Information Service Engine (WISE) Urban Meteorological Observation System (UMS-Seoul) has been designed and installed. The UMS-Seoul incorporates 14 surface energy balance (EB) systems, 7 surface-based three-dimensional (3-D) meteorological observation systems and applied meteorological (AP) observation systems, and the existing surface-based meteorological observation network. The EB system consists of a radiation balance system, sonic anemometers, infrared CO2/H2O gas analyzers, and many sensors measuring wind speed and direction, temperature and humidity, precipitation, and air pressure. The EB-produced radiation, meteorological, and turbulence data will be used to quantify the surface EB according to land use and to improve the boundary-layer and surface processes in meteorological models. The 3-D system, composed of a wind lidar, microwave radiometer, aerosol lidar, or ceilometer, produces the cloud height, vertical profiles of backscatter by aerosols, wind speed and direction, temperature, humidity, and liquid water content. It will be used for high-resolution reanalysis data based on observations and for the improvement of the boundary-layer, radiation, and microphysics processes in meteorological models. The AP system includes road weather information, mosquito activity, water quality, and agrometeorological observation instruments. The standardized metadata for networks and stations are documented and renewed periodically to provide a detailed observation environment. The UMS-Seoul data are designed to support real-time acquisition and display, with automatic quality checking within 10 min of observation. After the quality check, data can be distributed to relevant potential users such as researchers and policy makers. Finally, two case studies demonstrate that the observed data have great potential to help understand boundary-layer structures more deeply, improve the performance of high-resolution meteorological models, and provide useful information customized to user demands in the SMA.
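
    The automatic quality checking mentioned above typically combines simple per-sample tests such as physical-range and step checks. The sketch below illustrates that pattern with invented thresholds; the actual UMS-Seoul QC rules are certainly more elaborate.

```python
def qc_flags(obs, lo, hi, max_step):
    """Range and step checks for a stream of observations.

    Returns a list of (index, reason) for samples failing either the
    physical-range check [lo, hi] or the sample-to-sample step check.
    The thresholds are assumptions; real limits depend on the sensor.
    """
    flags = []
    for i, v in enumerate(obs):
        if not lo <= v <= hi:
            flags.append((i, "range"))
        elif i > 0 and abs(v - obs[i - 1]) > max_step:
            flags.append((i, "step"))
    return flags

# Air temperatures in deg C with one impossible spike and one sudden jump.
temps = [18.2, 18.3, 45.0, 18.4, 18.4, 25.9]
print(qc_flags(temps, lo=-30.0, hi=40.0, max_step=3.0))
```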

  12. Combining Static Analysis and Model Checking for Software Analysis

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial order information is safe and the whole state space is explored.
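
    The iteration described above amounts to alternating the two analyses until the alias information stops changing. The toy loop below captures that control flow only; static_analysis and model_check here are stand-ins operating on dummy sets, not the real analyzers.

```python
def combined_analysis(program):
    """Alternate static analysis and model checking to a fixed point.

    static_analysis and model_check are toy stand-ins: the first returns
    optimistic partial-order pairs minus known aliases, the second
    "discovers" alias pairs while exploring the reduced state space.
    """
    aliases = set()
    while True:
        partial_order = static_analysis(program, aliases)
        new_aliases = aliases | model_check(program, partial_order)
        if new_aliases == aliases:        # fixed point: reduction now safe
            return partial_order
        aliases = new_aliases             # feed refined info back

def static_analysis(program, aliases):
    # Optimistic partial-order information, refined as alias facts arrive.
    return {(a, b) for a in program for b in program if a != b} - aliases

def model_check(program, partial_order):
    # Pretend state-space exploration reveals some pairs as aliased.
    return {p for p in partial_order if p[0] > p[1]}

print(combined_analysis({"f", "g", "h"}))
```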

  13. Influence of Gsd for 3d City Modeling and Visualization from Aerial Imagery

    NASA Astrophysics Data System (ADS)

    Alrajhi, Muhamad; Alam, Zafare; Afroz Khan, Mohammad; Alobeid, Abdalla

    2016-06-01

    The Ministry of Municipal and Rural Affairs (MOMRA) aims to establish the solid infrastructure required for 3D city modelling, for decision making to set a mark in urban development. MOMRA is responsible for large scale mapping at the 1:1,000, 1:2,500, 1:10,000 and 1:20,000 scales, with 10cm, 20cm and 40cm GSD, with Aerial Triangulation data. 3D city models are increasingly used for the presentation, exploration, and evaluation of urban and architectural designs. The visualization capabilities and animation support of upcoming 3D geo-information technologies empower architects, urban planners, and authorities to visualize and analyze urban and architectural designs in the context of the existing situation. To make use of this possibility, first of all a 3D city model has to be created, for which MOMRA uses the Aerial Triangulation data and aerial imagery. The main challenge for 3D city modelling in the Kingdom of Saudi Arabia arises from its uneven surface and undulations. Real-time 3D visualization and interactive exploration therefore support planning processes by providing multiple stakeholders such as decision makers, architects, urban planners, authorities, citizens or investors with a three-dimensional model. Apart from advanced visualization, these 3D city models can be helpful for dealing with natural hazards and provide various possibilities to deal with exotic conditions through better and more advanced viewing technological infrastructure. Riyadh on one side is 5700m above sea level and on the other hand Abha city is 2300m; this uneven terrain represents a drastic change of surface in the Kingdom, for which 3D city models provide valuable solutions with all possible opportunities. In this research paper, the influence of different GSD (Ground Sample Distance) aerial imagery with Aerial Triangulation is used for 3D visualization in different regions of the Kingdom, to check which scale is more suitable for obtaining better results at manageable cost, with GSDs of 7.5cm, 10cm, 20cm and 40cm. The comparison test is carried out in the Bentley environment to check the best possible results obtained through operating different batch processes.

  14. Real-time dynamic simulation of the Cassini spacecraft using DARTS. Part 2: Parallel/vectorized real-time implementation

    NASA Technical Reports Server (NTRS)

    Fijany, A.; Roberts, J. A.; Jain, A.; Man, G. K.

    1993-01-01

    Part 1 of this paper presented the requirements for the real-time simulation of the Cassini spacecraft along with some discussion of the DARTS algorithm. Here, in Part 2, we discuss the development and implementation of the parallel/vectorized DARTS algorithm and architecture for real-time simulation. Development of fast algorithms and architectures for real-time hardware-in-the-loop simulation of spacecraft dynamics is motivated by the fact that it represents a hard real-time problem, in the sense that the correctness of the simulation depends on both the numerical accuracy and the exact timing of the computation. For a given model fidelity, the computation must be completed within a predefined time period. Further reduction in computation time allows increasing the fidelity of the model (i.e., inclusion of more flexible modes) and of the integration routine.
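
    The hard real-time requirement — each fixed time-step must finish within its budget — can be expressed as a per-step deadline check around the marching solution. The sketch below uses a first-order lag as a stand-in for the dynamics; the step size and time budget are arbitrary assumptions.

```python
import time

def simulate(f, x0, dt, steps, deadline):
    """Fixed-step (explicit Euler) time marching with a hard real-time check.

    deadline is the wall-clock budget per step in seconds; exceeding it is
    treated as a timing fault, mirroring the requirement that correctness
    depends on both numerical accuracy and the exact timing of computation.
    """
    x = x0
    for _ in range(steps):
        t0 = time.perf_counter()
        x = x + dt * f(x)                     # one integration step
        if time.perf_counter() - t0 > deadline:
            raise RuntimeError("real-time deadline missed")
    return x

# Toy first-order lag x' = (u - x) / tau standing in for plant dynamics.
tau, u = 0.5, 1.0
print(simulate(lambda x: (u - x) / tau, x0=0.0, dt=0.001, steps=5000,
               deadline=0.001))
```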

  15. Real-time visual simulation of APT system based on RTW and Vega

    NASA Astrophysics Data System (ADS)

    Xiong, Shuai; Fu, Chengyu; Tang, Tao

    2012-10-01

    The Matlab/Simulink simulation model of an APT (acquisition, pointing and tracking) system is analyzed and established. The model's C code, which can be used for real-time simulation, is then generated by RTW (Real-Time Workshop). Practical experiments show that running the C code produces the same simulation results as running the Simulink model directly in the Matlab environment. MultiGen-Vega is a real-time 3D scene simulation software system. With it and OpenGL, the APT scene simulation platform is developed and used to render and display the virtual scenes of the APT system. To add some necessary graphics effects to the virtual scenes in real time, GLSL (OpenGL Shading Language) shaders are used based on a programmable GPU. By calling the C code, the scene simulation platform can adjust the system parameters on-line and obtain the APT system's real-time simulation data to drive the scenes. Practical application shows that this visual simulation platform has high efficiency, low cost and good simulation effect.

  16. The relationship of point-of-sale tobacco advertising and neighborhood characteristics to underage sales of tobacco.

    PubMed

    Widome, Rachel; Brock, Betsy; Noble, Petra; Forster, Jean L

    2012-09-01

    Our objective was to determine how point-of-sale tobacco marketing may relate to sales to minors. The authors used data from a 2007 cross-sectional study of the retail tobacco marketing environments in the St. Paul, MN metropolitan area, matched with a database of age-of-sale compliance checks (random, covert test purchases by a minor, coordinated by law enforcement) of tobacco retailers and U.S. Census data, to test whether certain characteristics of advertising or neighborhoods were associated with compliance check failure. The authors found that tobacco stores were the most likely type of store to fail compliance checks (44% failure), while supermarkets were least likely (3%). Aside from a marginally significant association with Hispanic population proportion, there was no other association between either store advertising characteristics or neighborhood demographics and stores' compliance check failure. Though our findings were null, the relationship between advertising and real youth sales may be more nuanced, as compliance checks do not perfectly simulate the way youth attempt to purchase cigarettes.

  17. Software Design for Real-Time Systems on Parallel Computers: Formal Specifications.

    DTIC Science & Technology

    1996-04-01

    This research investigated the important issues related to the analysis and design of real-time systems targeted to parallel architectures. In particular, the software specification models for real-time systems on parallel architectures were evaluated. A survey of current formal methods for uniprocessor real-time systems specifications was conducted to determine their extensibility in specifying real-time systems on parallel architectures.

  18. Hot-bench simulation of the active flexible wing wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Buttrill, Carey S.; Houck, Jacob A.

    1990-01-01

    Two simulations, one batch and one real-time, of an aeroelastically-scaled wind-tunnel model were developed. The wind-tunnel model was a full-span, free-to-roll model of an advanced fighter concept. The batch simulation was used to generate and verify the real-time simulation and to test candidate control laws prior to implementation. The real-time simulation supported hot-bench testing of a digital controller, which was developed to actively control the elastic deformation of the wind-tunnel model. Time scaling was required for hot-bench testing. The wind-tunnel model, the mathematical models for the simulations, the techniques employed to reduce the hot-bench time-scale factors, and the verification procedures are described.

  19. Nowcast model for hazardous material spill prevention and response, San Francisco Bay, California

    USGS Publications Warehouse

    Cheng, Ralph T.; Wilmot, Wayne L.; Galt, Jerry A.

    1997-01-01

    The National Oceanic and Atmospheric Administration (NOAA) installed the Physical Oceanographic Real-time System (PORTS) in San Francisco Bay, California, to provide real-time observations of tides, tidal currents, and meteorological conditions to, among other purposes, guide hazardous material spill prevention and response. Integrated with nowcast modeling techniques and with the dissemination of real-time data and nowcasting results through the Internet on the World Wide Web, the emerging technologies used in PORTS for real-time data collection form a nowcast modeling system. Users can download tides and tidal current distributions in San Francisco Bay for their specific applications and/or for further analysis.

  20. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. First, this entails translating the collaborative process model from BPMN into the UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise the interoperability requirements into properties in the dedicated UPPAAL language, i.e. the temporal logic TCTL.
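
    A sketch of the translation idea: each BPMN task becomes a small timed automaton, and the interoperability requirements become TCTL queries over the resulting network. The three-location pattern, names and clock bounds below are illustrative assumptions, not the paper's actual mapping rules.

```python
from dataclasses import dataclass, field

@dataclass
class TimedAutomaton:
    """Minimal UPPAAL-style automaton: locations plus guarded edges."""
    name: str
    locations: list = field(default_factory=list)
    edges: list = field(default_factory=list)   # (src, guard/action, dst)

def bpmn_task_to_automaton(task_name, max_duration):
    """Map one BPMN task to a three-location timed automaton.

    The Idle -> Busy -> Done pattern with a clock bound is a common
    translation scheme; a full mapping also covers gateways, message
    flows, and the like.
    """
    a = TimedAutomaton(task_name, ["Idle", "Busy", "Done"])
    a.edges = [("Idle", "start? x := 0", "Busy"),
               ("Busy", f"x <= {max_duration}", "Done")]
    return a

net = [bpmn_task_to_automaton(t, d) for t, d in [("Order", 5), ("Ship", 10)]]
# Requirements then become TCTL queries over the network, e.g.
# "A[] not deadlock" or "E<> Ship.Done and x <= 15" (UPPAAL query syntax).
for a in net:
    print(a.name, a.edges)
```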

  1. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... in different states or check processing regions)]. If you make the deposit in person to one of our...] Substitute Checks and Your Rights What Is a Substitute Check? To make check processing faster, federal law...

  2. Stability Calculation Method of Slope Reinforced by Prestressed Anchor in Process of Excavation

    PubMed Central

    Li, Zhong; Wei, Jia; Yang, Jun

    2014-01-01

    This paper takes into consideration the effect of the supporting structure and anchors on slope stability during the excavation process; a stability calculation model is presented for a slope reinforced by prestressed anchors and a grillage beam, and a dynamic search model for the critical slip surface is also put forward. A calculation model for the optimal stability solution of each anchor tension over the whole process is also given, through which real-time analysis and checking of slope stability during excavation can be realized. The calculation examples indicate that slope stability changes with the dynamic change of the design parameters of the anchors and grillage beam, so it is more accurate and reasonable to use the dynamic search model to determine the critical slip surface of a slope reinforced by prestressed anchors and a grillage beam. Through the relationships between each anchor layout and the slope height at the various stages of excavation, the optimal stability solution for the prestressed bolt tension design value in the various excavation stages can be obtained. The resulting arrangement of prestressed anchor forces reflects that the layout of the lower part of the bolts and the calculation of slope reinforcement are in line with actual practice. These results indicate that the method is reasonable and practical. PMID:24683319
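
    The dynamic search for the critical slip surface can be pictured as minimizing the factor of safety over candidate surfaces, re-run as excavation proceeds and anchor tensions change. The sketch below uses a deliberately crude planar-wedge factor of safety with invented soil parameters; the paper's coupled anchor/grillage-beam model is far more detailed.

```python
import math

def factor_of_safety(theta, anchor_force, c=20.0, phi=30.0, gamma=18.0, H=10.0):
    """Very crude planar-slip factor of safety (kN, m, degrees).

    Resisting terms: cohesion along the slip plane, friction, and the
    slope-parallel component of the anchor force; driving term: the
    weight component along the plane. A toy stand-in only.
    """
    L = H / math.sin(theta)                     # slip plane length
    W = 0.5 * gamma * H * H / math.tan(theta)   # weight of sliding wedge
    driving = W * math.sin(theta)
    resisting = (c * L
                 + W * math.cos(theta) * math.tan(math.radians(phi))
                 + anchor_force * math.cos(theta))
    return resisting / driving

# Dynamic search: the critical surface is the inclination minimizing the
# factor of safety, re-evaluated per excavation stage / anchor tension.
angles = [math.radians(a) for a in range(25, 70)]
crit = min(angles, key=lambda th: factor_of_safety(th, anchor_force=150.0))
print(math.degrees(crit), factor_of_safety(crit, anchor_force=150.0))
```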

  3. Stability calculation method of slope reinforced by prestressed anchor in process of excavation.

    PubMed

    Li, Zhong; Wei, Jia; Yang, Jun

    2014-01-01

    This paper takes into consideration the effect of the supporting structure and anchors on slope stability during the excavation process; a stability calculation model is presented for a slope reinforced by prestressed anchors and a grillage beam, and a dynamic search model for the critical slip surface is also put forward. A calculation model for the optimal stability solution of each anchor tension over the whole process is also given, through which real-time analysis and checking of slope stability during excavation can be realized. The calculation examples indicate that slope stability changes with the dynamic change of the design parameters of the anchors and grillage beam, so it is more accurate and reasonable to use the dynamic search model to determine the critical slip surface of a slope reinforced by prestressed anchors and a grillage beam. Through the relationships between each anchor layout and the slope height at the various stages of excavation, the optimal stability solution for the prestressed bolt tension design value in the various excavation stages can be obtained. The resulting arrangement of prestressed anchor forces reflects that the layout of the lower part of the bolts and the calculation of slope reinforcement are in line with actual practice. These results indicate that the method is reasonable and practical.

  4. Building flexible real-time systems using the Flex language

    NASA Technical Reports Server (NTRS)

    Kenny, Kevin B.; Lin, Kwei-Jay

    1991-01-01

    The design and implementation of a real-time programming language called Flex, which is a derivative of C++, are presented. It is shown how different types of timing requirements might be expressed and enforced in Flex, how they might be fulfilled in a flexible way using different program models, and how the programming environment can help in making binding and scheduling decisions. The timing constraint primitives in Flex are easy to use yet powerful enough to define both independent and relative timing constraints. Program models like imprecise computation and performance polymorphism support flexible real-time programs. In addition, programmers can use a performance measurement tool that produces statistically correct timing models to predict the expected execution time of a program and to help make binding decisions. A real-time programming environment is also presented.
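
    Flex syntax is not reproduced here, but the flavor of a relative timing constraint can be imitated in another language. The sketch below merely checks a block's duration against a deadline after the fact; actual Flex constraints are enforced by the language runtime and can, for example, select between program versions (imprecise computation).

```python
import time
from contextlib import contextmanager

@contextmanager
def within(deadline_s):
    """Loose analogue of a relative timing constraint on a block of work.

    The block runs, then its duration is compared against the deadline;
    a violation raises. This only detects misses after the fact, unlike
    a real-time language that schedules to meet them.
    """
    start = time.perf_counter()
    yield
    elapsed = time.perf_counter() - start
    if elapsed > deadline_s:
        raise TimeoutError(f"constraint violated: {elapsed:.4f}s > {deadline_s}s")

with within(0.05):                 # this task must finish within 50 ms
    sum(i * i for i in range(10000))
print("timing constraint met")
```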

  5. Porting and refurbishment of the WSS TNG control software

    NASA Astrophysics Data System (ADS)

    Caproni, Alessandro; Zacchei, Andrea; Vuerli, Claudio; Pucillo, Mauro

    2004-09-01

    The Workstation Software System (WSS) is the high-level control software of the Italian Galileo Galilei Telescope, located on La Palma in the Canary Islands, developed at the beginning of the '90s for HP-UX workstations. WSS may be seen as a middle-layer software system that manages the communications between the real-time systems (VME), different workstations and high-level applications, providing a uniform distributed environment. The project to port the control software from the HP workstations to the Linux environment started at the end of 2001. It aimed to refurbish the control software by introducing some of the new software technologies and languages available for free in the Linux operating system. The project was realized by gradually substituting each HP workstation with a Linux PC, with the goal of avoiding major changes to the original software running under HP-UX. Three main phases characterized the project: creation of a simulated control room with several Linux PCs running WSS (to check all the functionality); insertion into the simulated control room of some HPs (to check the mixed environment); and substitution of the HP workstations in the real control room. From a software point of view, the project introduces some new technologies, like multi-threading, and the possibility to develop high-level WSS applications in almost every programming language that implements Berkeley sockets. A library to develop Java applications has also been created and tested.

  6. Marketers approve check-off plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mantho, M.

    1996-02-01

    For the last several years, we have ended the year with a questionnaire looking into the future. Our panel was asked what was the greatest threat to our industry. Consistently, they have answered "the lack of real growth." This year was no exception, but this time there was a feeling that something could and would be done. Almost every marketer (98%) said he or she supported the check-off program that would raise considerable funds for public relations and research. This idea isn't new; it has been discussed year after year. Always the stumbling block was how to raise funds so that each company contributed and no one had a free ride. Last year, it was proposed that a commission, created by federal law, be established to oversee funds for the education of consumers and government and to provide technology research. If Congress authorizes this plan and members of the industry endorse it, we would have a strong program in place by 1997.

  7. V-Alert: Description and Validation of a Vulnerable Road User Alert System in the Framework of a Smart City.

    PubMed

    Hernandez-Jayo, Unai; De-la-Iglesia, Idoia; Perez, Jagoba

    2015-07-29

    V-Alert is a cooperative application to be deployed in the framework of Smart Cities with the aim of reducing the probability of accidents involving Vulnerable Road Users (VRU) and vehicles. The architecture of V-Alert combines short- and long-range communication technologies in order to give drivers and VRUs more time to take the appropriate maneuver and avoid a possible collision. The information generated by mobile sensors (vehicles and cyclists) is sent over this heterogeneous communication architecture and processed in a central server, the Drivers Cloud, which is in charge of generating the messages that are shown on the drivers' and cyclists' Human Machine Interface (HMI). V-Alert was first tested in a simulated scenario to check the communications architecture in a complex setting and, once validated, all the elements of V-Alert were moved to a real scenario to check the application's reliability. All results are presented throughout this paper.

  8. On Real-Time Systems Using Local Area Networks.

    DTIC Science & Technology

    1987-07-01

    87-35, July 1987, CS-TR-1892. On Real-Time Systems Using Local Area Networks. Shem-Tov Levi and Satish K. Tripathi, Department of Computer Science. ... constraints and the clock systems that feed the time to real-time systems. A model for real-time systems based on LAN communication is presented.

  9. Bayesian model selection applied to artificial neural networks used for water resources modeling

    NASA Astrophysics Data System (ADS)

    Kingston, Greer B.; Maier, Holger R.; Lambert, Martin F.

    2008-04-01

    Artificial neural networks (ANNs) have proven to be extremely valuable tools in the field of water resources engineering. However, one of the most difficult tasks in developing an ANN is determining the optimum level of complexity required to model a given problem, as there is no formal systematic model selection method. This paper presents a Bayesian model selection (BMS) method for ANNs that provides an objective approach for comparing models of varying complexity in order to select the most appropriate ANN structure. The approach uses Markov Chain Monte Carlo posterior simulations to estimate the evidence in favor of competing models and, in this study, three known methods for doing this are compared in terms of their suitability for being incorporated into the proposed BMS framework for ANNs. However, it is acknowledged that it can be particularly difficult to accurately estimate the evidence of ANN models. Therefore, the proposed BMS approach for ANNs incorporates a further check of the evidence results by inspecting the marginal posterior distributions of the hidden-to-output layer weights, which unambiguously indicate any redundancies in the hidden layer nodes. The fact that this check is available is one of the greatest advantages of the proposed approach over conventional model selection methods, which do not provide such a test and instead rely on the modeler's subjective choice of selection criterion. The advantages of a total Bayesian approach to ANN development, including training and model selection, are demonstrated on two synthetic and one real world water resources case study.
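
    The redundancy check described above — inspecting whether the marginal posterior of a hidden-to-output weight straddles zero — is simple to express given MCMC samples. The posterior draws below are synthetic placeholders for the output of an actual Bayesian training run.

```python
import numpy as np

def redundant_hidden_nodes(weight_samples, level=0.95):
    """Flag hidden nodes whose hidden-to-output weight posterior straddles 0.

    weight_samples: (n_posterior_samples, n_hidden) array of MCMC draws of
    the hidden-to-output weights. A marginal credible interval containing
    zero suggests the node contributes nothing and the ANN is too complex.
    """
    lo = np.percentile(weight_samples, 100 * (1 - level) / 2, axis=0)
    hi = np.percentile(weight_samples, 100 * (1 + level) / 2, axis=0)
    return [j for j in range(weight_samples.shape[1]) if lo[j] <= 0.0 <= hi[j]]

rng = np.random.default_rng(7)
# Hypothetical posterior: nodes 0 and 1 informative, node 2 redundant.
samples = np.column_stack([rng.normal(1.2, 0.1, 4000),
                           rng.normal(-0.8, 0.1, 4000),
                           rng.normal(0.0, 0.3, 4000)])
print("redundant hidden nodes:", redundant_hidden_nodes(samples))
```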

  10. A meshless EFG-based algorithm for 3D deformable modeling of soft tissue in real-time.

    PubMed

    Abdi, Elahe; Farahmand, Farzam; Durali, Mohammad

    2012-01-01

    The meshless element-free Galerkin method was generalized and an algorithm was developed for 3D dynamic modeling of deformable bodies in real time. The efficacy of the algorithm was investigated in a 3D linear viscoelastic model of human spleen subjected to a time-varying compressive force exerted by a surgical grasper. The model remained stable in spite of the considerably large deformations occurred. There was a good agreement between the results and those of an equivalent finite element model. The computational cost, however, was much lower, enabling the proposed algorithm to be effectively used in real-time applications.

  11. [Synchronous playing and acquiring of heart sounds and electrocardiogram based on labVIEW].

    PubMed

    Dan, Chunmei; He, Wei; Zhou, Jing; Que, Xiaosheng

    2008-12-01

    This paper describes a comprehensive system which can acquire heart sounds and electrocardiogram (ECG) in parallel, synchronize their display, and play the heart sounds so that auscultation ties in with checking the phonocardiogram. The hardware system, with a C8051F340 as its core, acquires the heart sounds and ECG synchronously and then sends them to the respective indicators. Heart sounds are displayed and played simultaneously by controlling the moment of writing to the indicator and the sound output device. In clinical testing, heart sounds could be successfully located with the ECG and played in real time.

  12. [Indications for laparoscopy in an internal medicine department in Dakar as indicated by echotomography].

    PubMed

    Aubry, P; Vergne, R; Oddes, B; Delanoue, G; Larregle, B; Seurat, P L

    1984-01-01

    Real-time ultrasonography was set up in a Senegalese hospital, resulting in a decrease in laparoscopy indications. Laparoscopy has been given up for the diagnosis of liver abscess, jaundice and "abdominal masses". It need no longer be included in the first-step check-up for hepatocellular carcinoma, because ultrasonography and cytology after puncture are enough to confirm the diagnosis. Laparoscopy remains essential for peritoneal diseases. Hepatic needle biopsy under laparoscopy control remains necessary to establish with certainty the diagnosis of cirrhosis and especially chronic hepatitis, provided that no contraindications are found.

  13. Enhancements to the EPANET-RTX (Real-Time Analytics) ...

    EPA Pesticide Factsheets

    Technical brief and software. The U.S. Environmental Protection Agency (EPA) developed EPANET-RTX as a collection of object-oriented software libraries comprising the core data access, data transformation, and data synthesis (real-time analytics) components of a real-time hydraulic and water quality modeling system. While EPANET-RTX uses the hydraulic and water quality solvers of EPANET, the object libraries are a self-contained set of building blocks for software developers. “Real-time EPANET” promises to change the way water utilities, commercial vendors, engineers, and the water community think about modeling.

  14. A silver lining? The connection between gasoline prices and obesity.

    PubMed

    Courtemanche, Charles

    2011-01-01

    I find evidence of a negative association between gasoline prices and body weight using a fixed effects model with several robustness checks. I also show that increases in gas prices are associated with additional walking and a reduction in the frequency with which people eat at restaurants, explaining their effect on weight. My estimates imply that 8% of the rise in obesity between 1979 and 2004 can be attributed to the concurrent drop in real gas prices, and that a permanent $1 increase in gasoline prices would reduce overweight and obesity in the United States by 7% and 10%.

  15. Utilization of DIRSIG in support of real-time infrared scene generation

    NASA Astrophysics Data System (ADS)

    Sanders, Jeffrey S.; Brown, Scott D.

    2000-07-01

    Real-time infrared scene generation for hardware-in-the-loop has been a traditionally difficult challenge. Infrared scenes are usually generated using commercial hardware that was not designed to properly handle the thermal and environmental physics involved. Real-time infrared scenes typically lack details that are included in scenes rendered in non-real-time by ray-tracing programs such as the Digital Imaging and Remote Sensing Scene Generation (DIRSIG) program. However, executing DIRSIG in real time while retaining all the physics is beyond current computational capabilities for many applications. DIRSIG is a first-principles-based synthetic image generation model that produces multi- or hyper-spectral images in the 0.3 to 20 micron region of the electromagnetic spectrum. The DIRSIG model is an integrated collection of independent first-principles-based sub-models, each of which works in conjunction with the others to produce radiance field images with high radiometric fidelity. DIRSIG uses the MODTRAN radiation propagation model for exo-atmospheric irradiance, emitted and scattered radiances (upwelled and downwelled) and path transmission predictions. This radiometry sub-model utilizes bidirectional reflectance data, accounts for specular and diffuse background contributions, and features path-length-dependent extinction and emission for transmissive bodies (plumes, clouds, etc.) which may be present in any target, background or solar path. This detailed environmental modeling greatly enhances the number of rendered features and hence the fidelity of a rendered scene. While DIRSIG itself cannot currently be executed in real time, its outputs can be used to provide scene inputs for real-time scene generators. These inputs can incorporate significant features such as target-to-background thermal interactions, static background object thermal shadowing, and partially transmissive countermeasures. All of these features represent significant improvements over the current state of the art in real-time IR scene generation.

  16. A remote patient monitoring system using a Java-enabled 3G mobile phone.

    PubMed

    Zhang, Pu; Kogure, Yuichi; Matsuoka, Hiroki; Akutagawa, Masatake; Kinouchi, Yohsuke; Zhang, Qinyu

    2007-01-01

    Telemedicine systems have become an important support for medical staff. With the development of mobile phones, it has become possible for mobile phones to form part of telemedicine systems. We developed an innovative remote patient monitoring system using a Java-enabled 3G mobile phone. With this system, doctors can monitor the vital biosignals of patients in the ICU/CCU, such as ECG, RESP, SpO2, EtCO2 and so on, using the real-time waveform and data monitoring and list-trend data monitoring functions of the Java jiglet application installed on the mobile phone. Furthermore, doctors can check patients' information by using the patient information checking function. The 3G mobile phone used has the ability to run the application at the same time as being used to make a voice call. Therefore, the doctor can obtain more information, both from browsing the screen of the mobile phone and from communicating with the medical staff who are beside the patients and the monitors. The system can be evaluated for the diagnostic accuracy, efficiency, and safety of telediagnosis.

  17. The GFZ real-time GNSS precise positioning service system and its adaption for COMPASS

    NASA Astrophysics Data System (ADS)

    Li, Xingxing; Ge, Maorong; Zhang, Hongping; Nischan, Thomas; Wickert, Jens

    2013-03-01

    Motivated by the IGS real-time Pilot Project, GFZ has been developing its own real-time precise positioning service for various applications. An operational system at GFZ is now broadcasting real-time orbits, clocks, a global ionospheric model, uncalibrated phase delays and regional atmospheric corrections for standard PPP, PPP with ambiguity fixing, single-frequency PPP and regionally augmented PPP. To avoid developing separate algorithms for different applications, we proposed a uniform algorithm and implemented it in our real-time software. In the new processing scheme, we employ un-differenced raw observations with atmospheric delays as parameters, which are properly constrained by the real-time derived global ionospheric model or regional atmospheric corrections and by the empirical characteristics of the atmospheric delay variation in time and space. The positioning performance in terms of convergence time and ambiguity fixing depends mainly on the quality of the received atmospheric information and the spatial and temporal constraints. The un-differenced raw observation model can not only integrate PPP and NRTK into a seamless positioning service, but also merge these two techniques into a unique model and algorithm. Furthermore, it is suitable for both dual-frequency and single-frequency receivers. Based on the real-time data streams from the IGS, EUREF and SAPOS reference networks, we can provide services of global precise point positioning (PPP) with 5-10 cm accuracy, PPP with ambiguity fixing of 2-5 cm accuracy, PPP using single-frequency receivers with accuracy better than 50 cm, and PPP with regional augmentation for instantaneous ambiguity resolution with 1-3 cm accuracy. We adapted the system for the current COMPASS constellation to provide a PPP service. COMPASS observations from a regional network of nine stations are used for precise orbit determination and clock estimation in simulated real-time mode, and the orbit and clock products are applied for real-time precise point positioning. The simulated real-time PPP service confirms that real-time positioning services with accuracy at the dm-level and even cm-level are achievable with COMPASS only.

  18. Take the Reins on Model Quality with ModelCHECK and Gatekeeper

    NASA Technical Reports Server (NTRS)

    Jones, Corey

    2012-01-01

    Model quality and consistency has been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.

  19. Global Real-Time Ocean Forecast System

    Science.gov Websites

    Marine Modeling and Analysis Branch. As of 17 Oct 2017 at 0Z, the Global RTOFS model has been upgraded to version 1.1.2. The global operational Real-Time Ocean Forecast System (Global RTOFS) runs at the National Centers for Environmental Prediction (NCEP).

  20. Strategies for Near Real Time Estimates of Precipitable Water Vapor from GPS Ground Receivers

    NASA Technical Reports Server (NTRS)

    Bar-Sever, Y.; Runge, T.; Kroger, P.

    1995-01-01

    GPS-based estimates of precipitable water vapor (PWV) may be useful in numerical weather models to improve short-term weather predictions. To be effective in numerical weather prediction models, GPS PWV estimates must be produced with sufficient accuracy in near real time. Several estimation strategies for the near real time processing of GPS data are investigated.

  1. Real-Time Tropospheric Product Establishment and Accuracy Assessment in China

    NASA Astrophysics Data System (ADS)

    Chen, M.; Guo, J.; Wu, J.; Song, W.; Zhang, D.

    2018-04-01

    Tropospheric delay has always been an important issue in Global Navigation Satellite System (GNSS) processing. Empirical tropospheric delay models struggle to capture complex and volatile atmospheric environments, resulting in poor accuracy of the empirical model and difficulty in meeting precise positioning demands. In recent years, some scholars have proposed establishing real-time tropospheric products by using real-time or near-real-time GNSS observations in a small region, and have achieved some good results. This paper uses real-time observing data of 210 Chinese national GNSS reference stations to estimate the tropospheric delay, and establishes a ZWD grid model country-wide. In order to analyze the influence of the tropospheric grid product on wide-area real-time PPP, this paper compares the method of taking the ZWD grid product as a constraint with the model correction method. The results show that the ZWD grid product estimated from the national reference stations can improve PPP accuracy and convergence speed. The accuracy in the north (N), east (E) and up (U) directions increases by 31.8%, 15.6% and 38.3%, respectively. As with the convergence speed, the U direction experiences the most improvement.

  2. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    NASA Technical Reports Server (NTRS)

    Gabrielian, Armen

    1991-01-01

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to formally model a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.

  3. Real-time simulation of a Doubly-Fed Induction Generator based wind power system on eMEGASimRTM Real-Time Digital Simulator

    NASA Astrophysics Data System (ADS)

    Boakye-Boateng, Nasir Abdulai

    The growing demand for wind power integration into the generation mix prompts the need to subject these systems to stringent performance requirements. This study sought to identify the tools and procedures needed to perform real-time simulation studies of Doubly-Fed Induction Generator (DFIG) based wind generation systems, as a basis for performing more practical tests of reliability and performance for both grid-connected and islanded wind generation systems. The author focused on developing a platform for wind generation studies and, in addition, tested the performance of two DFIG models on the platform's real-time simulation model: an average SimpowerSystemsRTM DFIG wind turbine, and a detailed DFIG based wind turbine using ARTEMiSRTM components. The platform model implemented here consists of a high voltage transmission system with four integrated wind farm models consisting in total of 65 DFIG based wind turbines, and it was developed and tested on OPAL-RT's eMEGASimRTM Real-Time Digital Simulator.

  4. Real-Time Parameter Estimation Using Output Error

    NASA Technical Reports Server (NTRS)

    Grauer, Jared A.

    2014-01-01

    Output-error parameter estimation, normally a post-flight batch technique, was applied to real-time dynamic modeling problems. Variations on the traditional algorithm were investigated with the goal of making the method suitable for operation in real time. Implementation recommendations are given that are dependent on the modeling problem of interest. Application to flight test data showed that accurate parameter estimates and uncertainties for the short-period dynamics model were available every 2 s using time domain data, or every 3 s using frequency domain data. The data compatibility problem was also solved in real time, providing corrected sensor measurements every 4 s. If uncertainty corrections for colored residuals are omitted, this rate can be increased to every 0.5 s.
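
    As a hedged illustration of real-time parameter estimation — using the classical recursive least-squares recursion as a simple stand-in, rather than the paper's output-error formulation, which iterates a nonlinear model — the sketch below refines two parameters sample by sample. The "true" values and noise level are invented.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=1.0):
    """One recursive-least-squares step: new data (x, y) refines theta.

    theta: current parameter estimate; P: covariance-like matrix;
    lam: forgetting factor (1.0 = no forgetting).
    """
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)              # gain vector
    theta = theta + (k * (y - x.T @ theta)).ravel()
    P = (P - k @ x.T @ P) / lam
    return theta, P

rng = np.random.default_rng(0)
true = np.array([-1.5, 0.8])                      # e.g. stability derivatives
theta, P = np.zeros(2), np.eye(2) * 100.0
for _ in range(500):                              # one update per new sample
    x = rng.normal(size=2)
    y = x @ true + rng.normal(scale=0.05)
    theta, P = rls_update(theta, P, x, y)
print(theta)                                      # converges near `true`
```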

  5. A systematic composite service design modeling method using graph-based theory.

    PubMed

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides the mathematical representation of the components of service-oriented design using graph-based theory to facilitate the design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate the complexity and reusability of ComSDM. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems.

  6. A Systematic Composite Service Design Modeling Method Using Graph-Based Theory

    PubMed Central

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides the mathematical representation of the components of service-oriented design using graph-based theory to facilitate the design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate the complexity and reusability of ComSDM. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems. PMID:25928358

  7. Change Semantic Constrained Online Data Cleaning Method for Real-Time Observational Data Stream

    NASA Astrophysics Data System (ADS)

    Ding, Yulin; Lin, Hui; Li, Rongrong

    2016-06-01

    Recent breakthroughs in sensor networks have made it possible to collect and assemble increasing amounts of real-time observational data by observing dynamic phenomena at previously impossible time and space scales. Real-time observational data streams present potentially profound opportunities for real-time applications in disaster mitigation and emergency response, by providing accurate and timely estimates of the environment's status. However, the data are always subject to inevitable anomalies (including errors and anomalous changes/events) caused by various effects produced by the environment they are monitoring. The "big but dirty" real-time observational data streams can rarely achieve their full potential in subsequent real-time models or applications due to low data quality. Therefore, timely and meaningful online data cleaning is a necessary prerequisite step to ensure the quality, reliability, and timeliness of real-time observational data. In general, a straightforward streaming data cleaning approach is to define various types of models/classifiers representing the normal behavior of sensor data streams and then declare data normal or erroneous according to their deviation from this model. The effectiveness of these models is affected by dynamic changes in the deployed environments. Due to the changing nature of the complicated process being observed, real-time observational data are characterized by diversity and dynamism, showing typical Big (Geo) Data characteristics. Dynamism and diversity are reflected not only in the data values, but also in the complicated changing patterns of the data distributions. This means the pattern of the real-time observational data distribution is not stationary or static but changing and dynamic. After the data pattern changes, it is necessary to adapt the model over time to cope with the changing patterns of real-time data streams. Otherwise, the model will not fit the subsequent observational data streams, which may lead to large estimation errors. In order to achieve the best generalization error, it is an important challenge for the data cleaning methodology to be able to characterize the behavior of data stream distributions and adaptively update the model to include new information and remove old information. However, this complicated changing behavior invalidates traditional data cleaning methods, which rely on the assumption of a stationary data distribution, and drives the need for more dynamic and adaptive online data cleaning methods. To overcome these shortcomings, this paper presents a change-semantics-constrained online filtering method for real-time observational data. Based on the principle that the filter parameter should vary in accordance with the data change patterns, this paper embeds a semantic description, which quantitatively depicts the change patterns in the data distribution, to self-adapt the filter parameter automatically. Real-time observational water level data streams from different precipitation scenarios are selected for testing. Experimental results prove that by means of this method, more accurate and reliable water level information can be made available, which is a prerequisite for prompt and scientific flood assessment and decision-making.
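
    One minimal way to let "change semantics" steer a filter is to distinguish an isolated outlier (suppress it) from a sustained shift in the distribution (re-adapt quickly). The sketch below does exactly that with an exponentially weighted filter on a water-level-like stream; the thresholds and the run-length rule are illustrative assumptions, not the paper's method.

```python
import numpy as np

def adaptive_clean(stream, base_alpha=0.05, z_thresh=4.0):
    """Exponentially weighted filter whose parameter adapts to change.

    Points far from the running estimate (z-score > z_thresh) are flagged
    as anomalies; when several arrive in a row, we treat it as a genuine
    change in the data distribution and raise alpha to re-adapt quickly.
    """
    mean, var, run = stream[0], 1.0, 0
    cleaned, flags = [stream[0]], [False]
    for x in stream[1:]:
        outlier = abs(x - mean) / np.sqrt(var) > z_thresh
        run = run + 1 if outlier else 0
        alpha = 0.5 if run >= 3 else base_alpha   # pattern changed: adapt fast
        use = mean if (outlier and run < 3) else x  # suppress isolated spikes
        mean = (1 - alpha) * mean + alpha * use
        var = (1 - alpha) * var + alpha * (use - mean) ** 2
        cleaned.append(mean)
        flags.append(outlier and run < 3)
    return np.array(cleaned), flags

# Water-level-like stream: a step change at index 50 plus one isolated spike.
level = np.r_[np.full(50, 2.0), np.full(50, 6.0)]
level += np.random.default_rng(2).normal(0, 0.2, 100)
level[20] = 12.0
clean, flags = adaptive_clean(level)
print("flagged as errors:", [i for i, f in enumerate(flags) if f])
```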

  8. Intra-Urban Human Mobility and Activity Transition: Evidence from Social Media Check-In Data

    PubMed Central

    Wu, Lun; Zhi, Ye; Sui, Zhengwei; Liu, Yu

    2014-01-01

    Most existing human mobility literature focuses on exterior characteristics of movements but neglects activities, the driving force that underlies human movements. In this research, we combine activity-based analysis with a movement-based approach to model the intra-urban human mobility observed from about 15 million check-in records during a yearlong period in Shanghai, China. The proposed model is activity-based and includes two parts: the transition of travel demands during a specific time period, and the movement between locations. For the first part, we find that the transition probability between activities varies over time, and we construct a temporal transition probability matrix to represent the transition probability of travel demands during a time interval. For the second part, we suggest that travel demands can be divided into two classes, locationally mandatory activity (LMA) and locationally stochastic activity (LSA), according to whether or not the demand is associated with a fixed location. By judging the combination of predecessor activity type and successor activity type, we determine three trip patterns, each associated with a different decay parameter. To validate the model, we adopt the mechanism of an agent-based model and compare the simulated results with the observed pattern using the displacement distance distribution, the spatio-temporal distribution of activities, and the temporal distribution of travel demand transitions. The results show that the simulated patterns fit the observed data well, indicating that these findings open new directions for combining activity-based analysis with a movement-based approach using social media check-in data. PMID:24824892
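
    A temporal transition probability matrix of the kind described above can be built directly from time-stamped activity sequences. The following minimal Python sketch (with toy check-in tuples; the field layout and the six-hour time bins are illustrative assumptions, not the paper's exact procedure) counts predecessor/successor activity pairs per time-of-day bin and row-normalizes them.

    ```python
    from collections import defaultdict
    import numpy as np

    # toy records: (user_id, hour_of_day, activity), ordered in time per user
    records = [
        ("u1", 8, "home"), ("u1", 9, "work"), ("u1", 12, "food"),
        ("u1", 18, "home"), ("u2", 9, "work"), ("u2", 13, "food"),
        ("u2", 19, "shopping"), ("u2", 22, "home"),
    ]

    activities = sorted({a for _, _, a in records})
    idx = {a: i for i, a in enumerate(activities)}
    n = len(activities)

    # one transition-count matrix per time-of-day bin (four 6-hour bins)
    counts = defaultdict(lambda: np.zeros((n, n)))
    by_user = defaultdict(list)
    for user, hour, act in records:
        by_user[user].append((hour, act))

    for seq in by_user.values():
        for (h1, a1), (h2, a2) in zip(seq, seq[1:]):
            counts[h1 // 6][idx[a1], idx[a2]] += 1

    # row-normalize: P(next activity | current activity, time bin)
    for b, m in sorted(counts.items()):
        rows = m.sum(axis=1, keepdims=True)
        print(f"bin {b}:\n", np.divide(m, rows, out=np.zeros_like(m), where=rows > 0))
    ```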

  9. NASTRAN data generation and management using interactive graphics

    NASA Technical Reports Server (NTRS)

    Smootkatow, M.; Cooper, B. M.

    1972-01-01

    A method of using an interactive graphics device to generate a large portion of the input bulk data with visual checks of the structure and the card images is described. The generation starts from GRID and PBAR cards. The visual checks result from a three-dimensional display of the model in any rotated position. By detailing the steps, the time saving and cost effectiveness of this method may be judged, and its potential as a useful tool for the structural analyst may be established.

  10. Use of Combined Biogeochemical Model Approaches and Empirical Data to Assess Critical Loads of Nitrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenn, Mark E.; Driscoll, Charles; Zhou, Qingtao

    2015-01-01

    Empirical and dynamic biogeochemical modelling are complementary approaches for determining the critical load (CL) of atmospheric nitrogen (N) or other constituent deposition that an ecosystem can tolerate without causing ecological harm. The greatest benefits are obtained when these approaches are used in combination. Confounding environmental factors can complicate the determination of empirical CLs across depositional gradients, while the experimental application of N amendments for estimating the CL does not realistically mimic the effects of chronic atmospheric N deposition. Biogeochemical and vegetation simulation models can provide CL estimates and valuable ecosystem response information, allowing for past and future scenario testing with various combinations of environmental factors, pollutants, pollutant control options, land management, and ecosystem response parameters. Even so, models are fundamentally gross simplifications of the real ecosystems they attempt to simulate. Empirical approaches are vital as a check on simulations and CL estimates, to parameterize models, and to elucidate mechanisms and responses under real-world conditions. In this chapter, we provide examples of empirical and modelled N CL approaches in ecosystems from three regions of the United States: mixed conifer forest, desert scrub and pinyon-juniper woodland in California; alpine catchments in the Rocky Mountains; and lakes in the Adirondack region of New York state.

  11. Method and system to perform energy-extraction based active noise control

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul (Inventor); Joshi, Suresh M. (Inventor)

    2009-01-01

    A method to provide active noise control to reduce noise and vibration in reverberant acoustic enclosures such as aircraft, vehicles, appliances, instruments, industrial equipment and the like is presented. A continuous-time multi-input multi-output (MIMO) state space mathematical model of the plant is obtained via analytical modeling and system identification. Compensation is designed to render the mathematical model passive in the sense of mathematical system theory. The compensated system is checked to ensure robustness of the passive property of the plant. The check ensures that the passivity is preserved if the mathematical model parameters are perturbed from nominal values. A passivity-based controller is designed and verified using numerical simulations and then tested. The controller is designed so that the resulting closed-loop response shows the desired noise reduction.
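
    One common way to check the passivity of a linear time-invariant model numerically is to verify that the Hermitian part of its transfer matrix is positive semidefinite over a frequency grid. The sketch below illustrates that style of check on a hypothetical two-state plant; it is not the patent's procedure, and a grid test is only a necessary-condition screen rather than a proof. Re-running it with perturbed A, B, C, D matrices gives a rough robustness check of the kind the abstract describes.

    ```python
    import numpy as np

    def min_hermitian_eig(A, B, C, D, freqs):
        """Smallest eigenvalue of G(jw) + G(jw)^H over a frequency grid;
        nonnegative values everywhere are consistent with passivity."""
        n = A.shape[0]
        worst = np.inf
        for w in freqs:
            G = C @ np.linalg.solve(1j * w * np.eye(n) - A, B) + D
            H = G + G.conj().T
            worst = min(worst, np.linalg.eigvalsh(H).min())
        return worst

    # hypothetical lightly damped 2-state plant with a rate output,
    # which makes the transfer function positive real
    A = np.array([[0.0, 1.0], [-4.0, -0.2]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[0.0, 1.0]])
    D = np.array([[0.0]])
    print("min eig of G+G^H:", min_hermitian_eig(A, B, C, D, np.logspace(-2, 2, 400)))
    ```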

  12. Near real time determination of the magnetopause and bow shock shape and position

    NASA Astrophysics Data System (ADS)

    Kartalev, M. D.; Keremidarska, V. I.; Grigorov, K. G.; Romanov, D. K.

    2002-03-01

    We present web-based, near-real-time (every 90 minutes) automated runs of our 3D magnetosheath gasdynamic numerical model (http://geospace.nat.bg). The determination of the shape and position of the bow shock and the magnetopause is part of the solution. The model utilizes the realistic semi-empirical Tsyganenko magnetosphere model T96-01 to ensure pressure balance at the magnetopause. In this realization, we use real-time ACE data averaged over a 6-minute interval.

  13. ARTEMIS: Ares Real Time Environments for Modeling, Integration, and Simulation

    NASA Technical Reports Server (NTRS)

    Hughes, Ryan; Walker, David

    2009-01-01

    This slide presentation reviews the use of ARTEMIS in the development and testing of the Ares launch vehicles. The Ares Real Time Environments for Modeling, Integration, and Simulation (ARTEMIS) is the real-time simulation supporting Ares I hardware-in-the-loop (HWIL) testing. ARTEMIS accurately models all Ares/Orion/ground subsystems that interact with Ares avionics components from pre-launch through orbit insertion. The ARTEMIS System Integration Lab and the STIF architecture are reviewed, the functional components of ARTEMIS are outlined, and an overview of the models with a block diagram is presented.

  14. Local influence for generalized linear models with missing covariates.

    PubMed

    Shi, Xiaoyan; Zhu, Hongtu; Ibrahim, Joseph G

    2009-12-01

    In the analysis of missing data, sensitivity analyses are commonly used to check the sensitivity of the parameters of interest with respect to the missing data mechanism and other distributional and modeling assumptions. In this article, we formally develop a general local influence method to carry out sensitivity analyses of minor perturbations to generalized linear models in the presence of missing covariate data. We examine two types of perturbation schemes (the single-case and global perturbation schemes) for perturbing various assumptions in this setting. We show that the metric tensor of a perturbation manifold provides useful information for selecting an appropriate perturbation. We also develop several local influence measures to identify influential points and test model misspecification. Simulation studies are conducted to evaluate our methods, and real datasets are analyzed to illustrate the use of our local influence measures.

  15. 40 CFR 60.2780 - What must I include in the deviation report?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Compliance Times for Commercial and Industrial Solid Waste Incineration Units Model Rule-Recordkeeping and... downtime associated with zero, span, and other routine calibration checks). (f) Whether each deviation...

  16. Litho hotspots fixing using model based algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan

    2017-04-01

    As technology advances, IC designs are getting more sophisticated, thus it becomes more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots on layouts while using the same engine currently used to detect the hotspots. The fix is achieved by applying minimum movement of edges causing the hotspot, with consideration of DRC constraints. The fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options that violate DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to ensure the change successfully fixed the hotspot, which saves time and simplifies the designer's workflow. This methodology is demonstrated on industrial designs, where the fixing rate of single and dual layer hotspots is reported.

  17. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia from 2001 to 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals of each model were tested. The best-fitting model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model for road accidents is the local level with a seasonal model.
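
    A local-level-plus-seasonal structural time series model of this kind can be fitted with standard state-space tooling; the paper does not specify its software, so the sketch below uses the statsmodels UnobservedComponents class on synthetic monthly counts, with a 12-month holdout mimicking the paper's 2012 validation year.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # synthetic monthly accident counts with yearly seasonality (12 years)
    rng = np.random.default_rng(0)
    t = np.arange(144)
    y = 500 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 10, t.size)

    # local level with a 12-month seasonal component, matching the paper's
    # best-fitted specification; model selection would compare AICs
    model = sm.tsa.UnobservedComponents(y[:132], level='local level', seasonal=12)
    res = model.fit(disp=False)
    print("AIC:", res.aic)

    forecast = res.forecast(steps=12)  # predict the held-out final year
    print("mean abs prediction error:", np.mean(np.abs(forecast - y[132:])))
    ```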

  18. Model checking for linear temporal logic: An efficient implementation

    NASA Technical Reports Server (NTRS)

    Sherman, Rivi; Pnueli, Amir

    1990-01-01

    This report provides evidence to support the claim that model checking for linear temporal logic (LTL) is practically efficient. Two implementations of a linear temporal logic model checker are described. One is based on transforming the model checking problem into a satisfiability problem; the other checks an LTL formula for a finite model by computing the cross-product of the finite state transition graph of the program with a structure containing all possible models of the property. An experiment was conducted with a set of mutual exclusion algorithms, testing safety and liveness under fairness for these algorithms.
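
    The cross-product approach ultimately reduces checking to a search of a finite state graph. For the safety fragment (an invariant G p), that search degenerates to plain reachability, which the following self-contained sketch illustrates on a deliberately flawed toy mutual exclusion protocol (not one of the algorithms from the report).

    ```python
    from collections import deque

    def check_invariant(initial, successors, invariant):
        """Explicit-state check of the safety property G p: explore the
        reachable state graph and report a violating state if any."""
        seen, frontier = {initial}, deque([initial])
        while frontier:
            s = frontier.popleft()
            if not invariant(s):
                return s  # counterexample state
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return None  # invariant holds on all reachable states

    # toy two-process protocol: states are (pc0, pc1) with program
    # counters in {idle, wait, crit}; no guard, so mutual exclusion fails
    def successors(state):
        nxt = {"idle": "wait", "wait": "crit", "crit": "idle"}
        for i in (0, 1):
            s = list(state)
            s[i] = nxt[s[i]]
            yield tuple(s)

    bad = check_invariant(("idle", "idle"), successors,
                          lambda s: s != ("crit", "crit"))
    print("mutual exclusion violated at:", bad)
    ```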

  19. Real-time subject-specific monitoring of internal deformations and stresses in the soft tissues of the foot: a new approach in gait analysis.

    PubMed

    Yarnitzky, G; Yizhar, Z; Gefen, A

    2006-01-01

    No technology is presently available to provide real-time information on internal deformations and stresses in plantar soft tissues of individuals during evaluation of the gait pattern. Because internal deformations and stresses in the plantar pad are critical factors in foot injuries such as diabetic foot ulceration, this severely limits evaluation of patients. To allow such real-time subject-specific analysis, we developed a hierarchal modeling system which integrates a two-dimensional gross structural model of the foot (high-order model) with local finite element (FE) models of the plantar tissue padding the calcaneus and medial metatarsal heads (low-order models). The high-order whole-foot model provides real-time analytical evaluations of the time-dependent plantar fascia tensile forces during the stance phase. These force evaluations are transferred, together with foot-shoe local reaction forces, also measured in real time (under the calcaneus, medial metatarsals and hallux), to the low-order FE models of the plantar pad, where they serve as boundary conditions for analyses of local deformations and stresses in the plantar pad. After careful verification of our custom-made FE solver and of our foot model system with respect to previous literature and against experimental results from a synthetic foot phantom, we conducted human studies in which plantar tissue loading was evaluated in real time during treadmill gait in healthy individuals (N = 4). We concluded that internal deformations and stresses in the plantar pad during gait cannot be predicted from merely measuring the foot-shoe force reactions. Internal loading of the plantar pad is constituted by a complex interaction between the anatomical structure and mechanical behavior of the foot skeleton and soft tissues, the body characteristics, the gait pattern and footwear. Real-time FE monitoring of internal deformations and stresses in the plantar pad is therefore required to identify elevated deformation/stress exposures toward utilizing it in gait laboratories to protect feet that are susceptible to injury.

  20. Automatic Classification of Station Quality by Image Based Pattern Recognition of Ppsd Plots

    NASA Astrophysics Data System (ADS)

    Weber, B.; Herrnkind, S.

    2017-12-01

    The number of seismic stations is growing, and it has become common practice to share station waveform data in real time with the main data centers such as IRIS, GEOFON, ORFEUS and RESIF. This has made analyzing station performance increasingly important for automatic real-time processing and station selection. The value of a station depends on different factors, such as the quality and quantity of the data, the location of the site, the general station density in the surrounding area, and the type of application it can be used for. The approach described by McNamara and Boaz (2006) became standard in the last decade. It incorporates a probability density function (PDF) to display the distribution of seismic power spectral density (PSD). The low noise model (LNM) and high noise model (HNM) introduced by Peterson (1993) are also displayed in the PPSD plots introduced by McNamara and Boaz, allowing an estimation of station quality. Here we describe how we established an automatic station quality classification module using image-based pattern recognition on PPSD plots. The plots were split into 4 bands: short-period characteristics (0.1-0.8 s), body-wave characteristics (0.8-5 s), microseismic characteristics (5-12 s) and long-period characteristics (12-100 s). The module sqeval connects to a SeedLink server, checks available stations, and requests PPSD plots through the Mustang service from IRIS, from PQLX/SQLX, or from GIS (gempa Image Server), a module that generates different kinds of images such as trace plots, map plots, helicorder plots or PPSD plots. It compares the image-based quality patterns for the different period bands with the retrieved PPSD plot. The quality of a station is divided into 5 classes for each of the 4 bands. Classes A, B, C and D define regular quality between the LNM and HNM, while the fifth class represents out-of-order stations with gain problems, missing data, etc. Across all period bands, about 100 different patterns are required to classify most of the stations available on the IRIS server. The results are written to a file, and stations can be filtered by quality; AAAA represents the best quality in all 4 bands. A differentiation between instrument types, such as broadband and short-period stations, is also possible. Regular checks using the IRIS SeedLink and Mustang services allow users to be informed about new stations of a specific quality.

  1. Real-time numerical forecast of global epidemic spreading: case study of 2009 A/H1N1pdm.

    PubMed

    Tizzoni, Michele; Bajardi, Paolo; Poletto, Chiara; Ramasco, José J; Balcan, Duygu; Gonçalves, Bruno; Perra, Nicola; Colizza, Vittoria; Vespignani, Alessandro

    2012-12-13

    Mathematical and computational models for infectious diseases are increasingly used to support public-health decisions; however, their reliability is currently under debate. Real-time forecasts of epidemic spread using data-driven models have been hindered by the technical challenges posed by parameter estimation and validation. Data gathered for the 2009 H1N1 influenza crisis represent an unprecedented opportunity to validate real-time model predictions and define the main success criteria for different approaches. We used the Global Epidemic and Mobility Model to generate stochastic simulations of epidemic spread worldwide, yielding (among other measures) the incidence and seeding events at a daily resolution for 3,362 subpopulations in 220 countries. Using a Monte Carlo Maximum Likelihood analysis, the model provided an estimate of the seasonal transmission potential during the early phase of the H1N1 pandemic and generated ensemble forecasts for the activity peaks in the northern hemisphere in the fall/winter wave. These results were validated against the real-life surveillance data collected in 48 countries, and their robustness was assessed by focusing on 1) the peak timing of the pandemic; 2) the level of spatial resolution allowed by the model; and 3) the clinical attack rate and the effectiveness of the vaccine. In addition, we studied the effect of data incompleteness on the prediction reliability. Real-time predictions of the peak timing are found to be in good agreement with the empirical data, showing strong robustness to data that may not be accessible in real time (such as pre-exposure immunity and adherence to vaccination campaigns), but that affect the predictions for the attack rates. The timing and spatial unfolding of the pandemic are critically sensitive to the level of mobility data integrated into the model. Our results show that large-scale models can be used to provide valuable real-time forecasts of influenza spreading, but they require high-performance computing. The quality of the forecast depends on the level of data integration, thus stressing the need for high-quality data in population-based models and for progressive updates of validated empirical knowledge to inform these models.

  2. Effects of experimental egg composition on rejection by Village Weavers (Ploceus cucullatus)

    USGS Publications Warehouse

    Prather, J.W.; Cruz, A.; Weaver, P.F.; Wiley, J.W.

    2007-01-01

    We experimentally parasitized nests of the Village Weaver (Ploceus cucullatus) in Hispaniola using real and artificial eggs made from wood and modeling clay. Artificial eggs were similar in size and shape to real weaver eggs and were coated with acrylic paint and glazed. Real eggs were actual weaver eggs taken from Village Weaver nests. Experimental parasitic eggs (1) mimicked natural weaver eggs, (2) differed in color only, (3) differed in spotting only, or (4) mimicked Shiny Cowbird (Molothrus bonariensis) egg color and spotting pattern. Parasitized nests were checked after 2-6 days. Real eggs were ejected from weaver nests with increasing frequency as they became less similar to the eggs in the nest, with cowbird eggs having the highest rejection rate (81%). However, for artificial egg types there were no significant within-composition differences in patterns of rejection. Clay eggs were usually ejected from the nests, whereas nests containing wood eggs often ended up empty, or with only the artificial egg remaining in the nest. These patterns may reflect the differential ability of weavers to recognize and remove foreign eggs of different compositions from their nests. Researchers undertaking egg-rejection experiments should use real eggs either in addition to or in place of artificial eggs to assess the cost of rejection and the coevolutionary relationships between parasite and host.

  3. [Real-time irrigation forecast of cotton mulched with plastic film under drip irrigation based on meteorological data].

    PubMed

    Shen, Xiao-jun; Sun, Jing-sheng; Li, Ming-si; Zhang, Ji-yang; Wang, Jing-lei; Li, Dong-wei

    2015-02-01

    It is important to improve real-time irrigation forecasting precision by predicting the real-time water consumption of cotton mulched with plastic film under drip irrigation based on meteorological data and cotton growth status. The model parameters for calculating reference evapotranspiration (ET0) with the Hargreaves formula were determined using historical meteorological data from 1953 to 2008 in the Shihezi reclamation area. Based on field experimental data from the 2009-2010 growing seasons, a model for computing the crop coefficient Kc was established based on accumulated temperature. On the basis of ET0 and Kc, a real-time irrigation forecast model was constructed and verified against field experimental data from 2011. The results showed that the forecast model had high forecasting precision: the average absolute values of the relative error between predicted and measured values were about 3.7%, 2.4% and 1.6% during the seedling, squaring and blossom-boll forming stages, respectively. The forecast model can be used to modify predicted values in time according to real-time meteorological data and to guide water management in local film-mulched cotton fields under drip irrigation.
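
    The Hargreaves formula and the ET0/Kc decomposition referenced above are standard, so the pipeline can be sketched compactly. In the snippet below, the radiation term and the Kc value are hypothetical placeholders; the paper's Kc would come from its fitted accumulated-temperature model.

    ```python
    import math

    def hargreaves_et0(tmin, tmax, ra):
        """Hargreaves reference evapotranspiration (mm/day).
        tmin/tmax in deg C; ra is extraterrestrial radiation expressed
        as an evaporation equivalent in mm/day."""
        tmean = (tmin + tmax) / 2.0
        return 0.0023 * ra * math.sqrt(tmax - tmin) * (tmean + 17.8)

    def crop_water_requirement(et0, kc):
        """ETc = Kc * ET0, the quantity an irrigation forecast tracks."""
        return kc * et0

    # hypothetical mid-season day
    et0 = hargreaves_et0(tmin=18.0, tmax=33.0, ra=16.5)
    etc = crop_water_requirement(et0, kc=1.05)
    print("ET0 = %.2f mm/day, ETc = %.2f mm/day" % (et0, etc))
    ```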

  4. 3D geological modeling of the transboundary Berzdorf-Radomierzyce basin in Upper Lusatia (Germany/Poland)

    NASA Astrophysics Data System (ADS)

    Woloszyn, Iwona; Merkel, Broder; Stanek, Klaus

    2017-07-01

    The management of natural resources has to follow the principles of sustainable development. Therefore, before starting new mining activities, it should be checked whether existing deposits have been completely exploited. In this study, a three-dimensional (3D) cross-border geological model was created to generalize the existing data on the Neogene Berzdorf-Radomierzyce basin, located in Upper Lusatia on the Polish-German border south of the city of Görlitz-Zgorzelec. The model, based on boreholes and cross sections of abandoned and planned lignite fields, was extended to the Bernstadt and Neisse-Ręczyn Graben, an important tectonic structure at the southern rim of the basin. The partly detailed stratigraphy of the Neogene sequences was combined into five stratigraphic units, considering the lithological variations and the main tectonic structures. The model was used to check the feasibility of further utilization of the Bernstadt and Neisse-Ręczyn Graben, which contains lignite deposits. Moreover, it will serve as a basis for the construction of a 3D cross-border groundwater model to investigate groundwater flow and transport in the Miocene and Quaternary aquifer systems. The large amount of data and the compatibility with other software favored the application of the 3D geo-modeling software Paradigm GOCAD. The results demonstrate a very good fit between the model and the real geological boundaries. This is particularly evident in the match between the modeled surfaces and the implemented geological cross sections. The created model can be used for planning full-scale mining operations in the eastern part of the basin (Radomierzyce).

  5. Validation of two commercial real-time RT-PCR kits for rapid and specific diagnosis of classical swine fever virus.

    PubMed

    Le Dimna, M; Vrancken, R; Koenen, F; Bougeard, S; Mesplède, A; Hutet, E; Kuntz-Simon, G; Le Potier, M F

    2008-01-01

    Two real-time RT-PCR kits, developed by LSI (TaqVet CSF) and ADIAGENE (Adiavet CSF), obtained agreement to be commercialised in France, subject to conditions defined by the French Classical Swine Fever (CSF) National Reference Laboratory. The producers were asked to introduce an internal control to check RNA extraction efficacy. The criteria assessed were sensitivity, "pestivirus specificity", reproducibility and ease of handling, using 189 different samples. These samples were either inactivated CSFV strains or blood/serum/organs collected from experimentally infected pigs or naturally infected wild boars. The reproducibility of the assays was confirmed by the analysis of a batch-to-batch panel control that was used for inter-laboratory tests involving nine laboratories. The two kits were also tested for use in mass diagnostics, and the results proved the kits to be suitable for pools of blood, serum and tonsils. Moreover, a field evaluation, carried out on spleen samples collected from the CSF surveillance of wild boars in an area known to be infected and from domestic pigs at a slaughterhouse, confirmed the high sensitivity and specificity of the two kits. This step-by-step evaluation procedure confirmed that the two commercial CSF real-time RT-PCR kits have a higher predictive value than the current diagnostic standard, virus isolation.

  6. Improved safety for molecular diagnosis of classical rabies viruses by use of a TaqMan real-time reverse transcription-PCR "double check" strategy.

    PubMed

    Hoffmann, B; Freuling, C M; Wakeley, P R; Rasmussen, T B; Leech, S; Fooks, A R; Beer, M; Müller, T

    2010-11-01

    To improve the diagnosis of classical rabies virus with molecular methods, a validated, ready-to-use, real-time reverse transcription-PCR (RT-PCR) assay was developed. In a first step, primers and 6-carboxyfluorescein-labeled TaqMan probes specific for rabies virus were selected from the consensus sequence of the nucleoprotein gene of 203 different rabies virus sequences derived from GenBank. The selected primer-probe combination was highly specific and sensitive. During validation using a sample set of rabies virus strains from the virus archives of the Friedrich-Loeffler-Institut (FLI; Germany), the Veterinary Laboratories Agency (VLA; United Kingdom), and the DTU National Veterinary Institute (Lindholm, Denmark), covering the global diversity of rabies virus lineages, it was shown that both the newly developed assay and a previously described one had some detection failures. This was overcome by a combined assay that detected all samples as positive. In addition, the introduction of labeled positive controls (LPC) increased the diagnostic safety of both the single and the combined assay. Based on the newly developed alternative assay for the detection of rabies virus and the application of LPCs, improved diagnostic sensitivity and reliability can be ascertained for postmortem and intra vitam real-time RT-PCR analyses in rabies reference laboratories.

  7. Spectral Deformation for Two-Body Dispersive Systems with e.g. the Yukawa Potential

    NASA Astrophysics Data System (ADS)

    Engelmann, Matthias; Rasmussen, Morten Grud

    2016-12-01

    We find an explicit closed formula for the $k$'th iterated commutator $\operatorname{ad}_A^k(H_V(\xi))$ of arbitrary order $k \geq 1$ between a Hamiltonian $H_V(\xi) = M_{\omega_\xi} + S_{\check{V}}$ and a conjugate operator $A = \tfrac{i}{2}(v_\xi \cdot \nabla + \nabla \cdot v_\xi)$, where $M_{\omega_\xi}$ is the operator of multiplication by the real analytic function $\omega_\xi$, which depends real analytically on the parameter $\xi$, the operator $S_{\check{V}}$ is the operator of convolution with the (sufficiently nice) function $\check{V}$, and $v_\xi$ is a vector field determined by $\omega_\xi$. Under certain assumptions, which are satisfied for the Yukawa potential, we then prove estimates of the form $\|\operatorname{ad}_A^k(H_V(\xi))(H_0(\xi) + i)^{-1}\| \leq C_\xi^k k!$, where $C_\xi$ is a constant which depends continuously on $\xi$. The Hamiltonian is the fixed total momentum fiber Hamiltonian of an abstract two-body dispersive system, and the work is inspired by a recent result [3] which, under conditions including estimates of the mentioned type, opens up for spectral deformation and analytic perturbation theory of embedded eigenvalues of finite multiplicity.

  8. Draft Forecasts from Real-Time Runs of Physics-Based Models - A Road to the Future

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2008-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. After consultations with NOAA/SEC and with AFWA, CCMC has developed a set of tools as a first step to make real-time model output useful to forecast centers. In this presentation, we will discuss the motivation for this activity, the actions taken so far, and options for future tools from model output.

  9. A New Model for Real-Time Regional Vertical Total Electron Content and Differential Code Bias Estimation Using IGS Real-Time Service (IGS-RTS) Products

    NASA Astrophysics Data System (ADS)

    Abdelazeem, Mohamed; Çelik, Rahmi N.; El-Rabbany, Ahmed

    2016-04-01

    The International GNSS Service real-time service (IGS-RTS) products have been used extensively for real-time precise point positioning and ionosphere modeling applications. In this study, we develop a regional model for real-time vertical total electron content (RT-VTEC) and differential code bias (RT-DCB) estimation over Europe using the IGS-RTS satellite orbit and clock products. The developed model has a spatial resolution of 1°×1° and a temporal resolution of 15 minutes. GPS observations from a regional network consisting of 60 IGS and EUREF reference stations are processed in zero-difference mode using the Bernese-5.2 software package in order to extract the geometry-free linear combination of the smoothed code observations. A spherical harmonic expansion is used to model the VTEC and the receiver and satellite DCBs. To validate the proposed model, the RT-VTEC values are computed and compared with the final IGS global ionospheric map (IGS-GIM) counterparts on three successive days of high solar activity, including one day of extreme geomagnetic activity. The real-time satellite DCBs are also estimated and compared with the IGS-GIM counterparts. Moreover, the real-time receiver DCBs for six IGS stations are obtained and compared with the IGS-GIM counterparts. The examined stations are located at different latitudes and use different receiver types. The findings reveal that the estimated RT-VTEC values agree with the IGS-GIM counterparts, with root-mean-square errors (RMSEs) of less than 2 TEC units. In addition, the RMSEs of the satellite and receiver DCBs are less than 0.85 ns and 0.65 ns, respectively, in comparison with IGS-GIM.

  10. Scalable Integrated Multi-Mission Support System Simulator Release 3.0

    NASA Technical Reports Server (NTRS)

    Kim, John; Velamuri, Sarma; Casey, Taylor; Bemann, Travis

    2012-01-01

    The Scalable Integrated Multi-mission Support System (SIMSS) is a tool that performs a variety of test activities related to spacecraft simulations and ground segment checks. SIMSS is a distributed, component-based, plug-and-play client-server system useful for performing real-time monitoring and communications testing. SIMSS runs on one or more workstations and is designed to be user-configurable or to use predefined configurations for routine operations. SIMSS consists of more than 100 modules that can be configured to create, receive, process, and/or transmit data. The SIMSS/GMSEC innovation is intended to provide missions with a low-cost solution for implementing their ground systems, as well as significantly reducing a mission's integration time and risk.

  11. Application Research of Quality Control Technology of Asphalt Pavement based on GPS Intelligent

    NASA Astrophysics Data System (ADS)

    Wang, Min; Gao, Bo; Shang, Fei; Wang, Tao

    2017-10-01

    Because of the flexible supporting system (the orthotropic steel deck plate), compaction of the asphalt layer on steel bridge decks is difficult, and it is usually hard to ensure that the site compactness reaches the design goal. Intelligent compaction technology is based on GPS control technology and real-time acquisition of the actual compaction tracks, from which it forms a cloud map of compaction passes that guides the roller operator to compact in accordance with the design requirements, ensuring the deck compaction process and compaction quality. The construction results and inspection data from an actual bridge show that intelligent compaction technology is significant in guaranteeing the compactness and quality stability of steel deck asphalt pavement.

  12. Asynchronous error-correcting secure communication scheme based on fractional-order shifting chaotic system

    NASA Astrophysics Data System (ADS)

    Chao, Luo

    2015-11-01

    In this paper, a novel digital secure communication scheme is proposed. Unlike the usual secure communication schemes based on chaotic synchronization, the proposed scheme employs asynchronous communication, which avoids the weakness of synchronous systems, namely their susceptibility to environmental interference. Moreover, with regard to transmission errors and data loss in the communication process, the proposed scheme is able to check and correct errors in real time. In order to guarantee security, a fractional-order complex chaotic system with a shifting order is utilized to modulate the transmitted signal, which provides high nonlinearity and complexity in both the frequency and time domains. The corresponding numerical simulations demonstrate the effectiveness and feasibility of the scheme.

  13. Real-time implementation of biofidelic SA1 model for tactile feedback.

    PubMed

    Russell, A F; Armiger, R S; Vogelstein, R J; Bensmaia, S J; Etienne-Cummings, R

    2009-01-01

    In order for the functionality of an upper-limb prosthesis to approach that of a real limb it must be able to, accurately and intuitively, convey sensory feedback to the limb user. This paper presents results of the real-time implementation of a 'biofidelic' model that describes mechanotransduction in Slowly Adapting Type 1 (SA1) afferent fibers. The model accurately predicts the timing of action potentials for arbitrary force or displacement stimuli and its output can be used as stimulation times for peripheral nerve stimulation by a neuroprosthetic device. The model performance was verified by comparing the predicted action potential (or spike) outputs against measured spike outputs for different vibratory stimuli. Furthermore experiments were conducted to show that, like real SA1 fibers, the model's spike rate varies according to input pressure and that a periodic 'tapping' stimulus evokes periodic spike outputs.
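
    The published model is more detailed, but the general shape of such a mechanotransduction model, stimulus-driven integration to a threshold that emits spike times, can be conveyed with a generic leaky integrate-and-fire sketch. All parameters here are hypothetical and this is not the authors' implementation.

    ```python
    import numpy as np

    def lif_spikes(pressure, dt=1e-4, tau=0.01, gain=250.0, threshold=1.0):
        """Leaky integrate-and-fire driven by a pressure waveform; returns
        predicted spike times usable as nerve stimulation times."""
        v, spikes = 0.0, []
        for k, p in enumerate(pressure):
            v += dt * (-v / tau + gain * p)  # leak plus pressure drive
            if v >= threshold:
                spikes.append(k * dt)
                v = 0.0  # reset after firing
        return spikes

    t = np.arange(0, 0.5, 1e-4)
    press = 0.5 * (1 + np.sin(2 * np.pi * 5 * t))  # 5 Hz "tapping" stimulus
    times = lif_spikes(press)
    # spike rate rises with pressure, and the periodic stimulus
    # evokes a periodic spiking pattern, as in the abstract
    print(f"{len(times)} spikes; first few: {times[:5]}")
    ```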

  14. 78 FR 9783 - Airworthiness Directives; Bombardier, Inc. Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-12

    ...) has been made to the Time Limits/ Maintenance Checks (TLMC) manual to introduce a new Airworthiness... have already been done. (g) Time Limits/Maintenance Checks (TLMC) Manual Revision Within 60 days after... Challenger Time Limits/Maintenance Checks Manual, PSP 605. (ii) Canadair Challenger Temporary Revision 5-250...

  15. Real-time simulation of hand motion for prosthesis control

    PubMed Central

    Blana, Dimitra; Chadwick, Edward K.; van den Bogert, Antonie J.; Murray, Wendy M.

    2016-01-01

    Individuals with hand amputation suffer substantial loss of independence. Performance of sophisticated prostheses is limited by the ability to control them. To achieve natural and simultaneous control of all wrist and hand motions, we propose to use real-time biomechanical simulation to map between residual EMG and motions of the intact hand. Here we describe a musculoskeletal model of the hand using only extrinsic muscles to determine whether real-time performance is possible. Simulation is 1.3 times faster than real time, but the model is locally unstable. Methods are discussed to increase stability and make this approach suitable for prosthesis control. PMID:27868425

  16. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy Disclosure and Notices C Appendix C to Part 229 Banks and... OF FUNDS AND COLLECTION OF CHECKS (REGULATION CC) Pt. 229, App. C Appendix C to Part 229—Model...

  17. A real time Pegasus propulsion system model for VSTOL piloted simulation evaluation

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Roth, S. P.; Creekmore, R.

    1981-01-01

    A real-time propulsion system modeling technique suitable for use in man-in-the-loop simulator studies was developed. This technique provides the system accuracy, stability, and transient response required for integrated aircraft and propulsion control system studies. A Pegasus-Harrier propulsion system was selected as a baseline for developing mathematical modeling and simulation techniques for VSTOL. Initially, static and dynamic propulsion system characteristics were modeled in detail to form a nonlinear aerothermodynamic digital computer simulation of a Pegasus engine. From this high-fidelity simulation, a real-time propulsion model was formulated by applying a piecewise-linear state variable methodology. A hydromechanical and water injection control system was also simulated. The real-time dynamic model includes the detail and flexibility required for the evaluation of critical control parameters and propulsion component limits over a limited flight envelope. The model was programmed for interfacing with a Harrier aircraft simulation. Typical propulsion system simulation results are presented.
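
    A piecewise-linear state variable model of this kind stores linearized matrices at a set of trim points and interpolates between them at run time. The sketch below shows that mechanism on a hypothetical one-state spool model; it is illustrative only and unrelated to the actual Pegasus data.

    ```python
    import numpy as np

    class PiecewiseLinearEngine:
        """Schedule linearized engine dynamics between trim points:
        x' = A(p) x + B(p) u, with A and B interpolated on power setting p."""
        def __init__(self, trim_points, A_list, B_list):
            self.p = np.asarray(trim_points)
            self.A = np.asarray(A_list)
            self.B = np.asarray(B_list)

        def matrices(self, p):
            i = np.clip(np.searchsorted(self.p, p) - 1, 0, len(self.p) - 2)
            w = (p - self.p[i]) / (self.p[i + 1] - self.p[i])
            return ((1 - w) * self.A[i] + w * self.A[i + 1],
                    (1 - w) * self.B[i] + w * self.B[i + 1])

        def step(self, x, u, p, dt):
            A, B = self.matrices(p)
            return x + dt * (A @ x + B @ u)  # forward Euler, real-time friendly

    # hypothetical one-state spool dynamics at idle and full power
    eng = PiecewiseLinearEngine([0.0, 1.0],
                                [[[-1.0]], [[-3.0]]],   # faster pole at high power
                                [[[1.0]], [[3.0]]])
    x = np.array([0.0])
    for _ in range(100):
        x = eng.step(x, u=np.array([1.0]), p=0.6, dt=0.01)
    print("spool state after 1 s:", x)
    ```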

  18. Computing Quantitative Characteristics of Finite-State Real-Time Systems

    DTIC Science & Technology

    1994-05-04

    Current methods for verifying real-time systems are essentially decision procedures that establish whether the system model satisfies a given...specification. We present a general method for computing quantitative information about finite-state real-time systems. We have developed algorithms that...our technique can be extended to a more general representation of real-time systems, namely, timed transition graphs. The algorithms presented in this

  19. Tile-Image Merging and Delivering for Virtual Camera Services on Tiled-Display for Real-Time Remote Collaboration

    NASA Astrophysics Data System (ADS)

    Choe, Giseok; Nang, Jongho

    The tiled-display system has been used as a Computer Supported Cooperative Work (CSCW) environment, in which multiple local (and/or remote) participants cooperate using shared applications whose outputs are displayed on a large-scale, high-resolution tiled display controlled by a cluster of PCs, one PC per display. In order to make the collaboration effective, each remote participant should be aware of all CSCW activities on the tiled-display system in real time. This paper presents a mechanism for capturing all activities on the tiled-display system and delivering them to remote participants in real time. In the proposed mechanism, the screen images of all PCs are periodically captured and delivered to the Merging Server, which maintains separate buffers to store the captured images from the PCs. The mechanism selects one tile image from each buffer, merges the images to make a screen shot of the whole tiled display, clips a Region of Interest (ROI), compresses it, and streams it to remote participants in real time. A technical challenge in the proposed mechanism is how to select a set of tile images, one from each buffer, for merging, so that tile images displayed at the same time on the tiled display are properly merged together. This paper presents three selection algorithms: a sequential selection algorithm, a capturing-time-based algorithm, and a capturing-time and visual-consistency-based algorithm. It also proposes a mechanism for providing several virtual cameras on the tiled-display system to remote participants by concurrently clipping several different ROIs from the same merged tiled-display images and delivering them after compression with the video encoders requested by the remote participants. By interactively changing and resizing his/her own ROI, a remote participant can check the activities on the tiled display effectively. Experiments on a 3 × 2 tiled-display system show that the proposed merging algorithm can build a tiled-display image stream synchronously, and that the ROI-based clipping and delivering mechanism can provide individual views of the tiled-display system to multiple remote participants in real time.
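
    The capturing-time-based selection idea can be pictured as follows: choose a common target time that every buffer can cover, then take from each buffer the frame captured closest to it. This minimal sketch uses hypothetical timestamps and tolerance values; it is a simplification of the paper's algorithm, not its implementation.

    ```python
    def select_tile_set(buffers, tolerance=0.04):
        """Pick one frame per tile buffer so that all capture timestamps
        lie within `tolerance` seconds of a common target time."""
        # target: the most recent moment every buffer has a frame for
        target = min(frames[-1][0] for frames in buffers)
        chosen = []
        for frames in buffers:
            # frame closest in capture time to the target
            best = min(frames, key=lambda f: abs(f[0] - target))
            if abs(best[0] - target) > tolerance:
                return None  # no consistent set yet; wait for more frames
            chosen.append(best)
        return chosen

    # frames are (capture_time, image) pairs, oldest to newest, per tile PC
    buffers = [
        [(0.00, "t0a"), (0.05, "t0b"), (0.10, "t0c")],
        [(0.01, "t1a"), (0.06, "t1b"), (0.11, "t1c")],
        [(0.02, "t2a"), (0.07, "t2b")],
    ]
    print(select_tile_set(buffers))  # frames captured within 40 ms of each other
    ```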

  20. Performance of point-of-care Xpert HIV-1 plasma viral load assay at a tertiary HIV care centre in Southern India.

    PubMed

    Swathirajan, Chinnambedu Ravichandran; Vignesh, Ramachandran; Boobalan, Jayaseelan; Solomon, Sunil Suhas; Saravanan, Shanmugam; Balakrishnan, Pachamuthu

    2017-10-01

    Sustained suppression of HIV replication forms the basis of antiretroviral therapy (ART). Thus, reliable quantification of the HIV viral load has become an essential factor in monitoring the effectiveness of ART. Long turnaround times (TAT), batch testing and the need for technical skills are major drawbacks of standard real-time PCR assays. The performance of the point-of-care Xpert HIV-1 viral load assay was evaluated against the Abbott RealTime PCR m2000rt system. A total of 96 plasma specimens ranging from 2.5 log10 copies/ml to 4.99 log10 copies/ml and proficiency testing panel specimens were used. Precision and accuracy were checked using the Pearson correlation coefficient test and Bland-Altman analysis. Compared to the Abbott RealTime PCR, the Xpert HIV-1 viral load assay showed a good correlation (Pearson r=0.81; P<0.0001) with a mean difference of 0.27 log10 copies/ml (95% CI, -0.41 to 0.96 log10 copies/ml; SD, 0.35 log10 copies/ml). Reliability and ease of testing individual specimens could make the Xpert HIV-1 viral load assay an efficient alternative method for ART monitoring in the clinical management of HIV disease in resource-limited settings. The rapid test results (less than 2 h) could help in making immediate clinical decisions, which further strengthens patient care.
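
    Bland-Altman analysis as used above reduces to the mean of the paired differences and its 1.96-SD limits of agreement. A minimal sketch with hypothetical paired log10 viral loads (not the study's data):

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Mean bias and 95% limits of agreement between two assays
        measured in log10 copies/ml."""
        diff = np.asarray(a) - np.asarray(b)
        bias, sd = diff.mean(), diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # hypothetical paired log10 viral loads (Xpert vs. reference RT-PCR)
    xpert  = np.array([3.1, 4.2, 2.8, 4.9, 3.6, 2.6])
    abbott = np.array([2.9, 3.9, 2.6, 4.5, 3.3, 2.4])
    bias, loa = bland_altman(xpert, abbott)
    print(f"bias = {bias:.2f} log10 copies/ml, "
          f"95% LoA = ({loa[0]:.2f}, {loa[1]:.2f})")
    ```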

  1. Comparison of in-house and commercial real time-PCR based carbapenemase gene detection methods in Enterobacteriaceae and non-fermenting gram-negative bacterial isolates.

    PubMed

    Smiljanic, M; Kaase, M; Ahmad-Nejad, P; Ghebremedhin, B

    2017-07-10

    Carbapenemase-producing gram-negative bacteria are increasing globally and have been associated with outbreaks in hospital settings. Thus, the accurate detection of these bacteria in infections is mandatory for administering adequate therapy and infection control measures. This study aimed to establish and evaluate a multiplex real-time PCR assay for the simultaneous detection of carbapenemase gene variants in gram-negative rods and to compare its performance with a commercial RT-PCR assay (Check-Direct CPE). 116 carbapenem-resistant Enterobacteriaceae, Pseudomonas aeruginosa and Acinetobacter baumannii isolates were genotyped for carbapenemase genes by PCR and sequencing. The defined isolates were used for the validation of the in-house RT-PCR with designed primer pairs and probes. Among the carbapenem-resistant isolates, the genes blaKPC, blaVIM, blaNDM or blaOXA were detected. Both RT-PCR assays detected all blaKPC, blaVIM and blaNDM genes in the isolates. The in-house RT-PCR detected 53 of 67 (79.0%) of the OXA genes, whereas the commercial assay detected only 29 (43.3%). The in-house assay sufficiently distinguished the most prevalent OXA types (23-like and 48-like) in the melting curve analysis and allowed direct detection of the genes from positive blood culture vials. Both the Check-Direct CPE and the in-house RT-PCR assay detected carbapenem resistance from solid culture isolates. Moreover, the in-house assay enabled the identification of carbapenemase genes directly from positive blood-culture vials. However, we observed insufficient detection of various OXA genes in both assays. Nevertheless, the in-house RT-PCR detected the majority of the OXA-type genes in Enterobacteriaceae and A. baumannii.

  2. Deriving Tools from Real-time Runs: A New CCMC Support for SEC and AFWA

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2008-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. After consultations with NOAA/SEC and with AFWA, CCMC has developed a set of tools as a first step to make real-time model output useful to forecast centers. In this presentation, we will discuss the motivation for this activity, the actions taken so far, and options for future tools from model output.

  3. Can Subjects be Guided to Optimal Decisions? The Use of a Real-Time Training Intervention Model

    DTIC Science & Technology

    2016-06-01

    execution of the task and may then be analyzed to determine if there is correlation between designated factors (scores, proportion of time in each...state with their decision performance in real time could allow training systems to be designed to tailor training to the individual decision maker...

  4. Size-scaling behaviour of the electronic polarizability of one-dimensional interacting systems

    NASA Astrophysics Data System (ADS)

    Chiappe, G.; Louis, E.; Vergés, J. A.

    2018-05-01

    Electronic polarizability of finite chains is accurately calculated from the total energy variation of the system produced by small but finite static electric fields applied along the chain direction. The normalized polarizability, that is, the polarizability divided by the chain length, diverges as the second power of the length for metallic systems but approaches a constant value for insulating systems. This behaviour provides a very convenient way to characterize the wave-function malleability of finite systems, as it avoids the need to attach infinite contacts to the chain ends. Hubbard model calculations at half filling show that the method works for a small interaction value U = 1, which corresponds to a really small spectral gap of 0.005 (hopping t = -1 is assumed). Once successfully checked, the method is applied to the long-range hopping model of Gebhard and Ruckenstein, which shows 1/r hopping decay (Gebhard and Ruckenstein 1992 Phys. Rev. Lett. 68 244; Gebhard et al 1994 Phys. Rev. B 49 10926). Metallicity is obtained for U values below the reported metal-insulator transition, but the surprise comes for U values larger than the critical one (when a gap appears in the spectral density of states), because a steady increase of the normalized polarizability with size is obtained. This critical size-scaling behaviour can be understood as corresponding to a molecule whose polarizability is unbounded. We have checked that a real transfer of charge from one chain end to the other occurs as a response to very small electric fields, in spite of the existence of a large gap of the order of U for one-particle excitations. Finally, ab initio quantum chemistry calculations of realistic polyacetylene chains show that the occurrence of such critical behaviour in real systems is unlikely.
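
    The finite-field recipe in the abstract, differentiating the total energy twice with respect to the applied field, is easy to reproduce in the non-interacting limit. The sketch below computes the normalized polarizability of a half-filled tight-binding chain (no Hubbard interaction, unlike the paper) and exhibits the metallic growth with the second power of the length.

    ```python
    import numpy as np

    def chain_energy(n_sites, field, t=-1.0):
        """Ground-state energy of a half-filled tight-binding chain in a
        static field applied along the chain (interactions omitted)."""
        h = np.zeros((n_sites, n_sites))
        for i in range(n_sites - 1):
            h[i, i + 1] = h[i + 1, i] = t
        x = np.arange(n_sites) - (n_sites - 1) / 2.0  # site positions
        h += np.diag(field * x)  # scalar potential of the applied field
        eps = np.linalg.eigvalsh(h)
        return 2.0 * eps[: n_sites // 2].sum()  # two electrons per level

    def normalized_polarizability(n_sites, f=1e-3):
        # alpha = -d2E/dF2 via a central finite difference, per unit length
        e0, ep, em = (chain_energy(n_sites, F) for F in (0.0, f, -f))
        return -(ep + em - 2 * e0) / f**2 / n_sites

    for n in (10, 20, 40, 80):
        print(n, normalized_polarizability(n))  # grows ~n^2 for a metal
    ```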

  5. Real-Time Parameter Estimation in the Frequency Domain

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2000-01-01

    A method for real-time estimation of parameters in a linear dynamic state-space model was developed and studied. The application is aircraft dynamic model parameter estimation from measured data in flight. Equation error in the frequency domain was used with a recursive Fourier transform for the real-time data analysis. Linear and nonlinear simulation examples and flight test data from the F-18 High Alpha Research Vehicle were used to demonstrate that the technique produces accurate model parameter estimates with appropriate error bounds. Parameter estimates converged in less than one cycle of the dominant dynamic mode, using no a priori information, with control surface inputs measured in flight during ordinary piloted maneuvers. The real-time parameter estimation method has low computational requirements and could be implemented
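
    A recursive Fourier transform in this context means updating the transform at a fixed set of analysis frequencies as each new sample arrives, rather than re-transforming the whole record. A minimal sketch of that update follows; it is illustrative only, with hypothetical frequencies and sample rate, and is not the flight implementation.

    ```python
    import numpy as np

    class RecursiveFourier:
        """Running finite Fourier transform: each new sample updates the
        transform at fixed analysis frequencies in O(n_freq) time."""
        def __init__(self, freqs_hz, dt):
            self.w = 2 * np.pi * np.asarray(freqs_hz)
            self.dt = dt
            self.k = 0
            self.X = np.zeros(len(freqs_hz), dtype=complex)

        def update(self, sample):
            t = self.k * self.dt
            self.X += sample * np.exp(-1j * self.w * t) * self.dt
            self.k += 1
            return self.X

    # a 2 Hz tone should concentrate at the matching analysis frequency
    rf = RecursiveFourier(freqs_hz=[1.0, 2.0, 3.0], dt=0.01)
    for k in range(500):
        X = rf.update(np.sin(2 * np.pi * 2.0 * k * 0.01))
    print(np.abs(X))  # middle bin dominates
    ```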

  6. Adaptive Automation Design and Implementation

    DTIC Science & Technology

    2015-09-17

    Study: Space Navigator. This section demonstrates the player modeling paradigm, focusing specifically on the response generation section of the player ...human-machine system, a real-time player modeling framework for imitating a specific person's task performance, and the Adaptive Automation System...Clustering-Based Real-Time Player Modeling...

  7. Characterizing Intra-Urban Air Quality Gradients with a Spatially-Distributed Network

    NASA Astrophysics Data System (ADS)

    Zimmerman, N.; Ellis, A.; Schurman, M. I.; Gu, P.; Li, H.; Snell, L.; Gu, J.; Subramanian, R.; Robinson, A. L.; Apte, J.; Presto, A. A.

    2016-12-01

    City-wide air pollution measurements have typically relied on regulatory or research monitoring sites with low spatial density to assess population-scale exposure. However, air pollutant concentrations exhibit significant spatial variability depending on local sources and features of the built environment, which may not be well captured by the existing monitoring regime. To better understand urban spatial and temporal pollution gradients at 1 km resolution, a network of 12 real-time air quality monitoring stations was deployed beginning in July 2016 in Pittsburgh, PA. The stations were deployed at sites along an urban-rural transect and in urban locations with a range of traffic, restaurant, and tall-building densities to examine the impact of various modifiable factors. Measurements from the stationary monitoring stations were further supported by mobile monitoring, which provided higher-spatial-resolution pollutant measurements on nearby roadways and enabled routine calibration checks. The stationary monitoring measurements comprise ultrafine particle number (Aerosol Dynamics "MAGIC" CPC), PM2.5 (Met One Neighborhood PM Monitor), black carbon (Met One BC 1050), and a new low-cost air quality monitor, the Real-time Affordable Multi-Pollutant (RAMP) sensor package, for measuring CO, NO2, SO2, O3, CO2, temperature and relative humidity. High time-resolution (sub-minute) measurements across the distributed monitoring network enable insight into dynamic pollutant behaviour. Our preliminary findings show that our instruments are sensitive to PM2.5 gradients exceeding 2 micrograms per cubic meter and ultrafine particle gradients exceeding 1000 particles per cubic centimeter. Additionally, we have developed rigorous calibration protocols to characterize the RAMP sensor response and drift, as well as multiple linear regression models to convert sensor response into pollutant concentrations that are comparable to reference instrumentation.
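
    The multiple linear regression calibration mentioned above regresses the reference concentration on the raw sensor signal plus temperature and relative humidity covariates. A self-contained sketch with synthetic co-location data follows; all coefficients and noise levels are hypothetical, not the RAMP calibration itself.

    ```python
    import numpy as np

    # hypothetical co-location data: raw sensor signal plus temperature and
    # relative humidity, with reference-analyzer CO (ppm) as the target
    rng = np.random.default_rng(1)
    raw = rng.uniform(0.2, 1.0, 300)
    temp = rng.uniform(5, 35, 300)
    rh = rng.uniform(20, 90, 300)
    co_ref = 2.5 * raw - 0.01 * temp + 0.005 * rh + rng.normal(0, 0.05, 300)

    # multiple linear regression: co = b0 + b1*raw + b2*T + b3*RH
    X = np.column_stack([np.ones_like(raw), raw, temp, rh])
    coef, *_ = np.linalg.lstsq(X, co_ref, rcond=None)
    print("calibration coefficients:", np.round(coef, 3))

    def to_ppm(raw, temp, rh, coef=coef):
        """Convert a raw sensor reading into a calibrated concentration."""
        return coef[0] + coef[1] * raw + coef[2] * temp + coef[3] * rh

    print("example reading:", round(to_ppm(0.6, 25.0, 55.0), 3), "ppm")
    ```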

  8. Next generation of space based sensor for application in the SSA space weather domain.

    NASA Astrophysics Data System (ADS)

    Jansen, Frank; Kudela, Karel; Behrens, Joerg

    High-energy solar and galactic cosmic rays have a twofold importance for the SSA space weather domain: on the one hand, cosmic rays have dangerous effects on space-, air- and ground-based assets; on the other hand, they are direct measurement tools for real-time space weather warning. A review of space-weather-related SSA results from the operating global cosmic ray networks (especially those based on neutron monitors and muon directional telescopes), their limitations and the main questions to be solved is presented. In particular, recent results received in real time and with high temporal resolution are reviewed and discussed, and the relevance of these monitors and telescopes for forecasting geomagnetic disturbances is checked. Based on the results of this study, a next generation of highly miniaturized hybrid silicon pixel devices (Medipix sensors) is described for the following beyond-state-of-the-art application: an SSA satellite for measuring the high-energy solar and galactic cosmic ray spectrum, with a space plasma environment data package and real-time CME imaging by means of cosmic rays. All data management and processing will be carried out on the satellite in real time; accordingly, a strong onboard reduction of data is foreseen, with only finalized space-weather-relevant data and images transmitted to the ground station.

  9. Loop-Mediated Isothermal Amplification for Detection of Endogenous Sad1 Gene in Cotton: An Internal Control for Rapid Onsite GMO Testing.

    PubMed

    Singh, Monika; Bhoge, Rajesh K; Randhawa, Gurinderjit

    2018-04-20

    Background: Confirming the integrity of seed samples in powdered form is important prior to conducting a genetically modified organism (GMO) test. Rapid onsite methods may provide a technological solution for checking genetically modified (GM) events at ports of entry. In India, Bt cotton is the commercialized GM crop, with four approved GM events; however, 59 GM events have been approved globally. GMO screening is required to test for authorized GM events. The identity and amplifiability of test samples can be ensured by first employing endogenous genes as an internal control. Objective: A rapid onsite detection method was developed for an endogenous reference gene of cotton, stearoyl acyl carrier protein desaturase (Sad1), employing visual and real-time loop-mediated isothermal amplification (LAMP). Methods: The assays were performed at a constant temperature of 63°C for 30 min for visual LAMP and 62°C for 40 min for real-time LAMP. Positive amplification was visualized as a change in color from orange to green on addition of SYBR® Green, or detected as real-time amplification curves. Results: The specificity of the LAMP assays was confirmed using a set of 10 samples. The LOD was up to 0.1% for visual LAMP, detecting 40 target copies, and up to 0.05% for real-time LAMP, detecting 20 target copies. Conclusions: The developed methods can be utilized to confirm the integrity of seed powder prior to conducting a GMO test for specific GM events of cotton. Highlights: LAMP assays for the endogenous Sad1 gene of cotton have been developed for use as an internal control for onsite GMO testing in cotton.

  10. EMC: Verification

    Science.gov Websites

    Real-time verification of NCEP operational models against observations, covering the NAM, GFS, RAP, HRRR, HIRESW, SREF mean, the NAM CONUS nest, and international global models, together with precipitation skill scores from 1995 to the present and comparisons against HPC analyses (EMC forecast verification statistics).

  11. General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark

    2010-01-01

    Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
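
    The abstract describes the general data-driven pattern: learn a characterization of nominal sensor vectors offline, then score live data by its distance from the nearest nominal region. The sketch below illustrates that idea only; it is not NASA's IMS implementation, and all data, clusters, and thresholds are synthetic.

```python
import numpy as np

# Minimal sketch of data-driven health monitoring in the spirit of IMS:
# summarize archived nominal sensor vectors as cluster boxes, then flag
# real-time vectors that fall far from every nominal box. Illustrative only.

rng = np.random.default_rng(0)
nominal = rng.normal(loc=[5.0, 20.0, 1.0], scale=0.2, size=(5000, 3))

# "Training": one box per coarse bin of the first sensor (real systems
# learn clusters adaptively rather than binning a single channel).
bins = np.linspace(nominal[:, 0].min(), nominal[:, 0].max(), 8)
clusters = []
for lo, hi in zip(bins[:-1], bins[1:]):
    members = nominal[(nominal[:, 0] >= lo) & (nominal[:, 0] <= hi)]
    if len(members):
        clusters.append((members.min(axis=0), members.max(axis=0)))

def anomaly_score(x, clusters):
    """Distance from x to the nearest nominal cluster box (0 = inside)."""
    best = np.inf
    for lo, hi in clusters:
        d = np.linalg.norm(np.maximum(0.0, np.maximum(lo - x, x - hi)))
        best = min(best, d)
    return best

print(anomaly_score(np.array([5.0, 20.1, 1.0]), clusters))  # ~0, nominal
print(anomaly_score(np.array([5.0, 26.0, 1.0]), clusters))  # large, anomalous
```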

  12. Real-time stylistic prediction for whole-body human motions.

    PubMed

    Matsubara, Takamitsu; Hyon, Sang-Ho; Morimoto, Jun

    2012-01-01

    The ability to predict human motion is crucial in several contexts such as human tracking by computer vision and the synthesis of human-like computer graphics. Previous work has focused on off-line processes with well-segmented data; however, many applications such as robotics require real-time control with efficient computation. In this paper, we propose a novel approach called real-time stylistic prediction for whole-body human motions to satisfy these requirements. This approach uses a novel generative model to represent whole-body human motion, including rhythmic motion (e.g., walking) and discrete motion (e.g., jumping). The generative model is composed of low-dimensional state (phase) dynamics and a two-factor observation model, allowing it to capture the diversity of motion styles in humans. A real-time adaptation algorithm was derived to estimate both the state variables and the style parameter of the model from non-stationary, unlabeled sequential observations. Moreover, with a simple modification, the algorithm allows real-time adaptation even from incomplete (partial) observations. Based on the estimated state and style, a future motion sequence can be accurately predicted. In our implementation, both adaptation and prediction take less than 15 ms at each observation. Our real-time stylistic prediction was evaluated for human walking, running, and jumping behaviors. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Real-time simulation of the nonlinear visco-elastic deformations of soft tissues.

    PubMed

    Basafa, Ehsan; Farahmand, Farzam

    2011-05-01

    Mass-spring-damper (MSD) models are often used for real-time surgery simulation due to their fast response and fairly realistic deformation replication. An improved real-time simulation model of soft tissue deformation due to a laparoscopic surgical indenter was developed and tested. The mechanical realism of conventional MSD models was improved using nonlinear springs and nodal dampers, while their high computational efficiency was maintained using an adapted implicit integration algorithm. New practical algorithms for model parameter tuning, collision detection, and simulation were incorporated. The model was able to replicate complex biological soft tissue mechanical properties under large deformations, i.e., nonlinear and viscoelastic behaviors. After tuning of its parameters to experimental data from a deer liver sample, the simulated response of the model closely tracked the reference data, with high correlation and maximum relative differences of less than 5% and 10% for the tuning and testing data sets, respectively. Finally, implementation of the proposed model and algorithms in a graphical environment resulted in a real-time simulation with update rates of 150 Hz for interactive deformation and haptic manipulation, and 30 Hz for visual rendering. The proposed real-time simulation model of soft tissue deformation due to a laparoscopic surgical indenter was efficient, realistic, and accurate in ex vivo testing, and is a suitable candidate for in vivo testing during laparoscopic surgery.
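
    The abstract does not give the adapted implicit integrator, so the sketch below illustrates only the underlying model class: a single nonlinear spring-damper node advanced with semi-implicit (symplectic) Euler, a common low-cost scheme for interactive simulation. All masses, stiffnesses, and rates are illustrative.

```python
# Minimal sketch of one node of a nonlinear mass-spring-damper tissue model,
# advanced with semi-implicit (symplectic) Euler. Illustrative only; the
# paper uses an adapted implicit scheme on a full 3D mesh.

m, c, rest = 0.01, 0.5, 0.0            # mass [kg], nodal damping, rest length
k1, k3 = 50.0, 5e4                     # linear and cubic spring coefficients

def spring_force(x):
    """Nonlinear spring: stiffens under large stretch (cubic hardening)."""
    s = x - rest
    return -(k1 * s + k3 * s**3)

x, v, dt = 0.02, 0.0, 1e-3             # 2 cm initial indentation, 1 kHz step
for step in range(200):
    a = (spring_force(x) - c * v) / m
    v += dt * a                        # update velocity first...
    x += dt * v                        # ...then position (semi-implicit)
print(f"displacement after {200 * dt:.2f} s: {x * 1000:.3f} mm")
```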

  14. Spectral decontamination of a real-time helicopter simulation

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. E.

    1983-01-01

    Nonlinear mathematical models of a rotor system, referred to as rotating blade-element models, produce steady-state, high-frequency harmonics of significant magnitude. In a discrete simulation model, certain of these harmonics may be incompatible with realistic real-time computational constraints because they alias into the operational low-pass region. However, the energy in an aliased harmonic may be suppressed by increasing the computation rate of an isolated, causal nonlinearity and applying an appropriate filter. This decontamination technique is applied to Sikorsky's real-time model of the Black Hawk helicopter, as supplied to NASA for handling-qualities investigations.
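
    The decontamination idea can be demonstrated numerically: evaluate the harmonic-generating nonlinearity at an elevated rate, low-pass filter below the base-rate Nyquist frequency, and then decimate, so the energy that would otherwise alias in-band is suppressed. The sketch below is a generic illustration of that principle with a toy nonlinearity, not the Black Hawk rotor model.

```python
import numpy as np
from scipy import signal

# A nonlinearity generates harmonics above the base-rate Nyquist frequency
# that alias in-band when sampled naively. Computing the nonlinearity at a
# higher rate and filtering before decimation suppresses the aliased energy.

fs_base, over = 100.0, 8                    # base sim rate [Hz], oversampling
t = np.arange(0, 2.0, 1.0 / (fs_base * over))
u = np.sin(2 * np.pi * 12.0 * t)            # 12 Hz rotor-like input
y = np.sign(u) * u**2                       # odd harmonics at 12, 36, 60 ... Hz

y_naive = y[::over]                         # direct base-rate evaluation
y_clean = signal.decimate(y, over, ftype="fir")  # filter, then downsample

for name, yy in [("naive", y_naive), ("filtered", y_clean)]:
    f, p = signal.periodogram(yy, fs=fs_base)
    band = (f > 39.0) & (f < 41.0)          # the 60 Hz harmonic aliases to 40 Hz
    print(f"{name}: aliased power near 40 Hz = {p[band].sum():.2e}")
```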

  15. Designing Real-Time Systems in Ada (Trademark).

    DTIC Science & Technology

    1986-01-01

    (Final report) Designing Real-Time Systems in Ada. Abstract: Real-time software differs from other kinds of software in the sense that it... [The remainder of this record is OCR residue from the report's table of contents, covering the role of Ada in real-time systems design, the scope of the report, and models of real-time systems, including requirements and methods for temporal behavior analysis.]

  16. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of executable lines of code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check the consistency of programs with respect to variables. Dynamic analyzers, in contrast, depend on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria (a minimal sketch of a run-time monitor follows below). The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input/output data.
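
    As a concrete illustration of the third category, the sketch below implements a toy run-time monitor: a decorator that checks a timing budget and a result invariant on every execution of the wrapped code. It is a generic illustration of the monitoring technique, not a tool surveyed in the paper.

```python
import functools
import time

# Toy "monitor"-style dynamic analyzer: checks a criterion (here a latency
# budget and a result invariant) every time the wrapped code runs.

def monitor(max_seconds, invariant):
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            elapsed = time.perf_counter() - start
            assert elapsed <= max_seconds, f"{fn.__name__} took {elapsed:.3f} s"
            assert invariant(result), f"{fn.__name__} violated its invariant"
            return result
        return inner
    return wrap

@monitor(max_seconds=0.1, invariant=lambda xs: xs == sorted(xs))
def sort_readings(readings):
    return sorted(readings)

print(sort_readings([3, 1, 2]))   # passes both run-time checks
```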

  17. Model Checking a Byzantine-Fault-Tolerant Self-Stabilizing Protocol for Distributed Clock Synchronization Systems

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2007-01-01

    This report presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV) [SMV]. The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space. Also, additional innovative state space reduction techniques are introduced that can be used in future verification efforts applied to this and other protocols.
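
    For readers unfamiliar with the underlying mechanics, the sketch below shows explicit-state model checking in its simplest form: breadth-first search over the reachable state space of a toy two-clock synchronization model, verifying a safety invariant in every state. SMV itself explores the state space symbolically with BDDs; this sketch illustrates the principle only, and the model, resync rule, and invariant are invented for illustration.

```python
from collections import deque

# Toy explicit-state model checker: breadth-first search over the reachable
# state space, checking a safety invariant in every state.

def check_invariant(initial_states, successors, invariant):
    seen, frontier = set(initial_states), deque(initial_states)
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return False, s                 # counterexample state found
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None                       # invariant holds everywhere

# Invented example: two clocks tick independently; when they drift apart by
# 2 ticks, the lagging clock snaps to the leader (a crude "resync" rule).
def successors(state):
    a, b = state
    for x, y in [(a + 1, b), (a, b + 1)]:   # either clock may tick next
        if abs(x - y) >= 2:
            x = y = max(x, y)               # resynchronize
        if x <= 8 and y <= 8:               # bound the space to keep it finite
            yield (x, y)

ok, bad = check_invariant([(0, 0)], successors, lambda s: abs(s[0] - s[1]) <= 1)
print("invariant holds" if ok else f"violated in state {bad}")
```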

  18. Real-time fault diagnosis for propulsion systems

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.; Guo, Ten-Huei; Delaat, John C.; Duyar, Ahmet

    1991-01-01

    Current research toward real time fault diagnosis for propulsion systems at NASA-Lewis is described. The research is being applied to both air breathing and rocket propulsion systems. Topics include fault detection methods including neural networks, system modeling, and real time implementations.

  19. Implementation of a new fuzzy vector control of induction motor.

    PubMed

    Rafa, Souad; Larabi, Abdelkader; Barazane, Linda; Manceur, Malik; Essounbouli, Najib; Hamzaoui, Abdelaziz

    2014-05-01

    The aim of this paper is to present a new approach to controlling an induction motor using type-1 fuzzy logic. The induction motor has a nonlinear, uncertain, and strongly coupled model. The vector control technique, which is based on the inverse model of the induction motor, solves the coupling problem. Unfortunately, in practice this decoupling does not hold because of model uncertainties. The presence of these uncertainties led us to draw on human expertise through fuzzy logic techniques. In order to maintain the decoupling and to overcome the sensitivity to parametric variations, the field-oriented control is replaced by a new control block. The simulation results show that both control schemes, in their basic configuration, provide comparable decoupling performance. However, the fuzzy vector control provides insensitivity to parametric variations, unlike the classical scheme. The fuzzy vector control scheme was successfully implemented in real time using a dSPACE 1104 digital signal processor board. The efficiency of this technique was also verified experimentally under different dynamic operating conditions such as sudden load changes, parameter variations, and speed changes. The fuzzy vector control is found to be well suited to induction motor control. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
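
    The paper's rule base and membership functions are not given in the abstract; the sketch below shows the generic type-1 fuzzy-control machinery it relies on: triangular membership functions over a speed error, Mamdani-style rules, and centroid defuzzification onto a torque command. All ranges, rules, and consequents are illustrative.

```python
# Minimal type-1 fuzzy controller sketch: triangular memberships on the
# speed error, three rules, centroid defuzzification. Illustrative only.

def tri(x, a, b, c):
    """Triangular membership with peak at b and support [a, c]."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

def fuzzy_torque(error):
    # Rule firing strengths: IF error is Negative / Zero / Positive ...
    w = {"N": tri(error, -2.0, -1.0, 0.0),
         "Z": tri(error, -1.0,  0.0, 1.0),
         "P": tri(error,  0.0,  1.0, 2.0)}
    # ... THEN torque command is Low / Medium / High (singleton consequents).
    out = {"N": -5.0, "Z": 0.0, "P": 5.0}
    num = sum(w[k] * out[k] for k in w)
    den = sum(w.values())
    return num / den if den else 0.0        # centroid defuzzification

for e in [-1.5, -0.3, 0.0, 0.7]:
    print(f"error={e:+.1f} -> torque command {fuzzy_torque(e):+.2f}")
```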

  20. A novel Online-to-Offline (O2O) model for pre-exposure prophylaxis and HIV testing scale up

    PubMed Central

    Anand, Tarandeep; Nitpolprasert, Chattiya; Trachunthong, Deondara; Kerr, Stephen J; Janyam, Surang; Linjongrat, Danai; Hightow-Weidman, Lisa B; Phanuphak, Praphan; Ananworanich, Jintanat; Phanuphak, Nittaya

    2017-01-01

    Introduction: PrEP awareness and uptake among men who have sex with men (MSM) and transgender women (TG) in Thailand remain low. Finding ways to increase HIV testing and PrEP uptake among high-risk groups is a critical priority. This study evaluates the effect of a novel Adam’s Love Online-to-Offline (O2O) model on PrEP and HIV testing uptake among Thai MSM and TG and identifies factors associated with PrEP uptake. Methods: The O2O model was piloted by the Adam’s Love (www.adamslove.org) HIV educational and counselling website. MSM and TG reached online by PrEP promotions and interested in free PrEP and/or HIV testing services contacted Adam’s Love online staff, received real-time PrEP eCounseling, and completed online bookings to receive services at one of four sites in Bangkok based on their preference. Auto-generated site- and service-specific e-tickets and Quick Response (QR) codes were sent to their mobile devices, enabling monitoring and check-in by offline site staff. Service uptake and participants' socio-demographic and risk behaviour characteristics were analyzed. Factors associated with PrEP uptake were assessed using multiple logistic regression. Results: Between January 10 and April 11, 2016, Adam’s Love reached 272,568 people online via the PrEP O2O promotions. A total of 425 MSM and TG received eCounseling and e-tickets. Of these, 325 (76.5%) checked in at clinics and received HIV testing; nine (2.8%) were diagnosed with HIV infection. The median (IQR) time between receiving the e-ticket and checking in was 3 (0–7) days. Of 316 HIV-negative MSM and TG, 168 (53.2%) started PrEP. In a multivariate model, higher education (OR 2.30, 95% CI 1.14–4.66; p = 0.02), seeking sex partners online (OR 2.05, 95% CI 1.19–3.54; p = 0.009), being aware of sexual partners' HIV status (OR 2.37, 95% CI 1.29–4.35; p = 0.008), previous use of post-exposure prophylaxis (PEP) (OR 2.46, 95% CI 1.19–5.09; p = 0.01), and enrolment at the Adam’s Love clinic compared with the other three sites (OR 3.79, 95% CI 2.06–6.95; p < 0.001) were independently associated with PrEP uptake. Conclusions: The Adam’s Love O2O model is highly effective in linking online at-risk MSM and TG to PrEP and HIV testing services, and has high potential to be replicated and scaled up in other settings with high Internet penetration among key populations. PMID:28362062

  1. The influence of social anxiety on the body checking behaviors of female college students.

    PubMed

    White, Emily K; Warren, Cortney S

    2014-09-01

    Social anxiety and eating pathology frequently co-occur. However, there is limited research examining the relationship between anxiety and body checking, aside from one study in which social physique anxiety partially mediated the relationship between body checking cognitions and body checking behavior (Haase, Mountford, & Waller, 2007). In an independent sample of 567 college women, we tested the fit of Haase and colleagues' foundational model but did not find evidence of mediation. Thus we tested the fit of an expanded path model that included eating pathology and clinical impairment. In the best-fitting path model (CFI=.991; RMSEA=.083) eating pathology and social physique anxiety positively predicted body checking, and body checking positively predicted clinical impairment. Therefore, women who endorse social physique anxiety may be more likely to engage in body checking behaviors and experience impaired psychosocial functioning. Published by Elsevier Ltd.

  2. HRT-UML: a design method for hard real-time systems based on the UML notation

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Massimo; Mazzini, Silvia; di Natale, Marco; Lipari, Giuseppe

    2002-07-01

    The Hard Real-Time Unified Modelling Language (HRT-UML) method aims at providing a comprehensive solution to the modeling of hard real-time systems. Experience shows that the design of hard real-time systems needs methodologies suited to modeling and analyzing aspects related to time, schedulability, and performance. In the European Aerospace community, a reference design method is Hierarchical Object Oriented Design (HOOD), and in particular its extension for the modeling of hard real-time systems, Hard Real-Time Hierarchical Object Oriented Design (HRT-HOOD), recommended by the European Space Agency (ESA) for the development of on-board systems. On the other hand, in recent years the Unified Modelling Language (UML) has gained very wide acceptance in a broad range of domains all over the world, becoming a de facto international standard, and tool vendors are very active in this potentially large market. In the Aerospace domain, the common opinion is that UML, as a general notation, is not suitable for hard real-time systems, even if its importance is recognized as a standard and as a technological trend for the near future. These considerations suggest the possibility of replacing the HRT-HOOD method with a customized version of UML that incorporates the advantages of both standards and complements their weak points. This approach has the clear advantage of making HRT-HOOD converge on a more powerful and expressive modeling notation. The paper identifies a mapping of the HRT-HOOD semantics into the UML one, and proposes a UML extension profile, which we call HRT-UML, based on the standard UML extension mechanisms, to fully represent HRT-HOOD design concepts. Finally, it discusses the relationships between our profile and the UML profile for schedulability, performance and time, adopted by the OMG in November 2001.

  3. Development of a model-based flood emergency management system in Yujiang River Basin, South China

    NASA Astrophysics Data System (ADS)

    Zeng, Yong; Cai, Yanpeng; Jia, Peng; Mao, Jiansu

    2014-06-01

    Flooding is the most frequent disaster in China. It affects people's lives and properties, causing considerable economic loss. Flood forecasting and the operation of reservoirs are important in flood emergency management. Although great progress has been achieved in flood forecasting and reservoir operation in China through the use of computers, network technology, and geographic information systems, the prediction accuracy of models is often unsatisfactory due to the unavailability of real-time monitoring data. Also, real-time flood control scenario analysis is not effective in many regions and can seldom provide an online decision support function. This paper introduces a decision support system for real-time flood forecasting in the Yujiang River Basin, South China (DSS-YRB), based on hydrological and hydraulic mathematical models. The conceptual framework and detailed components of the proposed DSS-YRB are illustrated; the system employs real-time rainfall data conversion, model-driven hydrologic forecasting, model calibration, data assimilation methods, and reservoir operational scenario analysis. Its multi-tiered architecture offers great flexibility, portability, reusability, and reliability. The case study results show that the development and application of a decision support system for real-time flood forecasting and operation is beneficial for flood control.

  4. REAL-TIME MODELING OF MOTOR VEHICLE EMISSIONS FOR ESTIMATING HUMAN EXPOSURES NEAR ROADWAYS

    EPA Science Inventory

    The United States Environmental Protection Agency's (EPA) National Exposure Research Laboratory is developing a real-time model of motor vehicle emissions to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop ...

  5. Real-time monitoring of a microbial electrolysis cell using an electrical equivalent circuit model.

    PubMed

    Hussain, S A; Perrier, M; Tartakovsky, B

    2018-04-01

    Efforts in developing microbial electrolysis cells (MECs) resulted in several novel approaches for wastewater treatment and bioelectrosynthesis. Practical implementation of these approaches necessitates the development of an adequate system for real-time (on-line) monitoring and diagnostics of MEC performance. This study describes a simple MEC equivalent electrical circuit (EEC) model and a parameter estimation procedure, which enable such real-time monitoring. The proposed approach involves MEC voltage and current measurements during its operation with periodic power supply connection/disconnection (on/off operation) followed by parameter estimation using either numerical or analytical solution of the model. The proposed monitoring approach is demonstrated using a membraneless MEC with flow-through porous electrodes. Laboratory tests showed that changes in the influent carbon source concentration and composition significantly affect MEC total internal resistance and capacitance estimated by the model. Fast response of these EEC model parameters to changes in operating conditions enables the development of a model-based approach for real-time monitoring and fault detection.
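
    The parameter estimation step can be sketched as follows: after the power supply is disconnected, the cell voltage relaxes roughly exponentially, and fitting that relaxation yields a polarization resistance (from the voltage swing at a known current) and a capacitance (from the time constant). This is a generic illustration under those assumptions, not the paper's EEC model or estimation procedure; all values are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of equivalent-circuit parameter estimation from an "off" transient:
# fit V(t) = V_inf + dV * exp(-t / tau), then derive R_p = dV / I and
# C = tau / R_p. Illustrative only; data are synthetic.

def relaxation(t, v_inf, dv, tau):
    return v_inf + dv * np.exp(-t / tau)

t = np.linspace(0, 200, 400)                 # seconds after disconnection
rng = np.random.default_rng(1)
v_meas = relaxation(t, 0.55, 0.30, 40.0) + rng.normal(0, 0.002, t.size)

(v_inf, dv, tau), _ = curve_fit(relaxation, t, v_meas, p0=(0.5, 0.2, 10.0))

i_load = 0.010                               # current just before "off" [A]
r_p = dv / i_load                            # polarization resistance [ohm]
cap = tau / r_p                              # capacitance [F]
print(f"R_p = {r_p:.1f} ohm, C = {cap:.3f} F, tau = {tau:.1f} s")
```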

  6. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  7. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  8. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  9. Real-time individualization of the unified model of performance.

    PubMed

    Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Balkin, Thomas J; Reifman, Jaques

    2017-12-01

    Existing mathematical models for predicting neurobehavioural performance are not suited for mobile computing platforms because they cannot adapt model parameters automatically in real time to reflect individual differences in the effects of sleep loss. We used an extended Kalman filter to develop a computationally efficient algorithm that continually adapts the parameters of the recently developed Unified Model of Performance (UMP) to an individual. The algorithm accomplishes this in real time as new performance data for the individual become available. We assessed the algorithm's performance by simulating real-time model individualization for 18 subjects subjected to 64 h of total sleep deprivation (TSD) and 7 days of chronic sleep restriction (CSR) with 3 h of time in bed per night, using psychomotor vigilance task (PVT) data collected every 2 h during wakefulness. This UMP individualization process produced parameter estimates that progressively approached the solution produced by a post-hoc fitting of model parameters using all data. The minimum number of PVT measurements needed to individualize the model parameters depended upon the type of sleep-loss challenge, with ~30 required for TSD and ~70 for CSR. However, model individualization depended upon the overall duration of data collection, yielding increasingly accurate model parameters with greater number of days. Interestingly, reducing the PVT sampling frequency by a factor of two did not notably hamper model individualization. The proposed algorithm facilitates real-time learning of an individual's trait-like responses to sleep loss and enables the development of individualized performance prediction models for use in a mobile computing platform. © 2017 European Sleep Research Society.
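
    The recursive idea can be shown in miniature: treat one model parameter as a hidden state that is locally constant, and apply a Kalman-style predict/update step as each new performance measurement arrives. The sketch below uses a deliberately toy performance model; it is not the published UMP or its extended Kalman filter, and all values are invented.

```python
import numpy as np

# Minimal sketch of recursive (Kalman-style) model individualization: the
# parameter estimate and its variance are updated with each new measurement.

def predict_performance(theta, hours_awake):
    return 200.0 + theta * hours_awake       # toy model: RT grows with wakefulness

theta_hat, p = 2.0, 25.0                     # initial guess and its variance
q, r = 0.01, 100.0                           # process and measurement noise

rng = np.random.default_rng(2)
true_theta = 4.5
for hours in range(2, 64, 2):                # a PVT-like test every 2 h awake
    z = predict_performance(true_theta, hours) + rng.normal(0, 10.0)
    p += q                                   # predict: parameter may drift
    h = hours                                # Jacobian d(prediction)/d(theta)
    k = p * h / (h * p * h + r)              # Kalman gain
    theta_hat += k * (z - predict_performance(theta_hat, hours))
    p *= (1 - k * h)                         # shrink uncertainty
print(f"estimated theta = {theta_hat:.2f} (true {true_theta})")
```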

  10. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  11. Determination of MLC model parameters for Monaco using commercial diode arrays.

    PubMed

    Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian

    2016-07-08

    Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in the Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters was delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed that it was unclear how to change the MLC parameters to gain agreement; this ambiguity arose from insufficient sampling of the test-field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements, with Gafchromic EBT3 film used to verify the accuracy of the MapCHECK 2 measured dose distributions. Adjustment of the MLC parameters from their default values improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose: the lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model with both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies found with ArcCHECK in some of the test-field dose distributions. Comparison between the optimized model-calculated dose and the ArcCHECK-measured dose yielded global gamma pass rates ranging from 70.0% to 97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates; the lower pass rates were attributed to the ArcCHECK overestimating the in-field dose for rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK. Considering the ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
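
    The global gamma index used throughout this abstract combines a dose-difference criterion with a distance-to-agreement criterion; a point passes if the minimum combined metric is at most 1. The sketch below computes it in 1D with 3%/2 mm criteria on synthetic profiles; it illustrates the metric only, not any commercial implementation.

```python
import numpy as np

# Minimal 1D global gamma-index sketch (3% dose difference, 2 mm DTA):
# for each measured point, take the minimum combined metric over all
# calculated points; a point passes if gamma <= 1.

def gamma_1d(x, d_meas, d_calc, dose_crit=0.03, dta_mm=2.0):
    d_norm = dose_crit * d_calc.max()          # "global" dose normalization
    gammas = np.empty_like(d_meas)
    for i, (xi, dm) in enumerate(zip(x, d_meas)):
        term = ((x - xi) / dta_mm) ** 2 + ((d_calc - dm) / d_norm) ** 2
        gammas[i] = np.sqrt(term.min())
    return gammas

x = np.arange(0, 100, 1.0)                     # positions [mm]
d_calc = np.exp(-((x - 50) / 20.0) ** 2)       # calculated dose profile
d_meas = 1.02 * np.exp(-((x - 51) / 20.0) ** 2)  # shifted, scaled "measurement"

g = gamma_1d(x, d_meas, d_calc)
print(f"gamma pass rate: {100.0 * (g <= 1.0).mean():.1f}%")
```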

  12. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset are called Propel. Propel builds upon the Java PathFinder (JPF) tool, an explicit-state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V), an adaptation of existing best design practices that has the desired side effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, to productize JPF, and to evaluate the toolset in the context of D4V. Throughout these tasks we are testing Propel's capabilities on customer applications.

  13. Two alternative multiplex PCRs for the identification of the seven species of anglerfish (Lophius spp.) using an end-point or a melting curve analysis real-time protocol.

    PubMed

    Castigliego, Lorenzo; Armani, Andrea; Tinacci, Lara; Gianfaldoni, Daniela; Guidi, Alessandra

    2015-01-01

    Anglerfish (Lophius spp.) is consumed worldwide and is an important economic resource, although its seven species are often fraudulently interchanged due to their different commercial values, especially when sold in the form of fillets or pieces. Molecular analysis is the only possible means to verify traceability and counteract fraud. We developed two multiplex PCRs, one end-point and one real-time with post-amplification melting curve analysis, which can be run even on the simplest two-channel thermocyclers. The two methods were tested on seventy-five reference samples. Their specificity was checked against twenty additional species among those most commonly available on the market and against other species of the Lophiidae family. Both methods, the choice between which depends on the equipment and budget of the lab, provide a rapid and easy-to-read response, improving both the simplicity and cost-effectiveness of existing methods for identifying Lophius species. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Ionizing Radiation Measurement Solution in a Hospital Environment

    PubMed Central

    Garcia-Sanchez, Antonio-Javier; Garcia Angosto, Enrique Angel; Moreno Riquelme, Pedro Antonio; Serna Berna, Alfredo; Ramos-Amores, David

    2018-01-01

    Ionizing radiation is one of the main risks affecting healthcare workers and patients worldwide. Special attention has to be paid to medical staff in the vicinity of radiological equipment or patients undergoing radioisotope procedures. To measure radiation values, traditional area meters are strategically placed in hospitals and personal dosimeters are worn by workers. However, these systems have important inherent drawbacks in terms of cost, detection precision, real-time data processing, flexibility, and so on, which are carefully detailed here. To overcome these shortcomings, a low-cost, open-source, portable radiation measurement system is proposed. The goal is to deploy devices integrating a commercial Geiger-Muller (GM) detector that capture radiation doses in real time and wirelessly dispatch them to a remote database where the radiation values are stored. Medical staff can then check the accumulated doses first hand, as well as other radiation statistics, by means of a smartphone application. Finally, the device was certified by an accredited calibration center, and the entire system was subsequently validated in a hospital environment. PMID:29419769

  15. Reliability of sensor-based real-time workflow recognition in laparoscopic cholecystectomy.

    PubMed

    Kranzfelder, Michael; Schneider, Armin; Fiolka, Adam; Koller, Sebastian; Reiser, Silvano; Vogel, Thomas; Wilhelm, Dirk; Feussner, Hubertus

    2014-11-01

    Laparoscopic cholecystectomy is a very common minimally invasive surgical procedure that may be improved by autonomous or cooperative assistance support systems. Model-based surgery with a precise definition of distinct procedural tasks (PTs) of the operation was implemented and tested to depict and analyze the process of this procedure. The reliability of real-time workflow recognition in laparoscopic cholecystectomy ([Formula: see text] cases) was evaluated by continuous sensor-based data acquisition. Ten PTs were defined, including begin/end of preparation of Calot's triangle, clipping/cutting of the cystic artery and duct, begin/end of gallbladder dissection, begin/end of hemostasis, gallbladder removal, and end of operation. Data acquisition comprised continuous instrument detection, room/table light status, intra-abdominal pressure, table tilt, irrigation/aspiration volume, and coagulation/cutting current application. Two independent observers recorded the start and end point of each step by analysis of the sensor data. The data were cross-checked against laparoscopic video recordings serving as the gold standard for PT identification. Bland-Altman analysis revealed that for 95% of cases the difference between annotation results lay within limits of agreement ranging from -309 s (PT 7) to +368 s (PT 5). Laparoscopic video and sensor data matched to a greater or lesser extent within the different procedural tasks; in the majority of cases, the observer results exceeded those obtained from the laparoscopic video, and empirical knowledge was required to detect phase transitions. A set of sensors used to monitor laparoscopic cholecystectomy procedures was sufficient to enable expert observers to reliably identify each PT. In the future, computer systems may automate the task identification process, provided a more robust data inflow is available.
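
    Bland-Altman analysis, as used here, reports the mean difference (bias) between two measurement methods and 95% limits of agreement of roughly bias ± 1.96 standard deviations of the differences. A minimal sketch on synthetic annotation timestamps follows; the data and numbers are invented, not the study's.

```python
import numpy as np

# Minimal Bland-Altman sketch: bias and 95% limits of agreement between two
# annotation methods (e.g., sensor-derived observer timestamps vs. the
# laparoscopic-video gold standard). Synthetic data for illustration.

rng = np.random.default_rng(7)
video = rng.uniform(0, 3600, 50)             # task start times from video [s]
sensor = video + rng.normal(20, 150, 50)     # observer annotations from sensors

diff = sensor - video
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                # half-width of the 95% limits
print(f"bias {bias:+.0f} s, limits of agreement "
      f"[{bias - loa:+.0f}, {bias + loa:+.0f}] s")
```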

  16. Comparative study of predicted and experimentally detected interplanetary shocks

    NASA Astrophysics Data System (ADS)

    Kartalev, M. D.; Grigorov, K. G.; Smith, Z.; Dryer, M.; Fry, C. D.; Sun, Wei; Deehr, C. S.

    2002-03-01

    We compare the real-time space weather predictions of shock arrival times at 1 AU made by the USAF/NOAA Shock Time of Arrival (STOA) and Interplanetary Shock Propagation Model (ISPM) models, and by the Exploration Physics International/University of Alaska Hakamada-Akasofu-Fry Solar Wind Model (HAF-v2), with a real-time analysis of plasma and field ACE data. The comparison is made using an algorithm developed on the basis of wavelet data analysis and an MHD identification procedure. The shock parameters are estimated for selected "candidate events". An automated Web-based interface periodically ingests solar wind observations made by ACE at L1. Near-real-time results, as well as an archive of registered events of interest, are available on a dedicated web site. A number of events are considered. These studies are essential for the validation of real-time space weather forecasts made from solar data.

  17. Modeling criterion shifts and target checking in prospective memory monitoring.

    PubMed

    Horn, Sebastian S; Bayen, Ute J

    2015-01-01

    Event-based prospective memory (PM) involves remembering to perform intended actions after a delay. An important theoretical issue is whether and how people monitor the environment to execute an intended action when a target event occurs. Performing a PM task often increases the latencies in ongoing tasks. However, little is known about the reasons for this cost effect. This study uses diffusion model analysis to decompose monitoring processes in the PM paradigm. Across 4 experiments, performing a PM task increased latencies in an ongoing lexical decision task. A large portion of this effect was explained by consistent increases in boundary separation; additional increases in nondecision time emerged in a nonfocal PM task and explained variance in PM performance (Experiment 1), likely reflecting a target-checking strategy before and after the ongoing decision (Experiment 2). However, we found that possible target-checking strategies may depend on task characteristics. That is, instructional emphasis on the importance of ongoing decisions (Experiment 3) or the use of focal targets (Experiment 4) eliminated the contribution of nondecision time to the cost of PM, but left participants in a mode of increased cautiousness. The modeling thus sheds new light on the cost effect seen in many PM studies and suggests that people approach ongoing activities more cautiously when they need to remember an intended action. PsycINFO Database Record (c) 2015 APA, all rights reserved.
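
    In the diffusion model invoked here, noisy evidence accumulates at a drift rate until it crosses one of two boundaries separated by a, and the response time adds a nondecision component t0. The sketch below simulates this and shows how widening the boundary separation (greater cautiousness) lengthens response times, the mechanism proposed for the PM cost. All parameter values are illustrative, not fitted estimates from the study.

```python
import numpy as np

# Minimal diffusion-model simulation: evidence starts midway between 0 and
# boundary a, accumulates with drift v and Gaussian noise, and RT is the
# first-passage time plus nondecision time t0.

def simulate_rt(v, a, t0, n=2000, dt=1e-3, sigma=1.0, seed=3):
    rng = np.random.default_rng(seed)
    rts = []
    for _ in range(n):
        x, t = a / 2.0, 0.0
        while 0.0 < x < a:
            x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t + t0)
    return np.array(rts)

base = simulate_rt(v=1.5, a=1.0, t0=0.3)
wide = simulate_rt(v=1.5, a=1.4, t0=0.3)     # increased boundary separation
print(f"mean RT baseline: {base.mean():.3f} s, "
      f"with wider boundaries: {wide.mean():.3f} s")
```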

  18. Accounting for large deformations in real-time simulations of soft tissues based on reduced-order models.

    PubMed

    Niroomandi, S; Alfaro, I; Cueto, E; Chinesta, F

    2012-01-01

    Model reduction techniques have been shown to constitute a valuable tool for real-time simulation in surgical environments and other fields. However, some limitations imposed by real-time constraints have not yet been overcome. One such limitation is the severe time constraint (update rates on the order of 500 Hz) that precludes the use of Newton-like schemes for solving the non-linear models usually employed to describe biological tissues. In this work we present a technique able to deal with geometrically non-linear models, based on the use of model reduction techniques together with an efficient non-linear solver. Examples of the performance of the technique are given. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
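
    Projection-based model reduction of the kind described typically works as follows: collect snapshots of full-order solutions offline, extract a small orthonormal basis (e.g., by proper orthogonal decomposition via the SVD), and solve only the projected low-dimensional system online. The linear sketch below illustrates the projection step; the paper's contribution, handling geometric nonlinearity with an efficient solver, is not reproduced here.

```python
import numpy as np

# Minimal POD-style reduction sketch: build a reduced basis Phi from
# snapshots, then solve (Phi^T K Phi) q = Phi^T f instead of K u = f.

n = 200
K = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))          # toy stiffness matrix

# Offline: snapshots of full-order solutions under varied loads.
rng = np.random.default_rng(4)
snapshots = np.column_stack(
    [np.linalg.solve(K, rng.normal(size=n)) for _ in range(30)])
phi, _, _ = np.linalg.svd(snapshots, full_matrices=False)
phi = phi[:, :10]                            # keep 10 dominant modes

# Online: a new load solved in the 10-dimensional reduced space.
f = rng.normal(size=n)
q = np.linalg.solve(phi.T @ K @ phi, phi.T @ f)
u_rom = phi @ q
u_full = np.linalg.solve(K, f)
err = np.linalg.norm(u_rom - u_full) / np.linalg.norm(u_full)
print(f"relative error with 10 of {n} dofs: {err:.2e}")
```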

  19. Real-time PCR machine system modeling and a systematic approach for the robust design of a real-time PCR-on-a-chip system.

    PubMed

    Lee, Da-Sheng

    2010-01-01

    Chip-based DNA quantification systems are widespread and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, the proposed real-time PCR-on-a-chip system was shown in simulation to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design.

  20. Functional Fault Modeling Conventions and Practices for Real-Time Fault Isolation

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Lewis, Mark; Perotti, Jose; Oostdyk, Rebecca; Brown, Barbara

    2010-01-01

    The purpose of this paper is to present the conventions, best practices, and processes that were established during the prototype development of a Functional Fault Model (FFM) for a cryogenic system, to be used for real-time fault isolation in a Fault Detection, Isolation, and Recovery (FDIR) system. The FDIR system is envisioned to perform health management functions for both a launch vehicle and the ground systems that support the vehicle during checkout and launch countdown, using a suite of complementary software tools that alert operators to anomalies and failures in real time. The FFMs were created offline but would eventually be used by a real-time reasoner to isolate faults in a cryogenic system. Through their development and review, a set of modeling conventions and best practices was established, and the prototype FFM development served as a pathfinder for future FFM development processes. This paper documents the rationale and considerations for building robust FFMs that can easily be transitioned to a real-time operating environment.

  1. Application of technology developed for flight simulation at NASA. Langley Research Center

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1991-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations including mathematical model computation and data input/output to the simulators must be deterministic and be completed in as short a time as possible. Personnel at NASA's Langley Research Center are currently developing the use of supercomputers for simulation mathematical model computation for real-time simulation. This, coupled with the use of an open systems software architecture, will advance the state-of-the-art in real-time flight simulation.

  2. An easy game for frauds? Effects of professional experience and time pressure on passport-matching performance.

    PubMed

    Wirth, Benedikt Emanuel; Carbon, Claus-Christian

    2017-06-01

    Despite extensive research on unfamiliar face matching, little is known about factors that might affect matching performance in real-life scenarios. We conducted 2 experiments to investigate the effects of several such factors on unfamiliar face-matching performance in a passport-check scenario. In Experiment 1, we assessed the effect of professional experience on passport-matching performance. The matching performance of 96 German Federal Police officers working at Munich Airport was compared with that of 48 novices without specific face-matching experience. Police officers significantly outperformed novices, but nevertheless missed a high ratio of frauds. Moreover, the effects of manipulating specific facial features (with paraphernalia like glasses and jewelry, distinctive features like moles and scars, and hairstyle) and of variations in the physical distance between the faces being matched were investigated. Whereas manipulation of physical distance did not have a significant effect, manipulations of facial features impaired matching performance. In Experiment 2, passport-matching performance was assessed in relation to time constraints. Novices matched passports either without time constraints, or under a local time limit (which is typically used in laboratory studies), or under a global time limit (which usually occurs during real-life border controls). Time pressure (especially the global time limit) significantly impaired matching performance. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Current Use of Underage Alcohol Compliance Checks by Enforcement Agencies in the U.S.

    PubMed Central

    Erickson, Darin J.; Lenk, Kathleen M.; Sanem, Julia R.; Nelson, Toben F.; Jones-Webb, Rhonda; Toomey, Traci L.

    2014-01-01

    Background: Compliance checks conducted by law enforcement agents can significantly reduce the likelihood of illegal alcohol sales to underage individuals, but these checks need to be conducted using optimal methods to maintain effectiveness. Materials and Methods: We conducted a national survey of local and state enforcement agencies in 2010–2011 to assess: (1) how many agencies are currently conducting underage alcohol compliance checks, (2) how many agencies that conduct compliance checks use optimal methods (including checking all establishments in the jurisdiction, conducting checks at least 3–4 times per year, conducting follow-up checks within 3 months, and penalizing the licensee rather than only the server/clerk for failing a compliance check), and (3) characteristics of the agencies that conduct compliance checks. Results: Just over one third of local law enforcement agencies and over two thirds of state agencies reported conducting compliance checks. However, only a small percentage of agencies (4–6%) reported using all of the optimal methods to maximize the effectiveness of these compliance checks. Local law enforcement agencies with an alcohol-related division, those with at least one full-time officer assigned to work on alcohol, and those in larger communities were significantly more likely to conduct compliance checks. State agencies with more full-time agents, and those located in states where the state agency or both state and local enforcement agencies have primary responsibility (vs. only the local agency) for enforcing alcohol retail laws, were also more likely to conduct compliance checks; however, these agency characteristics did not remain statistically significant in the multivariate analyses. Conclusions: Continued effort is needed to increase the number of local and state agencies conducting compliance checks using optimal methods to reduce youth access to alcohol. PMID:24716443

  4. Current use of underage alcohol compliance checks by enforcement agencies in the United States.

    PubMed

    Erickson, Darin J; Lenk, Kathleen M; Sanem, Julia R; Nelson, Toben F; Jones-Webb, Rhonda; Toomey, Traci L

    2014-06-01

    Compliance checks conducted by law enforcement agents can significantly reduce the likelihood of illegal alcohol sales to underage individuals, but these checks need to be conducted using optimal methods to maintain effectiveness. We conducted a national survey of local and state enforcement agencies from 2010 to 2011 to assess: (i) how many agencies are currently conducting underage alcohol compliance checks, (ii) how many agencies that conduct compliance checks use optimal methods-including checking all establishments in the jurisdiction, conducting checks at least 3 to 4 times per year, conducting follow-up checks within 3 months, and penalizing the licensee (not only the server/clerk) for failing a compliance check, and (iii) characteristics of the agencies that conduct compliance checks. Just over one-third of local law enforcement agencies and over two-thirds of state agencies reported conducting compliance checks. However, only a small percentage of the agencies (4 to 6%) reported using all of the optimal methods to maximize effectiveness of these compliance checks. Local law enforcement agencies with an alcohol-related division, those with at least 1 full-time officer assigned to work on alcohol, and those in larger communities were significantly more likely to conduct compliance checks. State agencies with more full-time agents and those located in states where the state agency or both state and local enforcement agencies have primary responsibility (vs. only the local law agency) for enforcing alcohol retail laws were also more likely to conduct compliance checks; however, these agency characteristics did not remain statistically significant in the multivariate analyses. Continued effort is needed to increase the number of local and state agencies conducting compliance checks using optimal methods to reduce youth access to alcohol. Copyright © 2014 by the Research Society on Alcoholism.

  5. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high-reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though implicit benefits for partial order reduction are possible as a consequence of the API's strict interprocess communication policy.

  6. Fatigue design procedure for the American SST prototype

    NASA Technical Reports Server (NTRS)

    Doty, R. J.

    1972-01-01

    For supersonic airline operations, significantly higher environmental temperature is the primary new factor affecting structural service life. Methods for incorporating the influence of temperature in detailed fatigue analyses are shown along with current test indications. Thermal effects investigated include real-time compared with short-time testing, long-time temperature exposure, and stress-temperature cycle phasing. A method is presented that allows designers and stress analysts to check the fatigue resistance of structural design details. A communicative rating system is presented that defines the relative fatigue quality of a detail, so that the analyst can determine the cyclic-load capability of the design detail by entering constant-life charts for varying detail quality. If necessary, this system then allows the designer to determine ways to improve the fatigue quality for better life, or to determine the operating stresses that will provide the required service life.

  7. Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.

    PubMed

    Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy

    2015-12-30

    While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphics processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real time. The rtfMRIp system performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors, including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT) regressors. Whole-brain data analysis with more than 100,000 voxels and more than 250 volumes is completed in less than 300 ms, much faster than the time required to acquire an fMRI volume. A real-time processing implementation cannot be identical to off-line analysis when time-course information is used, as in slice-timing correction, signal scaling, and GLM analysis. We verified that the reduced slice-timing correction used for real-time analysis produced output comparable to off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time, and it is capable of processing these steps on all available data at a given time, without the need for recursive algorithms. Comprehensive data processing in rtfMRI is thus possible with a PC, although the number of samples should be considered in real-time GLM analysis. Copyright © 2015 Elsevier B.V. All rights reserved.
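
    One way to make GLM estimation cheap enough for real-time use is to accumulate the normal equations as each volume arrives, so the fit can be re-solved at every repetition time without storing or refitting the full series. The sketch below shows that pattern for a single voxel with a toy design; it is an illustration of the general approach, not the paper's GPU pipeline, and the guard on early volumes echoes the reported small-sample over-fitting.

```python
import numpy as np

# Minimal sketch of incremental GLM estimation: accumulate X^T X and X^T y
# volume by volume, then solve for beta at each step in O(p^3) for tiny p.

p = 3                                        # task regressor + drift + intercept
xtx = np.zeros((p, p))
xty = np.zeros(p)                            # single-voxel accumulator

rng = np.random.default_rng(5)
true_beta = np.array([2.0, 0.01, 100.0])
beta = np.zeros(p)
for t in range(300):                         # volumes arriving one at a time
    task = 1.0 if (t // 20) % 2 else 0.0     # toy block design
    x = np.array([task, float(t), 1.0])      # regressor row for this volume
    y = x @ true_beta + rng.normal(0, 1.0)   # this voxel's new sample
    xtx += np.outer(x, x)                    # O(p^2) running update
    xty += x * y
    if t >= 30:                              # avoid unstable early fits
        beta = np.linalg.solve(xtx, xty)
print("estimated betas:", np.round(beta, 3))
```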

  8. Real-Time Onboard Global Nonlinear Aerodynamic Modeling from Flight Data

    NASA Technical Reports Server (NTRS)

    Brandon, Jay M.; Morelli, Eugene A.

    2014-01-01

    Flight test and modeling techniques were developed to accurately identify global nonlinear aerodynamic models onboard an aircraft. The techniques were developed and demonstrated during piloted flight testing of an Aermacchi MB-326M Impala jet aircraft. Advanced piloting techniques and nonlinear modeling techniques based on fuzzy logic and multivariate orthogonal function methods were implemented with efficient onboard calculations and flight operations to achieve real-time maneuver monitoring and analysis, and near-real-time global nonlinear aerodynamic modeling and prediction validation testing in flight. Results demonstrated that global nonlinear aerodynamic models for a large portion of the flight envelope were identified rapidly and accurately using piloted flight test maneuvers during a single flight, with the final identified and validated models available before the aircraft landed.
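
    Onboard, real-time identification generally relies on recursive estimators that refine coefficients with each new data frame instead of batch refitting. The sketch below uses recursive least squares with a forgetting factor on a toy lift-coefficient model; it illustrates the recursive principle only, not the paper's fuzzy-logic or multivariate orthogonal function methods, and all regressors and values are invented.

```python
import numpy as np

# Minimal recursive least-squares (RLS) sketch for online model
# identification: one coefficient update per incoming flight-data frame.

def rls_update(theta, P, x, y, lam=0.995):
    """One RLS step with forgetting factor lam; x is the regressor vector."""
    k = P @ x / (lam + x @ P @ x)            # gain vector
    theta = theta + k * (y - x @ theta)      # coefficient update
    P = (P - np.outer(k, x @ P)) / lam       # covariance update
    return theta, P

rng = np.random.default_rng(6)
theta = np.zeros(3)                          # [CL0, CL_alpha, CL_q] estimates
P = np.eye(3) * 100.0
true = np.array([0.2, 5.0, 8.0])
for _ in range(500):                         # one step per data frame
    alpha, q = rng.uniform(-0.2, 0.2), rng.uniform(-0.1, 0.1)
    x = np.array([1.0, alpha, q])
    y = x @ true + rng.normal(0, 0.02)       # "measured" lift coefficient
    theta, P = rls_update(theta, P, x, y)
print("identified coefficients:", np.round(theta, 3))
```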

  9. Near real-time imaging of molasses injections using time-lapse electrical geophysics at the Brandywine DRMO, Brandywine, Maryland

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Johnson, T.; Major, B.; Day-Lewis, F. D.; Lane, J. W.

    2010-12-01

    Enhanced bioremediation, which involves introduction of amendments to promote biodegradation, increasingly is used to accelerate cleanup of recalcitrant compounds and has been identified as the preferred remedial treatment at many contaminated sites. Although blind introduction of amendments can lead to sub-optimal or ineffective remediation, the distribution of amendment throughout the treatment zone is difficult to measure using conventional sampling. Because amendments and their degradation products commonly have electrical properties that differ from those of ambient soil, time-lapse electrical geophysical monitoring has the potential to verify amendment emplacement and distribution. In order for geophysical monitoring to be useful, however, results of the injection ideally should be accessible in near real time. In August 2010, we demonstrated the feasibility of near real-time, autonomous electrical geophysical monitoring of amendment injections at the former Defense Reutilization and Marketing Office (DRMO) in Brandywine, Maryland. Two injections of about 1000 gallons each of molasses, a widely used amendment for enhanced bioremediation, were monitored using measurements taken with borehole and surface electrodes. During the injections, multi-channel resistance data were recorded; data were transmitted to a server and processed using a parallel resistivity inversion code; and results in the form of time-lapse imagery subsequently were posted to a website. This process occurred automatically without human intervention. The resulting time-lapse imagery clearly showed the evolution of the molasses plume. The delay between measurements and online delivery of images was between 45 and 60 minutes, thus providing actionable information that could support decisions about field procedures and a check on whether amendment reached target zones. This experiment demonstrates the feasibility of using electrical imaging as a monitoring tool both during amendment emplacement and post-injection to track amendment distribution, geochemical breakdown, and other remedial effects.

  10. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing like technologies and static analysis.

  11. Socioeconomic differences in health check-ups and medically certified sickness absence: a 10-year follow-up among middle-aged municipal employees in Finland.

    PubMed

    Piha, Kustaa; Sumanen, Hilla; Lahelma, Eero; Rahkonen, Ossi

    2017-04-01

    There is contradictory evidence on the association between health check-ups and future morbidity. Among the general population, those with high socioeconomic position participate more often in health check-ups. The main aims of this study were to analyse whether attendance at health check-ups is socioeconomically patterned and affects sickness absence over a 10-year follow-up. This register-based follow-up study included municipal employees of the City of Helsinki. 13 037 employees were invited to an age-based health check-up during 2000-2002, with a 62% attendance rate. Education, occupational class and individual income were used to measure socioeconomic position. Medically certified sickness absence of 4 days or more was measured and controlled for at the baseline and used as an outcome over follow-up. The mean follow-up time was 7.5 years. Poisson regression was used. Men and employees with lower socioeconomic position participated more actively in health check-ups. Among women, non-attendance at a health check-up predicted higher sickness absence during follow-up (relative risk = 1.26, 95% CI 1.17 to 1.37) in the fully adjusted model. Health check-ups were not effective in reducing socioeconomic differences in sickness absence. Age-based health check-ups reduced subsequent sickness absence and should be promoted. Attendance at health check-ups should be as high as possible. Contextual factors need to be taken into account when applying the results in interventions in other settings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
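
    As a rough illustration of the study's statistical approach, the sketch below fits a Poisson regression of sickness-absence counts on check-up attendance, with log person-years as an offset so that exponentiated coefficients read as relative risks. The column names and toy data are hypothetical, not the Helsinki register variables.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical follow-up data: absence spells, check-up attendance (0/1),
    # age, and person-years of follow-up for each employee.
    df = pd.DataFrame({
        "absence_spells":  [3, 0, 5, 2, 1, 7],
        "attended":        [1, 1, 0, 1, 0, 0],
        "age":             [44, 51, 47, 55, 49, 53],
        "follow_up_years": [7.5, 7.5, 6.0, 7.5, 5.0, 7.0],
    })

    # The log person-time offset turns modeled counts into rates.
    model = smf.glm(
        "absence_spells ~ attended + age",
        data=df,
        family=sm.families.Poisson(),
        offset=np.log(df["follow_up_years"]),
    ).fit()

    print(np.exp(model.params))      # rate ratios (relative risks)
    print(np.exp(model.conf_int()))  # 95% confidence intervals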

  12. Space Weather Forecasting at NOAA with Michigan's Geospace Model: Results from the First Year in Real-Time Operations

    NASA Astrophysics Data System (ADS)

    Cash, M. D.; Singer, H. J.; Millward, G. H.; Balch, C. C.; Toth, G.; Welling, D. T.

    2017-12-01

    In October 2016, the first version of the Geospace model was transitioned into real-time operations at the NOAA Space Weather Prediction Center (SWPC). The Geospace model is part of the Space Weather Modeling Framework (SWMF) developed at the University of Michigan, and the model simulates the full time-dependent 3D geospace environment (Earth's magnetosphere, ring current and ionosphere) and predicts global space weather parameters such as induced magnetic perturbations in space and on Earth's surface. The current version of the Geospace model uses three coupled components of SWMF: the BATS-R-US global magnetosphere model, the Rice Convection Model (RCM) of the inner magnetosphere, and the Ridley Ionosphere electrodynamics Model (RIM). In the operational mode, SWMF/Geospace runs continually in real time as long as new solar wind data arrive from a satellite at L1, either DSCOVR or ACE. We present an analysis of the overall performance of the Geospace model during the first year of real-time operations. Evaluation metrics include Kp and Dst, as well as comparisons with regional magnetometer stations. We will also present initial results from new products, such as the AE index, available with the recent upgrade to the Geospace model.

  13. Continuous piecewise-linear, reduced-order electrochemical model for lithium-ion batteries in real-time applications

    NASA Astrophysics Data System (ADS)

    Farag, Mohammed; Fleckenstein, Matthias; Habibi, Saeid

    2017-02-01

    Model-order reduction and minimization of the CPU run-time while maintaining the model accuracy are critical requirements for real-time implementation of lithium-ion electrochemical battery models. In this paper, an isothermal, continuous, piecewise-linear, electrode-average model is developed by using an optimal knot placement technique. The proposed model reduces the univariate nonlinear function of the electrode's open circuit potential dependence on the state of charge to continuous piecewise regions. The parameterization experiments were chosen to provide a trade-off between extensive experimental characterization techniques and purely identifying all parameters using optimization techniques. The model is then parameterized in each continuous, piecewise-linear, region. Applying the proposed technique cuts down the CPU run-time by around 20%, compared to the reduced-order, electrode-average model. Finally, the model validation against real-time driving profiles (FTP-72, WLTP) demonstrates the ability of the model to predict the cell voltage accurately with less than 2% error.
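
    The heart of the reduction is replacing the nonlinear open-circuit-potential curve with linear segments between well-placed knots. The sketch below illustrates that idea with plain linear interpolation; the knot locations and voltages are placeholders, not the paper's optimized knots.

    import numpy as np

    # Illustrative knots: SOC breakpoints and the OCV (volts) at each knot.
    soc_knots = np.array([0.0, 0.1, 0.3, 0.6, 0.9, 1.0])
    ocv_knots = np.array([3.00, 3.40, 3.60, 3.80, 4.05, 4.20])

    def ocv(soc):
        """Continuous piecewise-linear OCV(SOC): np.interp evaluates the
        linear segment between the two knots bracketing each query."""
        return np.interp(soc, soc_knots, ocv_knots)

    print(ocv(0.45))                      # single state-of-charge query
    print(ocv(np.linspace(0.0, 1.0, 5)))  # vectorized over a SOC sweep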

  14. UTP and Temporal Logic Model Checking

    NASA Astrophysics Data System (ADS)

    Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo

    In this paper we give an additional perspective to the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state explosion problem through the use of efficient data structures.

  15. Research on registration algorithm for check seal verification

    NASA Astrophysics Data System (ADS)

    Wang, Shuang; Liu, Tiegen

    2008-03-01

    Nowadays seals play an important role in China. With the development of the social economy, the traditional method of manual check seal identification can no longer meet the needs of banking transactions. This paper focuses on pre-processing and registration algorithms for check seal verification using the theory of image processing and pattern recognition. First of all, the complex characteristics of check seals are analyzed. To eliminate the differences in producing conditions and the disturbance caused by background and writing in the check image, several methods are used in the pre-processing of check seal verification, such as color component transformation, linear transformation to a gray-scale image, median filtering, Otsu thresholding, and the closing and labeling operations of mathematical morphology. After these steps, a clean binary seal image is obtained. On the basis of the traditional registration algorithm, a double-level registration method comprising rough and precise stages is proposed. The precise registration stage resolves the deflection angle to 0.1°. This paper introduces the concepts of inside difference and outside difference and uses their percentages to judge whether a seal is real or fake. The experimental results on a large set of check seals are satisfactory. They show that the methods and algorithms presented are robust to noisy sealing conditions and tolerant of within-class differences.
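
    The sketch below illustrates the coarse-to-fine rotational search behind such a double-level registration: a rough pass in 1° steps followed by a precise pass in 0.1° steps around the best rough angle, scoring each candidate rotation by binary overlap. The overlap score and step sizes are assumptions for illustration, not the paper's exact algorithm.

    import numpy as np
    from scipy.ndimage import rotate

    def overlap(template, candidate):
        """Fraction of the template's 'on' pixels matched by the candidate."""
        return np.logical_and(template, candidate).sum() / max(template.sum(), 1)

    def rotated(img, degrees):
        # order=0 keeps the image binary; reshape=False keeps the frame fixed.
        return rotate(img.astype(np.uint8), degrees, reshape=False, order=0) > 0

    def register_rotation(template, probe):
        # Rough stage: the whole circle in 1-degree steps.
        rough = max(range(360), key=lambda d: overlap(template, rotated(probe, d)))
        # Precise stage: 0.1-degree steps around the rough estimate.
        fine = np.arange(rough - 1.0, rough + 1.0, 0.1)
        return max(fine, key=lambda d: overlap(template, rotated(probe, d)))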

  16. Integrated Test Facility (ITF)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The NASA-Dryden Integrated Test Facility (ITF), also known as the Walter C. Williams Research Aircraft Integration Facility (RAIF), provides an environment for conducting efficient and thorough testing of advanced, highly integrated research aircraft. Flight test confidence is greatly enhanced by the ability to qualify interactive aircraft systems in a controlled environment. In the ITF, each element of a flight vehicle can be regulated and monitored in real time as it interacts with the rest of the aircraft systems. Testing in the ITF is accomplished through automated techniques in which the research aircraft is interfaced to a high-fidelity real-time simulation. Electric and hydraulic power are also supplied, allowing all systems except the engines to function as if in flight. The testing process is controlled by an engineering workstation that sets up initial conditions for a test, initiates the test run, monitors its progress, and archives the data generated. The workstation is also capable of analyzing results of individual tests, comparing results of multiple tests, and producing reports. The computers used in the automated aircraft testing process are also capable of operating in a stand-alone mode with a simulation cockpit, complete with its own instruments and controls. Control law development and modification, aerodynamic, propulsion, guidance model qualification, and flight planning -- functions traditionally associated with real-time simulation -- can all be performed in this manner. The Remotely Augmented Vehicles (RAV) function, now located in the ITF, is a mainstay in the research techniques employed at Dryden. This function is used for tests that are too dangerous for direct human involvement or for which computational capacity does not exist onboard a research aircraft. RAV provides the researcher with a ground-based computer that is radio linked to the test aircraft during actual flight. The Ground Vibration Testing (GVT) system, formerly housed in the Thermostructural Laboratory, now also resides in the ITF. In preparing a research aircraft for flight testing, it is vital to measure its structural frequencies and mode shapes and compare results to the models used in design analysis. The final function performed in the ITF is routine aircraft maintenance. This includes preflight and post-flight instrumentation checks and the servicing of hydraulics, avionics, and engines necessary on any research aircraft. Aircraft are not merely moved to the ITF for automated testing purposes but are housed there throughout their flight test programs.

  17. Integrated Test Facility (ITF)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA-Dryden Integrated Test Facility (ITF), also known as the Walter C. Williams Research Aircraft Integration Facility (RAIF), provides an environment for conducting efficient and thorough testing of advanced, highly integrated research aircraft. Flight test confidence is greatly enhanced by the ability to qualify interactive aircraft systems in a controlled environment. In the ITF, each element of a flight vehicle can be regulated and monitored in real time as it interacts with the rest of the aircraft systems. Testing in the ITF is accomplished through automated techniques in which the research aircraft is interfaced to a high-fidelity real-time simulation. Electric and hydraulic power are also supplied, allowing all systems except the engines to function as if in flight. The testing process is controlled by an engineering workstation that sets up initial conditions for a test, initiates the test run, monitors its progress, and archives the data generated. The workstation is also capable of analyzing results of individual tests, comparing results of multiple tests, and producing reports. The computers used in the automated aircraft testing process are also capable of operating in a stand-alone mode with a simulation cockpit, complete with its own instruments and controls. Control law development and modification, aerodynamic, propulsion, guidance model qualification, and flight planning -- functions traditionally associated with real-time simulation -- can all be performed in this manner. The Remotely Augmented Vehicles (RAV) function, now located in the ITF, is a mainstay in the research techniques employed at Dryden. This function is used for tests that are too dangerous for direct human involvement or for which computational capacity does not exist onboard a research aircraft. RAV provides the researcher with a ground-based computer that is radio linked to the test aircraft during actual flight. The Ground Vibration Testing (GVT) system, formerly housed in the Thermostructural Laboratory, now also resides in the ITF. In preparing a research aircraft for flight testing, it is vital to measure its structural frequencies and mode shapes and compare results to the models used in design analysis. The final function performed in the ITF is routine aircraft maintenance. This includes preflight and post-flight instrumentation checks and the servicing of hydraulics, avionics, and engines necessary on any research aircraft. Aircraft are not merely moved to the ITF for automated testing purposes but are housed there throughout their flight test programs.

  18. Walter C. Williams Research Aircraft Integration Facility (RAIF)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The NASA-Dryden Integrated Test Facility (ITF), also known as the Walter C. Williams Research Aircraft Integration Facility (RAIF), provides an environment for conducting efficient and thorough testing of advanced, highly integrated research aircraft. Flight test confidence is greatly enhanced by the ability to qualify interactive aircraft systems in a controlled environment. In the ITF, each element of a flight vehicle can be regulated and monitored in real time as it interacts with the rest of the aircraft systems. Testing in the ITF is accomplished through automated techniques in which the research aircraft is interfaced to a high-fidelity real-time simulation. Electric and hydraulic power are also supplied, allowing all systems except the engines to function as if in flight. The testing process is controlled by an engineering workstation that sets up initial conditions for a test, initiates the test run, monitors its progress, and archives the data generated. The workstation is also capable of analyzing results of individual tests, comparing results of multiple tests, and producing reports. The computers used in the automated aircraft testing process are also capable of operating in a stand-alone mode with a simulation cockpit, complete with its own instruments and controls. Control law development and modification, aerodynamic, propulsion, guidance model qualification, and flight planning -- functions traditionally associated with real-time simulation -- can all be performed in this manner. The Remotely Augmented Vehicles (RAV) function, now located in the ITF, is a mainstay in the research techniques employed at Dryden. This function is used for tests that are too dangerous for direct human involvement or for which computational capacity does not exist onboard a research aircraft. RAV provides the researcher with a ground-based computer that is radio linked to the test aircraft during actual flight. The Ground Vibration Testing (GVT) system, formerly housed in the Thermostructural Laboratory, now also resides in the ITF. In preparing a research aircraft for flight testing, it is vital to measure its structural frequencies and mode shapes and compare results to the models used in design analysis. The final function performed in the ITF is routine aircraft maintenance. This includes preflight and post-flight instrumentation checks and the servicing of hydraulics, avionics, and engines necessary on any research aircraft. Aircraft are not merely moved to the ITF for automated testing purposes but are housed there throughout their flight test programs.

  19. 40 CFR 60.2770 - What information must I include in my annual report?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and Compliance Times for Commercial and Industrial Solid Waste Incineration Units Model Rule... inoperative, except for zero (low-level) and high-level checks. (3) The date, time, and duration that each... of control if any of the following occur. (1) The zero (low-level), mid-level (if applicable), or...

  20. GlastCam: A Telemetry-Driven Spacecraft Visualization Tool

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric T.; Tsai, Dean

    2009-01-01

    Developed for the GLAST project, which is now the Fermi Gamma-ray Space Telescope, GlastCam software ingests telemetry from the Integrated Test and Operations System (ITOS) and generates four graphical displays of geometric properties in real time, allowing visual assessment of the attitude, configuration, position, and various cross-checks. Four windows are displayed: a "cam" window shows a 3D view of the satellite; a second window shows the standard position plot of the satellite on a Mercator map of the Earth; a third window displays star tracker fields of view, showing which stars are visible from the spacecraft in order to verify star tracking; and the fourth window depicts

  1. 75 FR 28480 - Airworthiness Directives; Airbus Model A300 Series Airplanes; Model A300 B4-600, B4-600R, F4-600R...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... pressurise the hydraulic reservoirs, due to leakage of the Crissair reservoir air pressurisation check valves. * * * The leakage of the check valves was caused by an incorrect spring material. The affected Crissair check valves * * * were then replaced with improved check valves P/N [part number] 2S2794-1 * * *. More...

  2. Measurement of absorption and dispersion from check shot surveys

    NASA Astrophysics Data System (ADS)

    Ganley, D. C.; Kanasewich, E. R.

    1980-10-01

    The spectral ratio method for measuring absorption and also dispersion from seismic data has been examined. Corrections for frequency-dependent losses due to reflections and transmissions have been shown to be an important step in the method. Synthetic examples have been used to illustrate the method, and the method has been applied to one real data case from a sedimentary basin in the Beaufort Sea. Measured Q values were 43±2 for a depth interval of 549-1193 m and 67±6 for a depth interval of 945-1311 m. Dispersion was also measured in the data and is consistent with Futterman's model.
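
    The core of the spectral ratio method reduces to a line fit: after correcting for frequency-independent and reflection/transmission losses, the log ratio of amplitude spectra recorded at two depths is linear in frequency with slope -πΔt/Q, where Δt is the interval travel time. A minimal sketch on synthetic amplitudes (all values illustrative only):

    import numpy as np

    f = np.linspace(10.0, 80.0, 40)   # frequency axis, Hz
    dt = 0.35                         # interval travel time, s
    Q_true = 50.0

    # Synthetic spectral ratio: attenuation plus a frequency-independent loss.
    ratio = 0.8 * np.exp(-np.pi * f * dt / Q_true)

    # Q is recovered from the slope of the log ratio versus frequency.
    slope, intercept = np.polyfit(f, np.log(ratio), 1)
    Q_est = -np.pi * dt / slope
    print(Q_est)                      # ~50 on noise-free data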

  3. PERTS: A Prototyping Environment for Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.

    1991-01-01

    We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the support system software, (3) basic building blocks of distributed and intelligent real time applications, and (4) an execution environment. PERTS will make the recent and future theoretical advances in real-time system design and engineering readily usable to practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks and for the analysis and performance profiling of prototype real-time systems.

  4. Cross-Layer Modeling Framework for Energy-Efficient Resilience

    DTIC Science & Technology

    2014-04-01

    functional block diagram of the software architecture of PEARL, which stands for: Power Efficient and Resilient Embedded Processing with Real-Time ... DVFS). The goal of the run-time manager is to minimize power consumption, while maintaining system resilience targets (on average) and meeting... real-time performance targets. The integrated performance, power and resilience models are nothing but the analytical modeling toolkit described in

  5. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
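
    A margin of safety is the basic quantity such a tool reports for each joint member. A minimal sketch of that bookkeeping, with placeholder loads and safety factor rather than comBAT's actual databases or methodology:

    def margin_of_safety(allowable_load, applied_load, safety_factor=1.4):
        """MS = allowable / (SF * applied) - 1; MS >= 0 means the check passes."""
        return allowable_load / (safety_factor * applied_load) - 1.0

    # Example: a bolt tension check with illustrative numbers.
    ms_tension = margin_of_safety(allowable_load=12000.0, applied_load=7000.0)
    print(f"tension MS = {ms_tension:+.2f}",
          "PASS" if ms_tension >= 0.0 else "FAIL")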

  6. Queueing analysis of a canonical model of real-time multiprocessors

    NASA Technical Reports Server (NTRS)

    Krishna, C. M.; Shin, K. G.

    1983-01-01

    A logical classification of multiprocessor structures from the point of view of control applications is presented. A computation of the response time distribution for a canonical model of a real-time multiprocessor is presented. The multiprocessor is approximated by a blocking model. Two separate models are derived: one from the system's point of view, and the other from the point of view of an incoming task.

  7. The burning fuse model of unbecoming in time

    NASA Astrophysics Data System (ADS)

    Norton, John D.

    2015-11-01

    In the burning fuse model of unbecoming in time, the future is real and the past is unreal. It is used to motivate the idea that there is something unbecoming in the present literature on the metaphysics of time: its focus is merely the assigning of a label "real."

  8. Real time validation of GPS TEC precursor mask for Greece

    NASA Astrophysics Data System (ADS)

    Pulinets, Sergey; Davidenko, Dmitry

    2013-04-01

    It was established by earlier studies of pre-earthquake ionospheric variations that, for every specific site, these variations show definite stability in their temporal behavior within the interval of a few days before the seismic shock. This self-similarity (characteristic of processes observed close to the critical point of a system) permits us to consider these variations a good candidate for a short-term precursor. A physical mechanism of GPS TEC variations before earthquakes has been developed within the framework of the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model. Taking into account the different tectonic structures and source mechanisms of earthquakes in different regions of the globe, every site has its individual pre-earthquake behavior, which creates an individual "imprint" on the ionospheric behavior at each given point. This so-called "mask" of ionospheric variability before an earthquake at a given point makes it possible to detect anomalous behavior of the electron concentration in the ionosphere not only through statistical processing but also by applying pattern recognition techniques, which facilitates the automatic recognition of short-term ionospheric precursors of earthquakes. Such a precursor mask was created using the GPS TEC variations around the times of 9 earthquakes with magnitudes from M6.0 to M6.9 that took place in Greece within the interval 2006-2011. The major anomaly revealed in the relative deviation of the vertical TEC was a positive anomaly appearing at ~04 PM UT one day before the seismic shock and lasting nearly 12 hours until ~04 AM UT. To validate this approach, it was decided to check the mask in real-time monitoring of earthquakes in Greece starting from 1 December 2012 for earthquakes with magnitudes greater than 4.5. During this period (until 9 January 2013), 4 seismic shocks were registered, including the largest, M5.7, on 8 January. For all of them the mask confirmed its validity, and the 6 December event was predicted in advance.
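
    In the spirit of the mask approach, the sketch below flags sustained positive anomalies in an hourly vertical TEC series by comparing each sample against a sliding background and requiring the exceedance to persist for ~12 hours. The threshold, window lengths and median background are illustrative assumptions, not the authors' calibrated mask.

    import numpy as np

    def tec_anomalies(tec, background_len=360, threshold=0.25, min_run=12):
        """Return (start, end) index runs where the relative deviation from a
        sliding median background exceeds `threshold` for at least `min_run`
        consecutive hourly samples."""
        flags = np.zeros(len(tec), dtype=bool)
        for i in range(background_len, len(tec)):
            background = np.median(tec[i - background_len:i])
            flags[i] = (tec[i] - background) / background > threshold
        runs, start = [], None
        for i, f in enumerate(flags):
            if f and start is None:
                start = i
            if not f and start is not None:
                if i - start >= min_run:
                    runs.append((start, i))
                start = None
        if start is not None and len(flags) - start >= min_run:
            runs.append((start, len(flags)))
        return runs

    rng = np.random.default_rng(1)
    series = 20.0 + rng.normal(0.0, 0.5, 1000)
    series[700:715] *= 1.4            # inject a ~40% positive anomaly
    print(tec_anomalies(series))      # -> [(700, 715)]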

  9. A Scheduling Algorithm for Replicated Real-Time Tasks

    NASA Technical Reports Server (NTRS)

    Yu, Albert C.; Lin, Kwei-Jay

    1991-01-01

    We present an algorithm for scheduling real-time periodic tasks on a multiprocessor system under fault-tolerance requirements. Our approach incorporates both the redundancy and masking technique and the imprecise computation model. Since the tasks in hard real-time systems have stringent timing constraints, redundancy and masking techniques are more appropriate than rollback techniques, which usually require extra time for error recovery. The imprecise computation model provides flexible functionality by trading off the quality of the result produced by a task with the amount of processing time required to produce it. It therefore permits the performance of a real-time system to degrade gracefully. We evaluate the algorithm by stochastic analysis and Monte Carlo simulations. The results show that the algorithm is resilient under hardware failures.

  10. The Waypoint Planning Tool: Real Time Flight Planning for Airborne Science

    NASA Astrophysics Data System (ADS)

    He, M.; Goodman, H. M.; Blakeslee, R.; Hall, J. M.

    2010-12-01

    NASA Earth science research utilizes both spaceborne and airborne real-time observations in the planning and operations of its field campaigns. The coordination of air and space components is critical to achieve the goals and objectives and ensure the success of an experiment. Spaceborne imagery provides regular and continual coverage of the Earth, and it is a significant component in all NASA field experiments. Real-time visible and infrared geostationary images from GOES satellites and multi-spectral data from the many elements of the NASA suite of instruments aboard the TRMM, Terra, Aqua, Aura, and other NASA satellites have become the norm. Similarly, the NASA Airborne Science Program draws upon a rich pool of instrumented aircraft. The NASA McDonnell Douglas DC-8, Lockheed P3 Orion, DeHavilland Twin Otter, King Air B200, and Gulfstream-III are all staples of NASA's well-stocked, versatile hangar. A key component in many field campaigns is coordinating the aircraft with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions. Given the variables involved, developing a good flight plan that meets the objectives of the field experiment can be a challenging and time-consuming task. Planning a research aircraft mission within the context of meeting the science objectives is a complex task because it is much more than flying from point A to B. Flight plans typically consist of flying a series of transects or involve dynamic path changes when "chasing" a hurricane or forest fire. These aircraft flight plans are typically designed by the mission scientists, then verified and implemented by the navigator or pilot. Flight planning can be an arduous task requiring frequent sanity checks by the flight crew. This requires real-time situational awareness of the weather conditions that affect the aircraft track. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map draped with real-time satellite imagery. The Waypoint Planning Tool has further advanced to include satellite orbit predictions and seamlessly interfaces with the Real Time Mission Monitor, which tracks the aircraft's position when the planes are flying. This presentation will describe the capabilities and features of the Waypoint Planning Tool, highlighting the real-time aspect, interactive nature and the resultant benefits to the airborne science community.

  11. The Waypoint Planning Tool: Real Time Flight Planning for Airborne Science

    NASA Technical Reports Server (NTRS)

    He, Yubin; Blakeslee, Richard; Goodman, Michael; Hall, John

    2010-01-01

    NASA Earth science research utilizes both spaceborne and airborne real-time observations in the planning and operations of its field campaigns. The coordination of air and space components is critical to achieve the goals and objectives and ensure the success of an experiment. Spaceborne imagery provides regular and continual coverage of the Earth, and it is a significant component in all NASA field experiments. Real-time visible and infrared geostationary images from GOES satellites and multi-spectral data from the many elements of the NASA suite of instruments aboard the TRMM, Terra, Aqua, Aura, and other NASA satellites have become the norm. Similarly, the NASA Airborne Science Program draws upon a rich pool of instrumented aircraft. The NASA McDonnell Douglas DC-8, Lockheed P3 Orion, DeHavilland Twin Otter, King Air B200, and Gulfstream-III are all staples of NASA's well-stocked, versatile hangar. A key component in many field campaigns is coordinating the aircraft with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions. Given the variables involved, developing a good flight plan that meets the objectives of the field experiment can be a challenging and time-consuming task. Planning a research aircraft mission within the context of meeting the science objectives is a complex task because it is much more than flying from point A to B. Flight plans typically consist of flying a series of transects or involve dynamic path changes when "chasing" a hurricane or forest fire. These aircraft flight plans are typically designed by the mission scientists, then verified and implemented by the navigator or pilot. Flight planning can be an arduous task requiring frequent sanity checks by the flight crew. This requires real-time situational awareness of the weather conditions that affect the aircraft track. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map draped with real-time satellite imagery. The Waypoint Planning Tool has further advanced to include satellite orbit predictions and seamlessly interfaces with the Real Time Mission Monitor, which tracks the aircraft's position when the planes are flying. This presentation will describe the capabilities and features of the Waypoint Planning Tool, highlighting the real-time aspect, interactive nature and the resultant benefits to the airborne science community.

  12. Implementation of real-time energy management strategy based on reinforcement learning for hybrid electric vehicles and simulation validation

    PubMed Central

    Kong, Zehui; Liu, Teng

    2017-01-01

    To further improve the fuel economy of series hybrid electric tracked vehicles, a reinforcement learning (RL)-based real-time energy management strategy is developed in this paper. In order to utilize the statistical characteristics of the online driving schedule effectively, a recursive algorithm for the transition probability matrix (TPM) of power request is derived. Reinforcement learning is applied to calculate and update the control policy at regular intervals, adapting to varying driving conditions. A forward-facing powertrain model is built in detail, including the engine-generator model, battery model and vehicle dynamics model. The robustness and adaptability of the real-time energy management strategy are validated through comparison with a stationary control strategy based on an initial TPM generated from a long naturalistic driving cycle in simulation. Results indicate that the proposed method achieves better fuel economy than the stationary one and is more effective in real-time control. PMID:28671967

  13. Implementation of real-time energy management strategy based on reinforcement learning for hybrid electric vehicles and simulation validation.

    PubMed

    Kong, Zehui; Zou, Yuan; Liu, Teng

    2017-01-01

    To further improve the fuel economy of series hybrid electric tracked vehicles, a reinforcement learning (RL)-based real-time energy management strategy is developed in this paper. In order to utilize the statistical characteristics of the online driving schedule effectively, a recursive algorithm for the transition probability matrix (TPM) of power request is derived. Reinforcement learning is applied to calculate and update the control policy at regular intervals, adapting to varying driving conditions. A forward-facing powertrain model is built in detail, including the engine-generator model, battery model and vehicle dynamics model. The robustness and adaptability of the real-time energy management strategy are validated through comparison with a stationary control strategy based on an initial TPM generated from a long naturalistic driving cycle in simulation. Results indicate that the proposed method achieves better fuel economy than the stationary one and is more effective in real-time control.
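
    The recursive TPM estimate these papers describe can be maintained with simple transition counts updated one sample at a time. A hedged sketch, where the discretization into power-request levels and the Laplace-smoothed counts are assumptions for illustration:

    import numpy as np

    class PowerRequestTPM:
        """Online transition-probability-matrix estimate for a power request
        discretized into n_levels bins; O(1) work per new sample."""

        def __init__(self, n_levels):
            self.counts = np.ones((n_levels, n_levels))  # Laplace smoothing
            self.prev = None

        def update(self, level):
            if self.prev is not None:
                self.counts[self.prev, level] += 1
            self.prev = level

        @property
        def tpm(self):
            return self.counts / self.counts.sum(axis=1, keepdims=True)

    est = PowerRequestTPM(n_levels=4)
    for level in [0, 1, 1, 2, 3, 2, 1]:   # toy discretized power requests
        est.update(level)
    print(est.tpm)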

  14. Hardware-in-the-Loop Power Extraction Using Different Real-Time Platforms (PREPRINT)

    DTIC Science & Technology

    2008-07-01

    engine controller (FADEC). Incorporating various transient subsystem level models into a complex modeling tool can be a challenging process when each...used can also be modified or replaced as appropriate. In its current configuration, the generic turbine engine model's FADEC runs primarily on a...simulation in real-time, two platforms were tested: dSPACE and National Instruments' (NI) LabVIEW Real-Time. For both dSPACE and NI, the engine and FADEC

  15. Robust Real-Time Musculoskeletal Modeling Driven by Electromyograms.

    PubMed

    Durandau, Guillaume; Farina, Dario; Sartori, Massimo

    2018-03-01

    Current clinical biomechanics involves lengthy data acquisition and time-consuming offline analyses with biomechanical models not operating in real-time for man-machine interfacing. We developed a method that enables online analysis of neuromusculoskeletal function in vivo in the intact human. We used electromyography (EMG)-driven musculoskeletal modeling to simulate all transformations from muscle excitation onset (EMGs) to mechanical moment production around multiple lower-limb degrees of freedom (DOFs). We developed a calibration algorithm that enables adjusting musculoskeletal model parameters specifically to an individual's anthropometry and force-generating capacity. We incorporated the modeling paradigm into a computationally efficient, generic framework that can be interfaced in real-time with any movement data collection system. The framework demonstrated the ability of computing forces in 13 lower-limb muscle-tendon units and resulting moments about three joint DOFs simultaneously in real-time. Remarkably, it was capable of extrapolating beyond calibration conditions, i.e., predicting accurate joint moments during six unseen tasks and one unseen DOF. The proposed framework can dramatically reduce evaluation latency in current clinical biomechanics and open up new avenues for establishing prompt and personalized treatments, as well as for establishing natural interfaces between patients and rehabilitation systems. The integration of EMG with numerical modeling will enable simulating realistic neuromuscular strategies in conditions including muscular/orthopedic deficit, which could not be robustly simulated via pure modeling formulations. This will enable translation to clinical settings and development of healthcare technologies including real-time bio-feedback of internal mechanical forces and direct patient-machine interfacing.

  16. Assessment of check-dam groundwater recharge with water-balance calculations

    NASA Astrophysics Data System (ADS)

    Djuma, Hakan; Bruggeman, Adriana; Camera, Corrado; Eliades, Marinos

    2017-04-01

    Studies on the enhancement of groundwater recharge by check-dams in arid and semi-arid environments mainly focus on deriving water infiltration rates from the check-dam ponding areas. This is usually achieved by applying simple water balance models, more advanced models (e.g., two-dimensional groundwater models) and field tests (e.g., infiltrometer tests or soil pit tests). Recharge behind the check-dam can be affected by the build-up of sediment resulting from erosion in the upstream watershed area. This natural process can increase the uncertainty in the estimates of the recharged water volume, especially for water balance calculations. Few water balance field studies of individual check-dams have been presented in the literature, and none of them reported the associated uncertainties of their estimates. The objectives of this study are i) to assess the effect of a check-dam on groundwater recharge from an ephemeral river; and ii) to assess annual sedimentation at the check-dam during a 4-year period. The study was conducted on a check-dam on the semi-arid island of Cyprus. Field campaigns were carried out to measure water flow, water depth and check-dam topography in order to establish check-dam water height, volume, evaporation, outflow and recharge relations. Topographic surveys were repeated at the end of consecutive hydrological years to estimate the sediment build-up in the reservoir area of the check-dam. Also, sediment samples were collected from the check-dam reservoir area for bulk-density analyses. To quantify the groundwater recharge, a water balance model was applied at two locations: at the check-dam and corresponding reservoir area, and at a 4-km stretch of the river bed without a check-dam. Results showed that a check-dam with a storage capacity of 25,000 m3 was able to recharge the aquifer, over four years, with a total of 12 million m3 out of the 42 million m3 of measured (or modelled) streamflow. Recharge from the analyzed 4-km river section without a check-dam was estimated at 1 million m3. Upper and lower limits of prediction intervals were computed to assess the uncertainties of the results. The model was rerun with these values and resulted in recharge values of 0.4 m3 as the lower limit and 38 million m3 as the upper limit. The sediment survey in the check-dam reservoir area showed that the reservoir area was filled with 2,000 to 3,000 tons of sediment after one rainfall season. This amount of sediment corresponds to a sediment yield of 0.2 to 2 t ha-1 y-1 at the watershed level and reduces the check-dam storage capacity by approximately 10%. Results indicate that check-dams are valuable structures for increasing groundwater resources, but special attention should be given to soil erosion occurring in the upstream area and the resulting sediment build-up in the check-dam reservoir area. This study has received funding from the EU FP7 RECARE Project (GA 603498)
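
    The water balance underlying such recharge estimates is a simple residual: recharge equals inflow minus outflow, evaporation and the change in ponded storage over each time step. A minimal sketch with a toy daily series (all names and numbers hypothetical):

    def recharge_series(inflow, outflow, evaporation, storage):
        """All inputs are same-length daily series in m^3 (storage is the
        ponded volume); returns daily recharge estimates in m^3/day."""
        recharge = []
        for t in range(1, len(storage)):
            d_storage = storage[t] - storage[t - 1]
            recharge.append(inflow[t] - outflow[t] - evaporation[t] - d_storage)
        return recharge

    print(recharge_series(
        inflow=[0, 5000, 3000, 1000],
        outflow=[0, 1000, 500, 0],
        evaporation=[0, 50, 60, 40],
        storage=[20000, 22000, 21000, 19000],
    ))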

  17. Real Time Fire Reconnaissance Satellite Monitoring System Failure Model

    NASA Astrophysics Data System (ADS)

    Nino Prieto, Omar Ariosto; Colmenares Guillen, Luis Enrique

    2013-09-01

    In this paper the Real Time Fire Reconnaissance Satellite Monitoring System is presented. This architecture is a legacy of the Detection System for Real-Time Physical Variables, which is undergoing a patent process in Mexico. The design follows the Structured Analysis for Real Time (SA-RT) methodology [8], and the software is implemented in the LACATRE (Langage d'aide à la Conception d'Application multitâche Temps Réel) [9,10] real-time formal language. The system failure model is analyzed, and the proposal is based on AltaRica, a formal language for the design of critical systems and risk assessment. This formal architecture uses satellites as input sensors and was adapted from the original model, a design pattern for real-time detection of physical variations. The original design monitors events such as natural disasters and supports health-related applications such as sickness monitoring and prevention, as in the Real Time Diabetes Monitoring System, among others. Some related work has been presented at the Mexican Space Agency (AEM) Creation and Consultation Forums (2010-2011) and at the International Mexican Aerospace Science and Technology Society (SOMECYTA) international congress held in San Luis Potosí, México (2012). This architecture will allow real-time satellite fire monitoring, which will reduce the damage and danger caused by fires that consume the forests and tropical forests of Mexico. This new proposal permits a system that contributes to disaster prevention by combining national and international technologies and cooperation for the benefit of humankind.

  18. Real-Time Global Flood Estimation Using Satellite-Based Precipitation and a Coupled Land Surface and Routing Model

    NASA Technical Reports Server (NTRS)

    Wu, Huan; Adler, Robert F.; Tian, Yudong; Huffman, George J.; Li, Hongyi; Wang, JianJian

    2014-01-01

    A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50 deg. N - 50 deg. S at relatively high spatial (approximately 12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is approximately 0.9 and the false alarm ratio is approximately 0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30 deg. S - 30 deg. N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. There were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.

  19. Real-time global flood estimation using satellite-based precipitation and a coupled land surface and routing model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Huan; Adler, Robert F.; Tian, Yudong

    2014-03-01

    A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50°N–50°S at relatively high spatial (~12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is ~0.9 and the false alarm ratio is ~0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30°S–30°N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. Finally, there were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.
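
    The Nash-Sutcliffe coefficient used in these evaluations compares squared model errors against the variance of the observations: NSE = 1 is a perfect fit, and NSE <= 0 means the simulation predicts no better than the observed mean. A minimal sketch with toy flows:

    import numpy as np

    def nash_sutcliffe(observed, simulated):
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return 1.0 - (np.sum((observed - simulated) ** 2)
                      / np.sum((observed - observed.mean()) ** 2))

    print(nash_sutcliffe([10, 12, 30, 25, 8], [11, 14, 26, 24, 10]))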

  20. pcr: an R package for quality assessment, analysis and testing of qPCR data

    PubMed Central

    Ahmed, Mahmoud

    2018-01-01

    Background Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across the experimental conditions. Methods We developed an R package to implement methods for quality assessment, analysis and testing of qPCR data for statistical significance. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT in standard qPCR control-group experiments. In addition, calculations of amplification efficiency and curves from serial dilution qPCR experiments are used to assess the quality of the data. Finally, two-group testing and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results Using two datasets from qPCR experiments, we applied different quality assessment, analysis and statistical testing in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion The pcr package provides an intuitive and unified interface for its main functions to allow biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
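
    For readers unfamiliar with the double delta CT model the package implements, the arithmetic is compact enough to sketch directly. The pcr package itself is R; this is a language-agnostic Python illustration with made-up CT values.

    def double_delta_ct(ct_target_treat, ct_ref_treat,
                        ct_target_ctrl, ct_ref_ctrl):
        """Fold change 2**(-ddCT), assuming ~100% amplification efficiency."""
        d_ct_treat = ct_target_treat - ct_ref_treat  # normalize to reference gene
        d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
        dd_ct = d_ct_treat - d_ct_ctrl               # normalize to control group
        return 2.0 ** (-dd_ct)

    print(double_delta_ct(22.1, 16.0, 24.9, 16.2))   # fold change vs control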

  1. An X-Band Radar Terrain Feature Detection Method for Low-Altitude SVS Operations and Calibration Using LiDAR

    NASA Technical Reports Server (NTRS)

    Young, Steve; UijtdeHaag, Maarten; Campbell, Jacob

    2004-01-01

    To enable safe use of Synthetic Vision Systems at low altitudes, real-time range-to-terrain measurements may be required to ensure the integrity of terrain models stored in the system. This paper reviews and extends previous work describing the application of x-band radar to terrain model integrity monitoring. A method of terrain feature extraction and a transformation of the features to a common reference domain are proposed. Expected error distributions for the extracted features are required to establish appropriate thresholds whereby a consistency-checking function can trigger an alert. A calibration-based approach is presented that can be used to obtain these distributions. To verify the approach, NASA's DC-8 airborne science platform was used to collect data from two mapping sensors. An Airborne Laser Terrain Mapping (ALTM) sensor was installed in the cargo bay of the DC-8. After processing, the ALTM produced a reference terrain model with a vertical accuracy of less than one meter. Also installed was a commercial-off-the-shelf x-band radar in the nose radome of the DC-8. Although primarily designed to measure precipitation, the radar also provides estimates of terrain reflectivity at low altitudes. Using the ALTM data as the reference, errors in features extracted from the radar are estimated. A method to estimate errors in features extracted from the terrain model is also presented.

  2. The Evaluation of GPS techniques for UAV-based Photogrammetry in Urban Area

    NASA Astrophysics Data System (ADS)

    Yeh, M. L.; Chou, Y. T.; Yang, L. S.

    2016-06-01

    The efficiency and high mobility of Unmanned Aerial Vehicles (UAVs) have made them essential to aerial-photography-assisted survey and mapping, especially for urban land use and land cover, which change frequently and require UAVs to obtain new terrain data and capture the changes in land use. This study collects image data and three-dimensional ground control points in the Taichung city area with a UAV, a general-purpose camera, and Real-Time Kinematic (RTK) positioning with accuracy down to the centimetre. The study area is an ecological park with low topography that serves the city as a detention basin. A digital surface model was built with Agisoft PhotoScan, together with a high-resolution orthophoto. Two conditions were considered, with and without ground control points, and the accuracy of the resulting digital surface models was discussed and compared. According to the check-point deviation estimates, the model without ground control points has an average two-dimensional error of up to 40 centimetres and an altitude error within one metre. The GCP-free RTK-airborne approach produces centimetre-level accuracy with low risk to the UAS operators. For the model with ground control points, the accuracy of the x, y and z coordinates improved by 54.62%, 49.07% and 87.74% respectively, with the altitude accuracy improving the most.

  3. A real-time biomimetic acoustic localizing system using time-shared architecture

    NASA Astrophysics Data System (ADS)

    Nourzad Karl, Marianne; Karl, Christian; Hubbard, Allyn

    2008-04-01

    In this paper a real-time sound source localizing system is proposed, which is based on previously developed mammalian auditory models. Traditionally, following the models, which use interaural time delay (ITD) estimates, the amount of parallel computation needed by a system to achieve real-time sound source localization is a limiting factor and a design challenge for hardware implementations. Therefore a new approach using a time-shared architecture is introduced. The proposed architecture is a purely sample-driven digital system, and it follows closely the continuous-time approach described in the models. Rather than having dedicated hardware on a per-frequency-channel basis, a specialized core channel, shared across all frequency bands, is used. Having an optimized execution time much shorter than the system's sample period, the proposed time-shared solution allows the same number of virtual channels to be processed as dedicated channels in the traditional approach. Hence, the time-shared approach achieves a highly economical and flexible implementation using minimal silicon area. These aspects are particularly important for an efficient hardware implementation of a real-time biomimetic sound source localization system.
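
    The quantity each frequency channel ultimately estimates is the interaural time delay. As a reference point, the classic cross-correlation ITD estimate can be sketched in a few lines; the synthetic signals illustrate the computation, not the paper's hardware.

    import numpy as np

    fs = 48_000                          # sample rate, Hz
    true_delay = 20                      # samples (~0.42 ms at 48 kHz)
    rng = np.random.default_rng(0)
    left = rng.standard_normal(2048)
    right = np.roll(left, true_delay)    # right ear receives a delayed copy

    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    print(lag, lag / fs)                 # -20 samples: right lags left by 20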

  4. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formal description method for the protocols. Combined with the model checker SPIN, the method can conveniently verify protocol properties. Through several model-simplification strategies, the approach can model protocols efficiently and reduce the state space of the model. Compared with previous work, it achieves a higher degree of automation and more efficient verification. Finally, based on the described method, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and applicable to other authentication protocols.
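
    As background, an explicit-state model checker such as SPIN essentially performs an exhaustive search of the protocol's reachable state space while checking properties. The toy sketch below does a breadth-first reachability check of a safety invariant on a hypothetical two-party handshake, not the PKM protocol of the paper.

    from collections import deque

    def transitions(state):
        a, b = state  # (initiator phase, responder phase)
        if a == "idle":
            yield ("sent", b)          # initiator sends its request
        if a == "sent" and b == "idle":
            yield (a, "replied")       # responder answers
        if a == "sent" and b == "replied":
            yield ("done", "done")     # handshake completes

    def check(initial, invariant):
        """Breadth-first search of reachable states; returns a violating
        state as a counterexample, or None if the invariant always holds."""
        seen, queue = {initial}, deque([initial])
        while queue:
            state = queue.popleft()
            if not invariant(state):
                return state
            for nxt in transitions(state):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return None

    # Safety property: the responder never finishes before the initiator.
    bad = check(("idle", "idle"),
                lambda s: not (s[1] == "done" and s[0] != "done"))
    print("counterexample:" if bad else "invariant holds", bad or "")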

  5. PERTS: A Prototyping Environment for Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.

    1993-01-01

    PERTS is a prototyping environment for real-time systems. It is being built incrementally and will contain basic building blocks of operating systems for time-critical applications, tools, and performance models for the analysis, evaluation and measurement of real-time systems and a simulation/emulation environment. It is designed to support the use and evaluation of new design approaches, experimentations with alternative system building blocks, and the analysis and performance profiling of prototype real-time systems.

  6. Sensitivity tests to define the source apportionment performance criteria in the DeltaSA tool

    NASA Astrophysics Data System (ADS)

    Pernigotti, Denise; Belis, Claudio A.

    2017-04-01

    Identification and quantification of the contributions of emission sources to a given area is a key task for the design of abatement strategies. Moreover, European member states are obliged to report this kind of information for zones where pollution levels exceed the limit values. At present, little is known about the performance and uncertainty of the variety of methodologies used for source apportionment and the comparability between the results of studies using different approaches. The source apportionment Delta tool (DeltaSA) is developed by the EC-JRC to support particulate matter source apportionment modellers in the identification of sources (for factor analysis studies) and/or in the measurement of their performance. The tool performs source identification by measuring the proximity of a user's chemical profile to preloaded repository data (SPECIATE and SPECIEUROPE). The model performance criteria are based on standard statistical indexes calculated by comparing participants' source contribution estimates, and their time series, with preloaded reference data. The preloaded data come from previous European SA intercomparison exercises: the first with real-world data (22 participants), the second with synthetic data (25 participants) and the last with real-world data, which was also extended to chemical transport models (38 receptor models and 4 CTMs). The references used for the model performance are 'true' values (predefined by the JRC) for the synthetic exercise, while for the real-world intercomparisons they are calculated as ensemble averages of the participants' results. The candidates used in each source's ensemble reference calculation were selected from the participants' results based on a number of consistency checks, plus the similarity of their chemical profiles to the measured repository data. The estimation of the ensemble reference uncertainty is crucial in order to evaluate the users' performances against it. For this reason a sensitivity analysis of different methods to estimate the ensemble references' uncertainties was performed by re-analyzing the synthetic intercomparison dataset, the only one where 'true' reference and ensemble reference contributions were both present. DeltaSA is now available online and will be presented together with a critical discussion of the sensitivity analysis on the ensemble reference uncertainty. In particular, the degree of mutual agreement among participants on the presence of a given source should be taken into account. The importance of the synthetic intercomparisons for catching biases common to receptor models will also be stressed.

  7. Diagnosis of delay-deadline failures in real time discrete event models.

    PubMed

    Biswas, Santosh; Sarkar, Dipankar; Bhowal, Prodip; Mukhopadhyay, Siddhartha

    2007-10-01

    In this paper a method for fault detection and diagnosis (FDD) of real-time systems is developed. A modeling framework termed the real time discrete event system (RTDES) model is presented, and a mechanism for FDD of the same has been developed. The use of the RTDES framework for FDD is an extension of work reported in the discrete event system (DES) literature, which is based on finite state machines (FSM). FDD of RTDES models is suited to real-time systems because of their capability of representing timing faults leading to failures in terms of erroneous delays and deadlines, which FSM-based models cannot address. The concept of measurement restriction of variables is introduced for RTDES, and the consequent equivalence of states and indistinguishability of transitions have been characterized. Faults are modeled in terms of an unmeasurable condition variable in the state map. Diagnosability is defined and a procedure for constructing a diagnoser is provided. A checkable property of the diagnoser is shown to be a necessary and sufficient condition for diagnosability. The methodology is illustrated with an example of a hydraulic cylinder.

  8. Spoofing Detection Using GNSS/INS/Odometer Coupling for Vehicular Navigation

    PubMed Central

    Broumandan, Ali; Lachapelle, Gérard

    2018-01-01

    Location information is one of the most vital pieces of information required to achieve intelligent and context-aware capability in applications such as driverless cars. However, related security and privacy threats are a major holdback. With the increasing focus on using Global Navigation Satellite Systems (GNSS) for autonomous navigation and related applications, it is important to provide robust navigation solutions; yet signal spoofing, used for illegal or covert transportation and for misleading receiver timing, is increasingly frequent. Hence, the detection and mitigation of spoofing attacks has become an important topic. Several contributions on spoofing detection have been made, focusing on different layers of a GNSS receiver. This paper focuses on spoofing detection utilizing self-contained sensors, namely inertial measurement units (IMUs) and vehicle odometer outputs. A spoofing detection approach based on a consistency check between GNSS and IMU/odometer mechanization is proposed. To detect a spoofing attack, the method analyses GNSS and IMU/odometer measurements independently during a pre-selected observation window and cross-checks the solutions provided by GNSS and the inertial navigation system (INS)/odometer mechanization. The performance of the proposed method is verified in real vehicular environments. The mean spoofing detection time and the detection performance in terms of receiver operating characteristics (ROC) in suburban and dense urban environments are evaluated. PMID:29695064
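
    A minimal sketch of the consistency-check idea, under simplifying assumptions: GNSS positions are compared against dead-reckoned INS/odometer positions over an observation window, and a spoofing flag is raised when the average discrepancy exceeds a threshold. The window, threshold and test statistic here are illustrative assumptions, not the paper's detector.

    ```python
    import numpy as np

    # Hypothetical sketch: compare GNSS positions against positions propagated
    # from INS/odometer over an observation window; flag spoofing when the
    # discrepancy statistic exceeds a threshold.

    def spoofing_flag(gnss_pos, ins_odo_pos, threshold_m=10.0):
        """gnss_pos, ins_odo_pos: (N, 2) arrays of horizontal positions [m]."""
        residuals = np.linalg.norm(gnss_pos - ins_odo_pos, axis=1)
        test_statistic = residuals.mean()   # average separation over the window
        return test_statistic > threshold_m, test_statistic

    # Example: a spoofed GNSS track drifting away from the dead-reckoned track.
    t = np.linspace(0.0, 60.0, 61)
    ins_track = np.column_stack([t * 10.0, np.zeros_like(t)])     # ~10 m/s straight line
    gnss_track = ins_track + np.column_stack([0.5 * t, 0.2 * t])  # growing offset
    flag, stat = spoofing_flag(gnss_track, ins_track)
    print(f"spoofing detected: {flag} (statistic {stat:.1f} m)")
    ```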

  10. A voice-actuated wind tunnel model leak checking system

    NASA Technical Reports Server (NTRS)

    Larson, William E.

    1989-01-01

    A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight-line fit of the data are plotted on the terminal, allowing the technician to make a decision on the integrity of the orifice being checked. All results of the leak-check program are stored in a database file that can be listed on the line printer for record-keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.
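
    A minimal sketch of the leak-check decision as described: fit a straight line to the recorded pressure samples and judge the orifice from the fitted slope. The sampling parameters and the pass/fail threshold are assumptions.

    ```python
    import numpy as np

    # Hypothetical sketch: record pressure over the ~30 s window, fit a line,
    # and compare the pressure decay rate (slope) against a tolerance.

    def leak_check(times_s, pressures_pa, max_leak_rate_pa_per_s=5.0):
        slope, intercept = np.polyfit(times_s, pressures_pa, deg=1)
        return abs(slope) <= max_leak_rate_pa_per_s, slope

    t = np.linspace(0.0, 30.0, 300)     # 30 s of simulated pressure data
    p = 101_325.0 - 2.0 * t + np.random.default_rng(1).normal(0, 3, t.size)
    ok, slope = leak_check(t, p)
    print(f"orifice {'tight' if ok else 'leaking'}; slope = {slope:.2f} Pa/s")
    ```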

  11. Study on Amortization Time and Rationality in Real Estate Investment

    NASA Astrophysics Data System (ADS)

    Li, Yancang; Zhou, Shujing; Suo, Juanjuan

    Amortization time and rationality have been discussed extensively in real estate investment research. When the price of real estate is driven by geometric Brownian motion (GBM), whether mortgagors should amortize in advance becomes a key issue in amortization time research. This paper presents a new method to solve the problem using optimal stopping time theory and option pricing models. We discuss the option value in the amortization decision based on this model. A simulation method is used to test the approach.
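
    For readers unfamiliar with the GBM assumption, the sketch below simulates geometric Brownian motion price paths of the kind the paper assumes for real estate prices; the drift, volatility and horizon are illustrative values, and the optimal-stopping machinery itself is not reproduced here.

    ```python
    import numpy as np

    # Hypothetical sketch: Monte Carlo simulation of GBM price paths,
    # S_t = S_0 * exp((mu - sigma^2/2) t + sigma W_t).

    def simulate_gbm(s0, mu, sigma, years, steps, n_paths, seed=0):
        rng = np.random.default_rng(seed)
        dt = years / steps
        increments = ((mu - 0.5 * sigma**2) * dt
                      + sigma * np.sqrt(dt) * rng.standard_normal((n_paths, steps)))
        return s0 * np.exp(np.cumsum(increments, axis=1))

    paths = simulate_gbm(s0=100.0, mu=0.03, sigma=0.15,
                         years=10, steps=120, n_paths=10_000)
    print(f"mean price after 10 years: {paths[:, -1].mean():.1f}")
    ```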

  12. Five years' experience of classical swine fever polymerase chain reaction ring trials in France.

    PubMed

    Po, F; Le Dimna, M; Le Potier, M F

    2011-12-01

    Since 2004, the French National Reference Laboratory for classical swine fever (CSF) has conducted an annual proficiency test (PT) to evaluate the ability of local veterinary laboratories to perform real-time polymerase chain reaction (PCR) for CSF virus. The results of five years of testing (2004-2008) are described here. The PT was conducted under blind conditions on 20 samples. The same batch of samples was used for all five years. The number of laboratories that analysed the samples increased from four in 2004 to 13 in 2008. The results of the PT showed the following: cross-contamination between samples and deficiencies in RNA preparation can occur even in experienced laboratories; sample homogeneity should be checked carefully before selection; samples stored at -80 degrees C for several years remain stable; and poor shipment conditions do not damage the samples with regard to detection of CSF virus genome. These results will enable redesign of the panel to improve the overall quality of the PT, which will encourage laboratories to check and improve their PCR procedures and expertise. This is an excellent way to determine laboratory performance.

  13. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    NASA Astrophysics Data System (ADS)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

    The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially when the number of students is large. Based on these constraints, this paper proposes Automatic Grading Tools (AGT), an application that can evaluate and thoroughly check source code written in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. Automatic Grading Tools (AGT) implements the MVC architecture and uses open source software: the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. Automatic Grading Tools has also been tested on real problems by submitting source code in C/C++ and then compiling it. The test results show that the AGT application runs well.
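
    A minimal sketch of the compile-and-test flow behind such a grader (the actual AGT is a Laravel/PHP web application; this Python sketch, which assumes gcc is available on the host, only illustrates the idea): compile the submission, run it on a test case, and compare its output with the expected output.

    ```python
    import pathlib
    import subprocess
    import tempfile

    # Hypothetical sketch of an autograder's core loop; file names, the test
    # case and the use of gcc are assumptions, not AGT internals.

    def grade_submission(source: str, stdin_data: str, expected_stdout: str) -> str:
        workdir = pathlib.Path(tempfile.mkdtemp())
        src, binary = workdir / "main.c", workdir / "main"
        src.write_text(source)
        compiled = subprocess.run(["gcc", str(src), "-o", str(binary)],
                                  capture_output=True, text=True)
        if compiled.returncode != 0:
            return "compile error"
        run = subprocess.run([str(binary)], input=stdin_data,
                             capture_output=True, text=True, timeout=2)
        return "pass" if run.stdout.strip() == expected_stdout.strip() else "wrong answer"

    program = ('#include <stdio.h>\n'
               'int main(void){int a,b;scanf("%d %d",&a,&b);'
               'printf("%d\\n",a+b);return 0;}')
    print(grade_submission(program, "2 3", "5"))
    ```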

  14. V-Alert: Description and Validation of a Vulnerable Road User Alert System in the Framework of a Smart City

    PubMed Central

    Hernandez-Jayo, Unai; De-la-Iglesia, Idoia; Perez, Jagoba

    2015-01-01

    V-Alert is a cooperative application to be deployed in the framework of Smart Cities with the aim of reducing the probability of accidents involving Vulnerable Road Users (VRU) and vehicles. The architecture of V-Alert combines short- and long-range communication technologies in order to give drivers and VRU more time to take the appropriate maneuver and avoid a possible collision. The information generated by mobile sensors (vehicles and cyclists) is sent over this heterogeneous communication architecture and processed in a central server, the Drivers Cloud, which is in charge of generating the messages shown on the drivers’ and cyclists’ Human Machine Interface (HMI). V-Alert was first tested in simulation to validate the communication architecture in a complex scenario; once validated, all the elements of V-Alert were moved to a real scenario to check the application’s reliability. All the results are presented throughout this paper. PMID:26230695
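
    The abstract does not describe the Drivers Cloud's actual algorithm, so the sketch below is purely hypothetical: the kind of collision-risk check such a server could run on positions and velocities reported by a vehicle and a cyclist. Every name and threshold is an assumption.

    ```python
    import math

    # Hypothetical sketch: predict the time and distance of closest approach
    # between a vehicle and a cyclist from their reported states, and alert
    # when both fall below assumed thresholds.

    def time_to_closest_approach(p_vehicle, v_vehicle, p_cyclist, v_cyclist):
        rx, ry = p_cyclist[0] - p_vehicle[0], p_cyclist[1] - p_vehicle[1]
        vx, vy = v_cyclist[0] - v_vehicle[0], v_cyclist[1] - v_vehicle[1]
        v2 = vx * vx + vy * vy
        if v2 == 0.0:
            return 0.0, math.hypot(rx, ry)
        t = max(0.0, -(rx * vx + ry * vy) / v2)   # time of closest approach
        d = math.hypot(rx + vx * t, ry + vy * t)  # separation at that time
        return t, d

    # Vehicle at 10 m/s approaching a slowly crossing cyclist 80 m ahead.
    t, d = time_to_closest_approach((0, 0), (10, 0), (80, -5), (0, 0.5))
    if d < 5.0 and t < 10.0:
        print(f"alert: predicted separation {d:.1f} m in {t:.1f} s")
    ```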

  15. Simulation model of a gear synchronisation unit for application in a real-time HiL environment

    NASA Astrophysics Data System (ADS)

    Kirchner, Markus; Eberhard, Peter

    2017-05-01

    Gear shifting simulations using the multibody system approach and the finite-element method are standard in the development of transmissions. However, the corresponding models are typically large due to the complex geometries and numerous contacts, which causes long calculation times. The present work sets itself apart from these detailed shifting simulations by proposing a much simpler but powerful synchronisation model that can be computed in real time while still being more realistic than a purely rigid multibody model. The model is therefore even used as part of a Hardware-in-the-Loop (HiL) test rig. The proposed real-time capable synchronisation model combines the rigid multibody system approach with a multiscale simulation approach: the multibody system approach is suitable for describing the large motions, while the multiscale approach, which also uses the finite-element method, is suitable for analysing the contact processes. An efficient contact search for the claws of a car transmission synchronisation unit is described in detail, which shortens the required calculation time of the model considerably. To further shorten the calculation time, the use of a complex pre-synchronisation model with a nonlinear contour is presented. The model has to provide realistic results at the time-step size of the HiL test rig; to meet this requirement, a specially adapted multirate method for the synchronisation model is shown. Results of the real-time capable synchronisation model are checked for plausibility against measurements from test rigs. The simulation model is then also used in the HiL test rig for a transmission control unit.
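
    A minimal sketch of the multirate idea, under assumptions: the state is advanced once per fixed HiL step while the stiff contact dynamics are sub-stepped with a smaller internal step. The spring-damper "contact force" and all step sizes below are illustrative, not the paper's model.

    ```python
    # Hypothetical sketch of a multirate scheme for a fixed HiL time step.

    def multirate_step(x, v, h_hil=1e-3, n_sub=10, k=5e4, c=50.0, m=1.0):
        """Advance position x and velocity v by one HiL step of size h_hil,
        integrating the fast contact dynamics with n_sub semi-implicit
        Euler substeps of size h_hil / n_sub."""
        h = h_hil / n_sub
        for _ in range(n_sub):
            force = -k * x - c * v   # stiff penalty-type contact force
            v += h * force / m       # update velocity first ...
            x += h * v               # ... then position (symplectic Euler)
        return x, v

    x, v = 1e-3, 0.0                 # initial penetration of 1 mm
    for step in range(5):            # five HiL steps of 1 ms each
        x, v = multirate_step(x, v)
        print(f"t = {(step + 1) * 1e-3:.3f} s, x = {x:+.2e} m")
    ```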

  16. AERIS - applications for the environment : real-time information synthesis : eco-lanes operational scenario modeling report.

    DOT National Transportation Integrated Search

    2014-12-01

    This report constitutes the detailed modeling and evaluation results of the Eco-Lanes Operational Scenario defined by the Applications for the Environment: Real-Time Information Synthesis (AERIS) Program. The Operational Scenario constitutes six appl...

  17. Bucket wheel rehabilitation of ERC 1400-30/7 high-capacity excavators from lignite quarries

    NASA Astrophysics Data System (ADS)

    Vîlceanu, Fl; Iancu, C.

    2016-11-01

    The existence of ERC 1400-30/7 bucket wheel excavators in lignite quarries whose service life has expired, or is in its final period, together with the high investment cost of their replacement, makes the effort to rehabilitate them in order to extend their life a rational one. Rehabilitation involves checking operational safety, based on relevant expertise regarding the effective load-bearing capacity of the supporting metal structures, but also replacing (or modernizing) subassemblies, which can increase excavation productivity, lower energy consumption and reduce mechanical stresses. This paper proposes an analysis of a constructive solution that reuses part of the classical bucket wheel, on which 9 cutting cups and 9 charger cups are located, and adds a new part, so that the redesigned bucket wheel contains 18 combined cutting-charger cups, compared to the classical model. A static and a dynamic FEA were performed on the CAD model of the bucket wheel; the results were compared with the yield strength of the material across the entire structure, the mechanical stresses were checked on the overall distribution map, and the first 4 vibration modes of the structure were verified against real loads. It was thus verified that the redesigned bucket wheel can accomplish the proposed goals, namely increased excavation productivity, lower energy consumption and reduced mechanical stresses.

  18. Real-time logic modelling on SpaceWire

    NASA Astrophysics Data System (ADS)

    Zhou, Qiang; Ma, Yunpeng; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. However, it cannot meet the determinism requirements of safety- and time-critical applications in spacecraft, where the delay of real-time (RT) message streams must be guaranteed. Therefore, SpaceWire-D has been developed to provide deterministic delivery over a SpaceWire network. Formal analysis and verification of real-time systems is critical to their development and safe implementation, and is a prerequisite for obtaining safety certification; failure to meet specified timing constraints, such as deadlines in hard real-time systems, may lead to catastrophic results. In this paper, a formal verification method, Real-Time Logic (RTL), is proposed to specify and verify the timing properties of a SpaceWire-D network. Based on the principles of the SpaceWire-D protocol, we first analyze the timing properties of fundamental transactions such as RMAP WRITE and RMAP READ. The RMAP WRITE transaction structure is then modeled in Real-Time Logic (RTL) and Presburger arithmetic representations, and the associated constraint graph and safety analysis are provided. Finally, it is suggested that the RTL method can be useful for protocol evaluation and for providing recommendations for further protocol evolution.
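
    To make the RTL notation concrete: @(e, i) denotes the time of the i-th occurrence of event e, and a deadline constraint takes the form @(done, i) - @(start, i) <= D. The sketch below checks such a constraint on a recorded trace; the event names, times and 10 ms bound are illustrative assumptions, not SpaceWire-D values.

    ```python
    # Hypothetical sketch: verify an RTL-style deadline constraint against a
    # timestamped event trace (assumes well-formed command/reply alternation).

    def check_deadline(trace, start_event, done_event, bound):
        """trace: list of (event_name, time); returns violating occurrence indices."""
        starts = [t for e, t in trace if e == start_event]
        dones = [t for e, t in trace if e == done_event]
        return [i for i, (s, d) in enumerate(zip(starts, dones)) if d - s > bound]

    trace = [("rmap_write_cmd", 0.000), ("rmap_write_reply", 0.004),
             ("rmap_write_cmd", 0.020), ("rmap_write_reply", 0.033)]
    violations = check_deadline(trace, "rmap_write_cmd", "rmap_write_reply",
                                bound=0.010)
    print(f"deadline violations at occurrences: {violations}")   # -> [1]
    ```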

  19. A finite element-based machine learning approach for modeling the mechanical behavior of the breast tissues under compression in real-time.

    PubMed

    Martínez-Martínez, F; Rupérez-Moreno, M J; Martínez-Sober, M; Solves-Llorens, J A; Lorente, D; Serrano-López, A J; Martínez-Sanchis, S; Monserrat, C; Martín-Guerrero, J D

    2017-11-01

    This work presents a data-driven method to simulate, in real time, the biomechanical behavior of the breast tissues in image-guided interventions such as biopsies or radiotherapy dose delivery, as well as to speed up multimodal registration algorithms. Ten real breasts were used for this work. Their deformation due to the displacement of two compression plates was simulated off-line using the finite element (FE) method. Three machine learning models were trained with the data from those simulations and then used to predict, in real time, the deformation of the breast tissues during compression. The models were a decision tree and two tree-based ensemble methods (extremely randomized trees and random forest). Two different experimental setups were designed to validate and study the performance of these models under different conditions. The mean 3D Euclidean distance between nodes predicted by the models and those extracted from the FE simulations was calculated to assess the performance of the models on the validation set. The experiments proved that extremely randomized trees performed better than the other two models. The mean error committed by the three models in the prediction of the nodal displacements was under 2 mm, a threshold usually set for clinical applications. The time needed for breast compression prediction is sufficiently short to allow its use in real time (<0.2 s).
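
    A minimal sketch of the surrogate-modeling recipe, assuming synthetic stand-in data (the real FE results are not available here): train an extremely randomized trees regressor on feature-to-displacement pairs from offline simulations, then use its near-instant predictions in place of the FE solve.

    ```python
    import numpy as np
    from sklearn.ensemble import ExtraTreesRegressor
    from sklearn.model_selection import train_test_split

    # Hypothetical sketch: the synthetic (X, y) below stands in for features
    # (e.g. node coordinates + plate displacement) and 3D nodal displacements
    # harvested from offline FE simulations.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(5000, 4))
    y = np.column_stack([X[:, 3] * np.sin(X[:, 0]),   # fake 3D displacements
                         X[:, 3] * X[:, 1],
                         0.1 * X[:, 2]])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = ExtraTreesRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)

    pred = model.predict(X_te)                 # near-instant per sample
    err = np.linalg.norm(pred - y_te, axis=1)  # 3D Euclidean error per node
    print(f"mean 3D error on validation set: {err.mean():.4f}")
    ```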

  20. Check-Up of Planet Earth at the Turn of the Millennium: Anticipated New Phase in Earth Sciences

    NASA Technical Reports Server (NTRS)

    Kaufman, Y. J.; Ramanathan, V.

    1998-01-01

    Langley's remarkable solar and lunar spectra collected from Mt. Whitney inspired Arrhenius to develop the first quantitative climate model in 1896. In 1999, NASA's Earth Observing AM Satellite (EOS-AM) will repeat Langley's experiment, but for the entire planet, thus pioneering calibrated spectral observations from space. Conceived in response to real environmental problems, EOS-AM, in conjunction with other international satellite efforts, will fill a major gap in current efforts by providing quantitative global data sets, with a resolution of a few kilometers, on the physical, chemical and biological elements of the earth system. Thus, like Langley's data, EOS-AM can revolutionize climate research by inspiring a new generation of climate system models and enabling us to assess the human impact on the environment.
