Sample records for theoretic model checking

  1. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    As the complexity of software systems continues to grow, engineers face two serious problems: state space explosion and the difficulty of debugging systems. In this paper, we propose a game-theoretic approach to full branching-time model checking on three-valued semantics. Three-valued models and logics provide an abstraction that overcomes the state space explosion problem. The game-style model checking generates counter-examples that can guide refinement or identify validated formulas, which addresses the debugging problem. Furthermore, the output of our game-style method gives engineers significant information for detecting where errors have occurred and what their causes are.
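The three-valued semantics can be sketched with Kleene logic, in which a third truth value stands for information lost by abstraction; an UNKNOWN verdict on a formula signals that the abstraction must be refined. A minimal illustration (not the paper's game-based algorithm):

```python
# Kleene three-valued logic: True, False, and UNKNOWN (modeled as None).
# Under abstraction, an UNKNOWN verdict means the abstract model is too
# coarse to decide the property, so refinement is needed.

UNKNOWN = None

def and3(a, b):
    """Three-valued conjunction: False dominates, UNKNOWN propagates."""
    if a is False or b is False:
        return False
    if a is True and b is True:
        return True
    return UNKNOWN

def or3(a, b):
    """Three-valued disjunction: True dominates, UNKNOWN propagates."""
    if a is True or b is True:
        return True
    if a is False and b is False:
        return False
    return UNKNOWN

def not3(a):
    """Three-valued negation leaves UNKNOWN unchanged."""
    return UNKNOWN if a is UNKNOWN else (not a)
```

Note that `and3(False, UNKNOWN)` is definitely `False`: a definite answer on the abstract model carries over to the concrete one, while UNKNOWN results mark exactly where the abstract-check-refine loop must do more work.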

  2. Pharmacist and Technician Perceptions of Tech-Check-Tech in Community Pharmacy Practice Settings.

    PubMed

    Frost, Timothy P; Adams, Alex J

    2018-04-01

    Tech-check-tech (TCT) is a practice model in which pharmacy technicians with advanced training can perform final verification of prescriptions that have been previously reviewed for appropriateness by a pharmacist. Few states have adopted TCT, in part because of the common view that this model is controversial among members of the profession. This article aims to summarize the existing research on pharmacist and technician perceptions of community pharmacy-based TCT. A literature review was conducted using MEDLINE (January 1990 to August 2016) and Google Scholar (January 1990 to August 2016) using the terms "tech* and check," "tech-check-tech," "checking technician," and "accuracy checking tech*." Of the 7 studies identified, we found general agreement among both pharmacists and technicians that TCT in community pharmacy settings can be safely performed. This agreement persisted in studies of theoretical TCT models and in studies assessing participants in actual community-based TCT models. Pharmacists who had previously worked with a checking technician were generally more favorable toward TCT. Both pharmacists and technicians in community pharmacy settings generally perceived TCT to be safe, in both theoretical surveys and in surveys following actual TCT demonstration projects. These perceptions of safety align well with the actual outcomes achieved from community pharmacy TCT studies.

  3. Non-equilibrium dog-flea model

    NASA Astrophysics Data System (ADS)

    Ackerson, Bruce J.

    2017-11-01

    We develop the open dog-flea model to serve as a check on proposed non-equilibrium theories of statistical mechanics. The model is developed in detail and then applied to four recent models for non-equilibrium statistical mechanics. Comparing the dog-flea solution with these models allows their claims to be checked and provides a concrete example for evaluating the theoretical models.
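The underlying (closed) dog-flea model of the Ehrenfests is simple to simulate directly; the sketch below is a hypothetical illustration, not the paper's open-model construction, showing relaxation of the flea count toward the equal-sharing equilibrium:

```python
import random

def dog_flea(n_fleas=50, steps=20000, seed=1):
    """Ehrenfest 'dog-flea' model: at each step one flea, chosen
    uniformly at random, jumps to the other dog. Starting far from
    equilibrium (all fleas on dog A), the time-averaged count on dog A
    should approach n_fleas / 2."""
    random.seed(seed)
    on_a = n_fleas          # all fleas start on dog A
    total = 0
    for _ in range(steps):
        # the chosen flea sits on dog A with probability on_a / n_fleas
        if random.randrange(n_fleas) < on_a:
            on_a -= 1       # it jumps A -> B
        else:
            on_a += 1       # it jumps B -> A
        total += on_a
    return total / steps
```

With 50 fleas the long-run average hovers near 25, the equilibrium value, which is the kind of exact benchmark against which non-equilibrium theories can be checked.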

  4. Compositional schedulability analysis of real-time actor-based systems.

    PubMed

    Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan

    2017-01-01

    We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies (e.g., "earliest deadline first") that can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.
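For the classical special case of independent periodic tasks under "earliest deadline first" with deadlines equal to periods, schedulability reduces to a utilization bound; a textbook baseline, far simpler than the compositional automata-theoretic analysis described above:

```python
def edf_schedulable(tasks):
    """Liu-Layland EDF test for periodic tasks whose deadlines equal
    their periods: the task set is schedulable if and only if total
    processor utilization does not exceed 1.
    `tasks` is a list of (wcet, period) pairs."""
    return sum(wcet / period for wcet, period in tasks) <= 1.0

# three tasks using roughly 70% of the processor: schedulable under EDF
print(edf_schedulable([(1, 4), (2, 6), (1, 8)]))  # True
```

The actor setting above is harder precisely because tasks are messages arriving under data-dependent patterns and per-actor policies, which is why model checking of behavioral interfaces replaces a closed-form test.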

  5. Query Language for Location-Based Services: A Model Checking Approach

    NASA Astrophysics Data System (ADS)

    Hoareau, Christian; Satoh, Ichiro

    We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once extended to a semantic model for modal logic, we regard location query processing as a model checking problem, and thus define location queries as hybrid logic-based formulas. Our approach differs from existing research in that it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and discussed.
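The idea of a symbolic space model organized as a tree can be sketched as follows: a query such as "is room102 somewhere within the building?" becomes a check over the tree, loosely analogous to evaluating a modal formula over a model. All names below are illustrative, not the paper's query language:

```python
class Location:
    """A node in a symbolic location tree (e.g. building > floor > room)."""

    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def contains(self, target):
        """Check whether `target` lies in the subtree rooted here,
        i.e. answer the location query 'target within self'."""
        if self.name == target:
            return True
        return any(c.contains(target) for c in self.children)

# an illustrative symbolic space model
building = Location("building", [
    Location("floor1", [Location("room101"), Location("room102")]),
    Location("floor2", [Location("room201")]),
])
```

A richer query language adds operators over this structure (adjacency, paths, quantification over entities), at which point evaluating a query really is model checking a formula against the spatial model.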

  6. REVIEWS OF TOPICAL PROBLEMS: Radio pulsars

    NASA Astrophysics Data System (ADS)

    Beskin, Vasilii S.

    1999-11-01

    Recent theoretical work concerning the magnetosphere of and radio emission from pulsars is reviewed in detail. Taking into account years of little or no cooperation between theory and observation and noting, in particular, that no systematic observations are in fact being made to check theoretical predictions, the key ideas underlying the theory of the pulsar magnetosphere are formulated and new observations aimed at verifying current models are discussed.

  7. Evolution, Nucleosynthesis, and Yields of Low-mass Asymptotic Giant Branch Stars at Different Metallicities. II. The FRUITY Database

    NASA Astrophysics Data System (ADS)

    Cristallo, S.; Piersanti, L.; Straniero, O.; Gallino, R.; Domínguez, I.; Abia, C.; Di Rico, G.; Quintini, M.; Bisterzo, S.

    2011-12-01

    By using updated low-mass stellar models, we systematically investigate the nucleosynthesis processes occurring in asymptotic giant branch (AGB) stars. In this paper, we present a database dedicated to the nucleosynthesis of AGB stars: FRANEC Repository of Updated Isotopic Tables & Yields (FRUITY). An interactive Web-based interface allows users to freely download the full (from H to Bi) isotopic composition, as it changes after each third dredge-up (TDU) episode, as well as the stellar yields the models produce. A first set of AGB models, having masses in the range 1.5 ≤ M/M⊙ ≤ 3.0 and metallicities 1 × 10⁻³ ≤ Z ≤ 2 × 10⁻², is discussed. For each model, a detailed description of the physical and the chemical evolution is provided. In particular, we illustrate the details of the s-process and we evaluate the theoretical uncertainties due to the parameterization adopted to model convection and mass loss. The resulting nucleosynthesis scenario is checked by comparing the theoretical [hs/ls] and [Pb/hs] ratios to those obtained from the available abundance analysis of s-enhanced stars. On average, the variation with metallicity of these spectroscopic indices is well reproduced by theoretical models, although the predicted spread at a given metallicity is substantially smaller than the observed one. Possible explanations for such a difference are briefly discussed. An independent check of the TDU efficiency is provided by the C-stars luminosity function. Consequently, theoretical C-stars luminosity functions for the Galactic disk and the Magellanic Clouds have been derived. We generally find good agreement with observations.

  8. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  9. Model Checking with Edge-Valued Decision Diagrams

    NASA Technical Reports Server (NTRS)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library. We provide efficient algorithms for manipulating EVMDDs and review the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools. Compared to the CUDD package, our tool is several orders of magnitude faster.

  10. Stability of Castering Wheels for Aircraft Landing Gears

    NASA Technical Reports Server (NTRS)

    Kantrowitz, Arthur

    1940-01-01

    A theoretical study was made of the shimmy of castering wheels. The theory is based on the discovery of a phenomenon called kinematic shimmy. Experimental checks, use being made of a model having low-pressure tires, are reported and the applicability of the results to full scale is discussed. Theoretical methods of estimating the spindle viscous damping and the spindle solid friction necessary to avoid shimmy are given. A new method of avoiding shimmy -- lateral freedom -- is introduced.

  11. Model-Checking with Edge-Valued Decision Diagrams

    NASA Technical Reports Server (NTRS)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library along with state-of-the-art algorithms for building the transition relation and the state space of discrete state systems. We provide efficient algorithms for manipulating EVMDDs and give upper bounds of the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools: EVMDDs for encoding arithmetic expressions, identity-reduced MDDs for representing the transition relation, and the saturation algorithm for reachability analysis. We compare our new symbolic model checking EVMDD library with the widely used CUDD package and show that, in many cases, our tool is several orders of magnitude faster than CUDD.
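The core idea of edge-valued encodings, attaching additive values to edges so that a function value is recovered as a sum along a single path, can be illustrated for a linear function over small integer domains. Real EVMDDs also share sub-graphs and normalize edge values; this sketch shows only the path-sum principle, not the paper's data structure:

```python
def build_levels(coeffs, domain_sizes):
    """Encode f(x1..xn) = sum(c_i * x_i) as per-level edge values:
    taking value v at level i contributes edge value coeffs[i] * v."""
    return [[c * v for v in range(d)] for c, d in zip(coeffs, domain_sizes)]

def evaluate(levels, assignment):
    """Follow the unique path selected by `assignment`, summing the
    edge values along the way to recover the function value."""
    return sum(level[v] for level, v in zip(levels, assignment))

# f(x, y, z) = 3x + 5y + 2z with x, y, z in 0..3
levels = build_levels([3, 5, 2], [4, 4, 4])
```

The payoff of the real structure is sharing: where a multi-terminal diagram needs a distinct terminal per function value, an edge-valued diagram keeps one shared graph and pushes the arithmetic onto edges, which is what makes the symbolic encoding compact.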

  12. Enhanced self-monitoring blood glucose in non-insulin requiring Type 2 diabetes: A qualitative study in primary care.

    PubMed

    Brackney, Dana Elisabeth

    2018-03-31

    To contribute to both theoretical and practical understanding of the role of self-monitoring blood glucose for self-management by describing the experience of people with non-insulin requiring Type 2 diabetes in an enhanced structured self-monitoring blood glucose intervention. The complex context of self-monitoring blood glucose in Type 2 diabetes requires a deeper understanding of the clients' illness experience with structured self-monitoring of blood glucose. Clients' numeracy skills contribute to their response to blood glucose readings. Nurses' use of motivational interviewing to increase clients' regulatory self-efficacy is important to the theoretical perspective of the study. A qualitative descriptive study. A purposive sample of eleven adults recently (<2 years) diagnosed with non-insulin requiring Type 2 diabetes who had experienced a structured self-monitoring blood glucose intervention participated in this study. Audio recordings of semi-structured interviews and photos of logbooks were analyzed for themes using constant comparison and member checking. The illness experience states of Type 2 diabetes include 'Diagnosis', 'Behavior change', and 'Routine checking'. People check blood glucose to confirm their Type 2 diabetes diagnosis, to console their diabetes-related fears, to create personal explanations of health behavior's impact on blood glucose, to activate behavior change and to congratulate their diabetes self-management efforts. These findings support the Transtheoretical model's stages of change and change processes. Blood glucose checking strengthens the relationships between theoretical concepts found in Diabetes Self-management Education-Support including: engagement, information sharing, and behavioral support. This article is protected by copyright. All rights reserved.

  13. Modeling Criterion Shifts and Target Checking in Prospective Memory Monitoring

    ERIC Educational Resources Information Center

    Horn, Sebastian S.; Bayen, Ute J.

    2015-01-01

    Event-based prospective memory (PM) involves remembering to perform intended actions after a delay. An important theoretical issue is whether and how people monitor the environment to execute an intended action when a target event occurs. Performing a PM task often increases the latencies in ongoing tasks. However, little is known about the…

  14. General discussion on g-2

    NASA Astrophysics Data System (ADS)

    Knecht, Marc

    2018-05-01

    Progress made on the theoretical aspects of the standard model contributions to the anomalous magnetic moment of the charged leptons since the first FCCP Workshop on Capri in 2015 is reviewed. Emphasis is in particular given to the various cross-checks that have already become available, or might become available in the future, for several important contributions.

  15. Dynamics of a Class of HIV Infection Models with Cure of Infected Cells in Eclipse Stage.

    PubMed

    Maziane, Mehdi; Lotfi, El Mehdi; Hattaf, Khalid; Yousfi, Noura

    2015-12-01

    In this paper, we propose two HIV infection models with specific nonlinear incidence rate by including a class of infected cells in the eclipse phase. The first model is described by ordinary differential equations (ODEs) and generalizes a set of previously existing models and their results. The second model extends our ODE model by taking into account the diffusion of virus. Furthermore, the global stability of both models is investigated by constructing suitable Lyapunov functionals. Finally, we check our theoretical results with numerical simulations.
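A numerical check of this kind can be sketched with a basic target-cell-limited viral dynamics model (ordinary differential equations for target cells T, infected cells I, and free virus V), integrated by the forward Euler method; with the illustrative parameters below the basic reproduction number is well under one, so the trajectory should converge to the infection-free equilibrium. The model form and all parameter values are assumptions for illustration, not the paper's exact system:

```python
def simulate(beta=1e-5, lam=10.0, d=0.1, delta=0.5, p=5.0, c=5.0,
             T0=1000.0, I0=1.0, V0=1.0, dt=0.01, steps=200000):
    """Forward-Euler integration of
        T' = lam - d*T - beta*T*V   (target cells)
        I' = beta*T*V - delta*I     (infected cells)
        V' = p*I - c*V              (free virus)
    Here R0 = beta*lam*p / (d*delta*c) = 0.002 < 1, so infection
    should die out and T should settle at lam/d = 100."""
    T, I, V = T0, I0, V0
    for _ in range(steps):
        dT = lam - d * T - beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
    return T, I, V
```

A run like this is the numerical counterpart of the Lyapunov argument: the analysis proves global stability, and the simulation confirms that concrete trajectories behave as predicted.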

  16. Algebraic model checking for Boolean gene regulatory networks.

    PubMed

    Tran, Quoc-Nam

    2011-01-01

    We present a computational method in which modular and Groebner bases (GB) computation in Boolean rings are used for solving problems in Boolean gene regulatory networks (BN). In contrast to other known algebraic approaches, the degree of intermediate polynomials during the calculation of Groebner bases using our method will never grow, resulting in a significant improvement in running time and memory space consumption. We also show how calculation in temporal logic for model checking can be done by means of our direct and efficient Groebner basis computation in Boolean rings. We present our experimental results in finding attractors and control strategies of Boolean networks to illustrate our theoretical arguments. The results are promising. Our algebraic approach is more efficient than the state-of-the-art model checker NuSMV on BNs. More importantly, our approach finds all solutions for the BN problems.
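For small networks, the attractors that the algebraic method computes can also be found by brute force: enumerate all 2^n states of a synchronous Boolean network and follow each trajectory into its cycle. A baseline sketch with an illustrative two-gene network:

```python
from itertools import product

def find_attractors(update, n):
    """Enumerate all 2^n states of a synchronous Boolean network;
    follow each trajectory until a state repeats, then extract the
    cycle (attractor) it has entered."""
    attractors = set()
    for state in product((0, 1), repeat=n):
        seen = set()
        s = state
        while s not in seen:          # walk until the trajectory loops
            seen.add(s)
            s = update(s)
        cycle = []                    # collect the cycle starting at s
        t = s
        while True:
            cycle.append(t)
            t = update(t)
            if t == s:
                break
        attractors.add(frozenset(cycle))
    return attractors

# illustrative network: x' = y, y' = x
# (two fixed points plus one 2-cycle)
upd = lambda s: (s[1], s[0])
```

This exhaustive search is exponential in the number of genes, which is exactly the scaling the Groebner-basis formulation is designed to beat, while still finding all attractors rather than a single witness.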

  17. Multi-Level Aspects of Social Cohesion of Secondary Schools and Pupils' Feelings of Safety

    ERIC Educational Resources Information Center

    Mooij, Ton; Smeets, Ed; de Wit, Wouter

    2011-01-01

    Background: School safety and corresponding feelings of both pupils and school staff are beginning to receive more and more attention. The social cohesion characteristics of a school may be useful in promoting feelings of safety, particularly in pupils. Aims: To conceptualize theoretically, and check empirically a two-level model of social…

  18. Accurate LC peak boundary detection for ¹⁶O/¹⁸O labeled LC-MS data.

    PubMed

    Cui, Jian; Petritis, Konstantinos; Tegeler, Tony; Petritis, Brianne; Ma, Xuepo; Jin, Yufang; Gao, Shou-Jiang S J; Zhang, Jianqiu Michelle

    2013-01-01

    In liquid chromatography-mass spectrometry (LC-MS), parts of LC peaks are often corrupted by their co-eluting peptides, which results in increased quantification variance. In this paper, we propose to apply accurate LC peak boundary detection to remove the corrupted part of LC peaks. Accurate LC peak boundary detection is achieved by checking the consistency of intensity patterns within peptide elution time ranges. In addition, we remove peptides with erroneous mass assignment through model fitness check, which compares observed intensity patterns to theoretically constructed ones. The proposed algorithm can significantly improve the accuracy and precision of peptide ratio measurements.
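The consistency idea can be sketched as follows: compute a reference isotopic intensity pattern for the elution window (here a per-column median) and trim scans at either boundary whose pattern correlates poorly with it. The threshold and pattern representation are illustrative assumptions, not the authors' exact algorithm:

```python
def pearson(a, b):
    """Pearson correlation between two equal-length intensity vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def trim_boundaries(patterns, threshold=0.9):
    """patterns: per-scan isotopic intensity vectors across one LC peak.
    Returns (start, end) indices of the consistent core, dropping
    boundary scans whose pattern disagrees with the median pattern.
    Assumes at least one scan passes the threshold."""
    ref = [sorted(col)[len(col) // 2] for col in zip(*patterns)]
    ok = [pearson(p, ref) >= threshold for p in patterns]
    start = ok.index(True)
    end = len(ok) - 1 - ok[::-1].index(True)
    return start, end

# first and last scans corrupted by a co-eluting peptide
pats = [[1, 9, 1], [10, 5, 1], [8, 4, 1], [9, 5, 2], [1, 1, 9]]
```

Trimming the corrupted boundary scans before integrating peak areas is what reduces the quantification variance the abstract describes.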

  19. Accurate LC Peak Boundary Detection for ¹⁶O/¹⁸O Labeled LC-MS Data

    PubMed Central

    Cui, Jian; Petritis, Konstantinos; Tegeler, Tony; Petritis, Brianne; Ma, Xuepo; Jin, Yufang; Gao, Shou-Jiang (SJ); Zhang, Jianqiu (Michelle)

    2013-01-01

    In liquid chromatography-mass spectrometry (LC-MS), parts of LC peaks are often corrupted by their co-eluting peptides, which results in increased quantification variance. In this paper, we propose to apply accurate LC peak boundary detection to remove the corrupted part of LC peaks. Accurate LC peak boundary detection is achieved by checking the consistency of intensity patterns within peptide elution time ranges. In addition, we remove peptides with erroneous mass assignment through model fitness check, which compares observed intensity patterns to theoretically constructed ones. The proposed algorithm can significantly improve the accuracy and precision of peptide ratio measurements. PMID:24115998

  20. Model Checking a Byzantine-Fault-Tolerant Self-Stabilizing Protocol for Distributed Clock Synchronization Systems

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2007-01-01

    This report presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV) [SMV]. The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space. Also, additional innovative state space reduction techniques are introduced that can be used in future verification efforts applied to this and other protocols.

  21. Timing analysis by model checking

    NASA Technical Reports Server (NTRS)

    Naydich, Dimitri; Guaspari, David

    2000-01-01

    The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.
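The conservative finite abstraction of a clock can be illustrated by saturating the clock at the deadline of interest: once the bound is reached, exact values no longer matter for the property "every path reaches completion within D ticks", so a breadth-first search over (state, saturated-clock) pairs is finite. The transition system below is illustrative, not the paper's model:

```python
from collections import deque

def check_deadline(transitions, start, done_states, deadline):
    """Explore (state, tick) pairs breadth-first, with the tick counter
    saturated at `deadline`: return True iff every path from `start`
    reaches a state in `done_states` within `deadline` one-tick steps."""
    frontier = deque([(start, 0)])
    seen = set()
    while frontier:
        state, t = frontier.popleft()
        if state in done_states:
            continue                  # this path met its deadline
        if t >= deadline:
            return False              # clock saturated: deadline missed
        if (state, t) in seen:
            continue
        seen.add((state, t))
        for nxt in transitions[state]:
            frontier.append((nxt, t + 1))
    return True

# illustrative three-step task: idle -> run -> finish -> done
task = {"idle": ["run"], "run": ["finish"], "finish": ["done"], "done": []}
```

Saturation is conservative in the required sense: the finite model can only over-approximate how late a path runs, so if the bounded search reports success, the unbounded-clock model meets the deadline too.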

  22. The Kramers-Kronig relations for usual and anomalous Poisson-Nernst-Planck models.

    PubMed

    Evangelista, Luiz Roberto; Lenzi, Ervin Kaminski; Barbero, Giovanni

    2013-11-20

    The consistency of the frequency response predicted by a class of electrochemical impedance expressions is analytically checked by invoking the Kramers-Kronig (KK) relations. These expressions are obtained in the context of Poisson-Nernst-Planck usual or anomalous diffusional models that satisfy Poisson's equation in a finite length situation. The theoretical results, besides being successful in interpreting experimental data, are also shown to obey the KK relations when these relations are modified accordingly.
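For reference, the Kramers-Kronig relations invoked as the consistency check connect the real and imaginary parts of the impedance of a causal, linear, stable system; in the standard singularity-subtracted form for an impedance Z(ω):

```latex
\operatorname{Re} Z(\omega) = \operatorname{Re} Z(\infty)
  + \frac{2}{\pi}\,\mathcal{P}\!\int_{0}^{\infty}
    \frac{x \operatorname{Im} Z(x) - \omega \operatorname{Im} Z(\omega)}
         {x^{2}-\omega^{2}}\,dx ,
\qquad
\operatorname{Im} Z(\omega) = -\frac{2\omega}{\pi}\,\mathcal{P}\!\int_{0}^{\infty}
    \frac{\operatorname{Re} Z(x) - \operatorname{Re} Z(\omega)}
         {x^{2}-\omega^{2}}\,dx ,
```

where P denotes the Cauchy principal value. An impedance expression derived from a Poisson-Nernst-Planck model must satisfy both relations to be consistent with causality, which is the check the abstract describes.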

  23. Move me, astonish me… delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates

    NASA Astrophysics Data System (ADS)

    Pelowski, Matthew; Markey, Patrick S.; Forster, Michael; Gerger, Gernot; Leder, Helmut

    2017-07-01

    This paper has a rather audacious purpose: to present a comprehensive theory explaining, and further providing hypotheses for the empirical study of, the multiple ways by which people respond to art. Despite common agreement that interaction with art can be based on a compelling, and occasionally profound, psychological experience, the nature of these interactions is still under debate. We propose a model, The Vienna Integrated Model of Art Perception (VIMAP), with the goal of resolving the multifarious processes that can occur when we perceive and interact with visual art. Specifically, we focus on the need to integrate bottom-up, artwork-derived processes, which have formed the bulk of previous theoretical and empirical assessments, with top-down mechanisms which can describe how individuals adapt or change within their processing experience, and thus how individuals may come to particularly moving, disturbing, transformative, as well as mundane, results. This is achieved by combining several recent lines of theoretical research into a new integrated approach built around three processing checks, which we argue can be used to systematically delineate the possible outcomes in art experience. We also connect our model's processing stages to specific hypotheses for emotional, evaluative, and physiological factors, and address main topics in psychological aesthetics, including provocative reactions (chills, awe, thrills, the sublime) and the difference between "aesthetic" and "everyday" emotional response. Finally, we take the needed step of connecting stages to functional regions in the brain, as well as broader core networks that may coincide with the proposed cognitive checks, and which taken together can serve as a basis for future empirical and theoretical art research.

  24. Move me, astonish me… delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates.

    PubMed

    Pelowski, Matthew; Markey, Patrick S; Forster, Michael; Gerger, Gernot; Leder, Helmut

    2017-07-01

    This paper has a rather audacious purpose: to present a comprehensive theory explaining, and further providing hypotheses for the empirical study of, the multiple ways by which people respond to art. Despite common agreement that interaction with art can be based on a compelling, and occasionally profound, psychological experience, the nature of these interactions is still under debate. We propose a model, The Vienna Integrated Model of Art Perception (VIMAP), with the goal of resolving the multifarious processes that can occur when we perceive and interact with visual art. Specifically, we focus on the need to integrate bottom-up, artwork-derived processes, which have formed the bulk of previous theoretical and empirical assessments, with top-down mechanisms which can describe how individuals adapt or change within their processing experience, and thus how individuals may come to particularly moving, disturbing, transformative, as well as mundane, results. This is achieved by combining several recent lines of theoretical research into a new integrated approach built around three processing checks, which we argue can be used to systematically delineate the possible outcomes in art experience. We also connect our model's processing stages to specific hypotheses for emotional, evaluative, and physiological factors, and address main topics in psychological aesthetics, including provocative reactions (chills, awe, thrills, the sublime) and the difference between "aesthetic" and "everyday" emotional response. Finally, we take the needed step of connecting stages to functional regions in the brain, as well as broader core networks that may coincide with the proposed cognitive checks, and which taken together can serve as a basis for future empirical and theoretical art research. Copyright © 2017 Elsevier B.V. All rights reserved.

  25. Outer crust of nonaccreting cold neutron stars

    NASA Astrophysics Data System (ADS)

    Rüster, Stefan B.; Hempel, Matthias; Schaffner-Bielich, Jürgen

    2006-03-01

    The properties of the outer crust of nonaccreting cold neutron stars are studied by using modern nuclear data and theoretical mass tables, updating in particular the classic work of Baym, Pethick, and Sutherland. Experimental data from the atomic mass table from Audi, Wapstra, and Thibault of 2003 are used and a thorough comparison of many modern theoretical nuclear models, both relativistic and nonrelativistic, is performed for the first time. In addition, the influences of pairing and deformation are investigated. State-of-the-art theoretical nuclear mass tables are compared to check their differences concerning the neutron drip line, magic neutron numbers, the equation of state, and the sequence of neutron-rich nuclei up to the drip line in the outer crust of nonaccreting cold neutron stars.

  26. The impact of gender on the assessment of body checking behavior.

    PubMed

    Alfano, Lauren; Hildebrandt, Tom; Bannon, Katie; Walker, Catherine; Walton, Kate E

    2011-01-01

    Body checking includes any behavior aimed at global or specific evaluations of appearance characteristics. Men and women are believed to express these behaviors differently, possibly reflecting different socialization. However, there has been no empirical test of the impact of gender on body checking. A total of 1024 male and female college students completed two measures of body checking, the Body Checking Questionnaire and the Male Body Checking Questionnaire. Using multiple group confirmatory factor analysis, differential item functioning (DIF) was explored in a composite of these measures. Two global latent factors were identified (female and male body checking severity), and there were expected gender differences in these factors even after controlling for DIF. Ten items were found to be unbiased by gender and provide a suitable brief measure of body checking for mixed gender research. Practical applications for body checking assessment and theoretical implications are discussed. Copyright © 2010 Elsevier Ltd. All rights reserved.

  27. Symbolic Heuristic Search for Factored Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Morris, Robert (Technical Monitor); Feng, Zheng-Zhu; Hansen, Eric A.

    2003-01-01

    We describe a planning algorithm that integrates two approaches to solving Markov decision processes with large state spaces. State abstraction is used to avoid evaluating states individually. Forward search from a start state, guided by an admissible heuristic, is used to avoid evaluating all states. We combine these two approaches in a novel way that exploits symbolic model-checking techniques and demonstrates their usefulness for decision-theoretic planning.

  28. Personal, Family and School Influences on Secondary Pupils' Feelings of Safety at School, in the School Surroundings and at Home

    ERIC Educational Resources Information Center

    Mooij, Ton

    2012-01-01

    Different types of variables seem to influence school safety and a pupil's feelings of safety at school. The research question asks which risk and promotive variables should be integrated in a theoretical model to predict a pupil's feelings of safety at school, in the school surroundings and at home; what the outcomes are of an empirical check of…

  9. Outer crust of nonaccreting cold neutron stars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruester, Stefan B.; Hempel, Matthias; Schaffner-Bielich, Juergen

    The properties of the outer crust of nonaccreting cold neutron stars are studied by using modern nuclear data and theoretical mass tables, updating in particular the classic work of Baym, Pethick, and Sutherland. Experimental data from the 2003 atomic mass table of Audi, Wapstra, and Thibault are used, and a thorough comparison of many modern theoretical nuclear models, both relativistic and nonrelativistic, is performed for the first time. In addition, the influences of pairing and deformation are investigated. State-of-the-art theoretical nuclear mass tables are compared to check their differences concerning the neutron drip line, magic neutron numbers, the equation of state, and the sequence of neutron-rich nuclei up to the drip line in the outer crust of nonaccreting cold neutron stars.

  10. Virtual deposition plant

    NASA Astrophysics Data System (ADS)

    Tikhonravov, Alexander

    2005-09-01

    A general structure of the software for computational manufacturing experiments is discussed. It is shown that computational experiments can be useful for checking feasibility properties of theoretical designs and for finding the most practical theoretical design for a given production environment.

  11. New accurate theoretical line lists of 12CH4 and 13CH4 in the 0-13400 cm-1 range: Application to the modeling of methane absorption in Titan's atmosphere

    NASA Astrophysics Data System (ADS)

    Rey, Michaël; Nikitin, Andrei V.; Bézard, Bruno; Rannou, Pascal; Coustenis, Athena; Tyuterev, Vladimir G.

    2018-03-01

    The spectrum of methane is very important for the analysis and modeling of Titan's atmosphere, but its insufficient knowledge in the near infrared, with the absence of reliable absorption coefficients, is an important limitation. To help the astronomy community analyze high-quality spectra, we report in the present work the first accurate theoretical methane line lists (T = 50-350 K) of 12CH4 and 13CH4 up to 13400 cm-1 (λ > 0.75 μm). These lists are built from extensive variational calculations using our recent ab initio potential and dipole moment surfaces and will be freely accessible via the TheoReTS information system (http://theorets.univ-reims.fr, http://theorets.tsu.ru). Validation of these lists is presented throughout the present paper. For the sample of lines whose upper energies were available from published analyses of experimental laboratory 12CH4 spectra, small empirical corrections in positions were introduced that could be useful for future high-resolution applications. We finally apply the TheoReTS line lists to model Titan spectra as observed by VIMS and by DISR, onboard Cassini and Huygens respectively. These data are used to check that the TheoReTS line lists are able to model observations. We also make comparisons with other experimental or theoretical line lists. TheoReTS gives very reliable results, better than ExoMol and even than HITRAN2012, except around 1.6 μm, where it gives very similar results. We conclude that TheoReTS is suitable for the modeling of planetary radiative transfer and photometry. A re-analysis of spectra recorded by the DISR instrument during the descent of the Huygens probe suggests that the CH4 mixing ratio decreases with altitude in Titan's stratosphere, reaching a value of ∼10-2 above 110 km altitude.

  12. Statistical Anomalies of Bitflips in SRAMs to Discriminate SBUs From MCUs

    NASA Astrophysics Data System (ADS)

    Clemente, Juan Antonio; Franco, Francisco J.; Villa, Francesca; Baylac, Maud; Rey, Solenne; Mecha, Hortensia; Agapito, Juan A.; Puchner, Helmut; Hubert, Guillaume; Velazco, Raoul

    2016-08-01

    Recently, the occurrence of multiple events in static tests has been investigated by checking the statistical distribution of the difference between the addresses of the words containing bitflips. That method has been successfully applied to Field Programmable Gate Arrays (FPGAs) and the original authors indicate that it is also valid for SRAMs. This paper presents a modified methodology that is based on checking the XORed addresses with bitflips, rather than on the difference. Irradiation tests on CMOS 130 & 90 nm SRAMs with 14-MeV neutrons have been performed to validate this methodology. Results in high-altitude environments are also presented and cross-checked with theoretical predictions. In addition, this methodology has also been used to detect modifications in the organization of said memories. Theoretical predictions have been validated with actual data provided by the manufacturer.
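
    The core of the modified methodology, XORing the addresses of words containing bitflips rather than subtracting them, can be sketched in a few lines. This is only an illustrative reconstruction, not the authors' statistical test, and the address values are invented for the example:

```python
from collections import Counter
from itertools import combinations

def xor_signature_counts(addresses):
    """Count the pairwise XOR of all word addresses that contained
    bitflips. Under purely random single-bit upsets the XOR values
    spread out; an over-represented value is the statistical anomaly
    that hints at multiple-cell upsets or reveals the memory's
    internal address organization."""
    return Counter(a ^ b for a, b in combinations(addresses, 2))

# Toy address list: three pairs of flips separated by the same XOR
# distance 0x4 make that value dominate the histogram.
flips = [0x1000, 0x1004, 0x2000, 0x2004, 0x3000, 0x3004]
counts = xor_signature_counts(flips)
anomaly, occurrences = counts.most_common(1)[0]
```

    In a real static test the histogram would be compared against the distribution expected for independent upsets before declaring an anomaly.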

  13. Body checking and body avoidance in eating disorders: Systematic review and meta-analysis.

    PubMed

    Nikodijevic, Alexandra; Buck, Kimberly; Fuller-Tyszkiewicz, Matthew; de Paoli, Tara; Krug, Isabel

    2018-05-01

    This study sought to systematically review and quantify the evidence related to body checking and body avoidance in eating disorders (EDs) to gauge the size of effects, as well as to examine potential differences between clinical and nonclinical populations and between ED subtypes. PsycINFO, PsycARTICLES, PsycEXTRA, Cochrane Library, and MEDLINE databases were searched for academic literature published until October 2017. A grey literature search was also conducted. Fifty-two studies were identified for the systematic review, of which 34 were eligible for meta-analysis. Only female samples were included in the meta-analysis. ED cases experienced significantly higher body checking (d = 1.26, p < .001) and body avoidance (d = 1.88, p < .001) overall relative to healthy controls, but neither behaviour varied by ED subtype. In nonclinical samples, body checking (r = .60) and body avoidance (r = .56) were significantly correlated with ED pathology (p < .001). These findings support transdiagnostic theoretical models and approaches to ED treatment and early intervention programmes. Copyright © 2018 John Wiley & Sons, Ltd and Eating Disorders Association.
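
    The case-versus-control contrasts above are reported as Cohen's d, a standardised mean difference. As a minimal illustrative sketch (not the meta-analytic procedure used in the review, and with invented scores), the pooled-SD form of d can be computed as follows:

```python
import statistics

def cohens_d(cases, controls):
    """Standardised mean difference with a pooled standard deviation,
    the case-versus-control effect-size metric reported above.
    By a common convention, d around 0.8 or more is a large effect."""
    n1, n2 = len(cases), len(controls)
    pooled_var = ((n1 - 1) * statistics.variance(cases)
                  + (n2 - 1) * statistics.variance(controls)) / (n1 + n2 - 2)
    return (statistics.mean(cases) - statistics.mean(controls)) / pooled_var ** 0.5

# Invented body-checking scale totals for ED cases vs healthy controls;
# both groups here have variance 1, so d is simply the mean difference.
d = cohens_d([3, 4, 5], [1, 2, 3])
```

    On this toy data the function returns d = 2.0, i.e. an effect even larger than those reported in the meta-analysis.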

  14. Model of twelve properties of a set of organic solvents with graph-theoretical and/or experimental parameters.

    PubMed

    Pogliani, Lionello

    2010-01-30

    Twelve properties of a highly heterogeneous class of organic solvents have been modeled with a graph-theoretical modified molecular connectivity (MC) method, which makes it possible to encode the core electrons and the hydrogen atoms. The graph-theoretical method uses the concepts of simple, general, and complete graphs, where this last type of graph is used to encode the core electrons. The hydrogen atoms have been encoded with the aid of a graph-theoretical perturbation parameter, which contributes to the definition of the valence delta, delta(v), a key parameter in molecular connectivity studies. The model of the twelve properties, obtained with a stepwise search algorithm, is always satisfactory, and it allows checking the influence of the hydrogen content of the solvent molecules on the choice of the type of descriptor. A similar argument holds for the influence of the halogen atoms on the type of core electron representation. In some cases the molar mass and, to a lesser extent, special "ad hoc" parameters have been used to improve the model. A very good model of the surface tension could be obtained with the aid of five experimental parameters. A mixed model method based on experimental parameters plus molecular connectivity indices, instead, consistently improved the model quality of five properties. Worth underlining is the importance of the boiling point temperatures as descriptors in these last two model methodologies. Copyright 2009 Wiley Periodicals, Inc.

  15. Modeling criterion shifts and target checking in prospective memory monitoring.

    PubMed

    Horn, Sebastian S; Bayen, Ute J

    2015-01-01

    Event-based prospective memory (PM) involves remembering to perform intended actions after a delay. An important theoretical issue is whether and how people monitor the environment to execute an intended action when a target event occurs. Performing a PM task often increases the latencies in ongoing tasks. However, little is known about the reasons for this cost effect. This study uses diffusion model analysis to decompose monitoring processes in the PM paradigm. Across 4 experiments, performing a PM task increased latencies in an ongoing lexical decision task. A large portion of this effect was explained by consistent increases in boundary separation; additional increases in nondecision time emerged in a nonfocal PM task and explained variance in PM performance (Experiment 1), likely reflecting a target-checking strategy before and after the ongoing decision (Experiment 2). However, we found that possible target-checking strategies may depend on task characteristics. That is, instructional emphasis on the importance of ongoing decisions (Experiment 3) or the use of focal targets (Experiment 4) eliminated the contribution of nondecision time to the cost of PM, but left participants in a mode of increased cautiousness. The modeling thus sheds new light on the cost effect seen in many PM studies and suggests that people approach ongoing activities more cautiously when they need to remember an intended action. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  16. The Implicit Construction of "Children at Risk": On the Dynamics of Practice and Programme in Development Screenings in Early Childhood

    ERIC Educational Resources Information Center

    Bollig, Sabine; Kelle, Helga

    2013-01-01

    This article presents findings from an ethnographic study on preventive paediatric check-ups in Germany. In accordance with system-theoretical and governmentality approaches (referencing Foucault), preventive check-ups are conceptualised as fields where risk concepts related to children's development are applied, produced and reworked. In order to…

  17. Scoping review of adherence promotion theories in pelvic floor muscle training - 2011 ICS state-of-the-science seminar research paper i of iv.

    PubMed

    McClurg, Doreen; Frawley, Helena; Hay-Smith, Jean; Dean, Sarah; Chen, Shu-Yueh; Chiarelli, Pauline; Mair, Frances; Dumoulin, Chantale

    2015-09-01

    This paper, the first of four emanating from the International Continence Society's 2011 State-of-the-Science Seminar on pelvic-floor-muscle training (PFMT) adherence, aimed to summarize the literature on theoretical models to promote PFMT adherence, as identified in the research or suggested by the seminar's expert panel, and to recommend future directions for clinical practice and research. Existing literature on theories of health behavior was identified through a conventional subject search of electronic databases, reference-list checking, and input from the expert panel. A core eligibility criterion was that the study included a theoretical model to underpin adherence strategies used in an intervention to promote PFM training/exercise. A brief critique of 12 theoretical models/theories is provided and, where appropriate, their use in PFMT adherence strategies is identified or examples of possible uses in future studies are outlined. A better theory-based understanding of interventions to promote PFMT adherence through changes in health behaviors is required. The results of this scoping review and expert opinions identified several promising models. Future research should explicitly map the theories behind interventions that are thought to improve adherence in various populations (e.g., perinatal women, to prevent or lessen urinary incontinence). In addition, the identified behavioral theories applied to PFMT require a process whereby their impact can be evaluated. © 2015 Wiley Periodicals, Inc.

  18. Thermal Aspects of Lithium Ion Cells

    NASA Technical Reports Server (NTRS)

    Frank, H.; Shakkottai, P.; Bugga, R.; Smart, M.; Huang, C. K.; Timmerman, P.; Surampudi, S.

    2000-01-01

    This viewgraph presentation outlines the development of a thermal model of Li-ion cells in terms of heat generation, thermal mass, and thermal resistance, intended for incorporation into a battery model. The approach was to estimate heat generation with a semi-theoretical model and then to check its accuracy with efficiency measurements. Further objectives were to compute the thermal mass from component weights and specific heats, and to compute the thermal resistance from component dimensions and conductivities. Two lithium cells are compared: a cylindrical lithium cell and a prismatic lithium cell. The presentation reviews the methodology for estimating the heat generation rate, and graphs of the open-circuit curves of the cells and of the heat evolution during discharge are given.

  19. Barriers and enablers to delivery of the Healthy Kids Check: an analysis informed by the Theoretical Domains Framework and COM-B model

    PubMed Central

    2014-01-01

    Background More than a fifth of Australian children arrive at school developmentally vulnerable. To counteract this, the Healthy Kids Check (HKC), a one-off health assessment aimed at preschool children, was introduced into Australian general practice in 2008. Delivery of services has, however, remained low. The Theoretical Domains Framework, which provides a method to understand behaviours theoretically, can be condensed into three core components (capability, opportunity, and motivation) that form the COM-B model. Utilising this framework, this study aimed to determine the barriers and enablers to delivery of the HKC, to inform the design of an intervention to promote provision of HKC services in Australian general practice. Methods Data from 6 focus group discussions with 40 practitioners from general practices in socio-culturally diverse areas of Melbourne, Victoria, were analysed using thematic analysis. Results Many practitioners expressed uncertainty regarding their capabilities and the practicalities of delivering HKCs, but in some cases HKCs had acted as a catalyst for professional development. Key connections between immunisation services and delivery of HKCs prompted practices to have systems of recall and reminder in place. Standardisation of methods for developmental assessment and streamlined referral pathways affected practitioners’ confidence and motivation to perform HKCs. Conclusion Application of a systematic framework effectively demonstrated how a number of behaviours could be targeted to increase delivery of HKCs. Interventions need to target practice systems, the support of office staff and referral options, as well as practitioners’ training. Many behavioural changes could be applied through a single intervention programme delivered by the primary healthcare organisations charged with local healthcare needs (Medicare Locals), providing vital links between general practice, community, and the health of young children. PMID:24886520

  20. Verification of a Byzantine-Fault-Tolerant Self-stabilizing Protocol for Clock Synchronization

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2008-01-01

    This paper presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system except for the presence of sufficient good nodes, thus making the weakest possible assumptions and producing the strongest results. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV). The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space.

  1. Extensions to the visual predictive check to facilitate model performance evaluation.

    PubMed

    Post, Teun M; Freijer, Jan I; Ploeger, Bart A; Danhof, Meindert

    2008-04-01

    The Visual Predictive Check (VPC) is a valuable and supportive instrument for evaluating model performance. However, in its most commonly applied form, the method largely depends on a subjective comparison of the distribution of the simulated data with the observed data, without explicitly quantifying and relating the information in both. Recent adaptations to the VPC take this drawback into consideration by presenting the observed and predicted data as percentiles. In addition, some of these adaptations represent the uncertainty in the predictions visually. However, it is not assessed whether the expected random distribution of the observations around the predicted median trend is realised in relation to the number of observations. Moreover, the influence of, and the information residing in, missing data at each time point are not taken into consideration. Therefore, in this investigation the VPC is extended with two methods to support a less subjective and thereby more adequate evaluation of model performance: (i) the Quantified Visual Predictive Check (QVPC) and (ii) the Bootstrap Visual Predictive Check (BVPC). The QVPC presents the distribution of the observations as a percentage above and below the predicted median at each time point, regardless of the density of the data, while also visualising the percentage of unavailable data. The BVPC weighs the predicted median against the 5th, 50th, and 95th percentiles resulting from a bootstrap of the observed data median at each time point, while accounting for the number and the theoretical position of unavailable data. The proposed extensions to the VPC are illustrated by a pharmacokinetic simulation example and applied to a pharmacodynamic disease progression example.
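
    The bootstrap step behind the BVPC can be illustrated at a single time point: resample the observed data with replacement, take the median of each resample, and read off the 5th, 50th, and 95th percentiles of those medians, the bands against which the model-predicted median is weighed. This is a minimal sketch with invented observations; the published method additionally accounts for the number and position of unavailable data.

```python
import random
import statistics

def bootstrap_median_percentiles(observations, n_boot=2000, seed=42):
    """Bootstrap distribution of the observed median at one time
    point; returns its 5th, 50th, and 95th percentiles."""
    rng = random.Random(seed)
    medians = sorted(
        statistics.median(rng.choices(observations, k=len(observations)))
        for _ in range(n_boot)
    )
    def pct(p):
        return medians[min(int(p / 100 * n_boot), n_boot - 1)]
    return pct(5), pct(50), pct(95)

# Invented concentrations observed at one sampling time.
obs = [4.1, 4.8, 5.0, 5.3, 5.9, 6.4, 7.0]
lo, mid, hi = bootstrap_median_percentiles(obs)
# A model-predicted median falling outside (lo, hi) would flag
# misfit at this time point.
```

    Repeating this per time point yields the bootstrap bands that the BVPC compares against the predicted median trend.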

  2. Formation and stability of twisted ribbons in mixtures of rod-like fd-virus and non-adsorbing polymer

    NASA Astrophysics Data System (ADS)

    Dogic, Z.; Didonna, B.; Bryning, M.; Lubensky, T. C.; Yodh, A. G.; Janmey, P. A.

    2003-03-01

    We are investigating the behavior of mixtures of monodisperse fd-virus rods and non-adsorbing polymer. We observe the formation of isolated smectic disks. A single smectic disk consists of a monolayer of aligned rods, and its thickness equals the length of a single rod. As disks coalesce they undergo shape transformations from flat structures to elongated twisted ribbons. A theoretical model is formulated wherein the chirality of the molecule favors the formation of the elongated ribbon structure while the line tension favors the formation of untwisted disks. To check the validity of the theoretical model, the line tension and twist constants are experimentally measured. The line tension is deduced from thermal fluctuations of the interface. The twist constant is determined by unwinding the twisted ribbons using optical tweezers. This work is partially supported by NSF grants DMR-0203378, the PENN MRSEC, DMR-0079909, and NASA grant NAG8-2172.

  3. A theory-informed approach to developing visually mediated interventions to change behaviour using an asthma and physical activity intervention exemplar.

    PubMed

    Murray, Jennifer; Williams, Brian; Hoskins, Gaylor; Skar, Silje; McGhee, John; Treweek, Shaun; Sniehotta, Falko F; Sheikh, Aziz; Brown, Gordon; Hagen, Suzanne; Cameron, Linda; Jones, Claire; Gauld, Dylan

    2016-01-01

    Visualisation techniques are used in a range of healthcare interventions. However, these frequently lack a coherent rationale or clear theoretical basis. This lack of definition and explicit targeting of the underlying mechanisms may impede the success and evaluation of the intervention. We describe the theoretical development, deployment, and pilot evaluation of a complex visually mediated behavioural intervention. The exemplar intervention focused on increasing physical activity among young people with asthma. We employed an explicit five-stage development model, which was actively supported by a consultative user group. The developmental stages involved establishing the theoretical basis, establishing a narrative structure, visual rendering, checking interpretation, and pilot testing. We conducted in-depth interviews and focus groups during early development and checking, followed by an online experiment for pilot testing. A total of 91 individuals, including young people with asthma, parents, teachers, and health professionals, were involved in development and testing. Our final intervention consisted of two components: (1) an interactive 3D computer animation to create intentions and (2) an action plan and volitional help sheet to promote the translation of intentions to behaviour. Theory was mediated throughout by visual and audio forms. The intervention was regarded as highly acceptable, engaging, and meaningful by all stakeholders. The perceived impact on asthma understanding and intentions was reported positively, with most individuals saying that the 3D computer animation had either clarified a range of issues or made them more real. Our five-stage model, underpinned by extensive consultation, worked well and is presented as a framework to support explicit decision-making for others developing theory-informed visually mediated interventions. We have demonstrated the ability to develop theory-based visually mediated behavioural interventions. However, attention needs to be paid to the potential ambiguity associated with images and thus to the concept of visual literacy among patients. Our revised model may be helpful as a guide to aid development, acceptability, and ultimately effectiveness.

  4. Model Checking A Self-Stabilizing Synchronization Protocol for Arbitrary Digraphs

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2012-01-01

    This report presents the mechanical verification of a self-stabilizing distributed clock synchronization protocol for arbitrary digraphs in the absence of faults. This protocol does not rely on assumptions about the initial state of the system, other than the presence of at least one node, and no central clock or a centrally generated signal, pulse, or message is used. The system under study is an arbitrary, non-partitioned digraph ranging from fully connected to 1-connected networks of nodes while allowing for differences in the network elements. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the node is that the interactions with other nodes are restricted to defined links and interfaces. This protocol deterministically converges within a time bound that is a linear function of the self-stabilization period. A bounded model of the protocol is verified using the Symbolic Model Verifier (SMV) for a subset of digraphs. Modeling challenges of the protocol and the system are addressed. The model checking effort is focused on verifying correctness of the bounded model of the protocol as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period.

  5. Research on sub-surface damage and its stress deformation in the process of large aperture and high diameter-to-thickness ratio TMT M3MP

    NASA Astrophysics Data System (ADS)

    Hu, Hai-xiang; Qi, Erhui; Cole, Glen; Hu, Hai-fei; Luo, Xiao; Zhang, Xue-jun

    2016-10-01

    Large flat mirrors play important roles in large aperture telescopes. However, they also introduce unpredictable problems. The surface errors created during manufacturing, testing, and supporting are all combined during measurement, making diagnosis and treatment difficult. Examining a high diameter-to-thickness ratio flat mirror, TMT M3MP, and its unexpected deformation during processing, we proposed a strain model of subsurface damage to explain the observed phenomenon. We designed a set of experiments and checked the validity of our diagnosis. On that basis, we theoretically predicted the trend of this strain and its scale effect on Zerodur®, and checked the validity experimentally on another piece. This work guided the grinding-polishing process of M3MP, and will serve as a reference for M3M processing as well.

  6. Impedance Analysis of Ion Transport Through Supported Lipid Membranes Doped with Ionophores: A New Kinetic Approach

    PubMed Central

    Alvarez, P. E.; Vallejo, A. E.

    2008-01-01

    Kinetics of facilitated ion transport through planar bilayer membranes are normally analyzed by electrical conductance methods. The additional use of electrical relaxation techniques, such as voltage jump, is necessary to evaluate individual rate constants. Although electrochemical impedance spectroscopy is recognized as the most powerful of the available electric relaxation techniques, it has rarely been used in connection with these kinetic studies. According to the new approach presented in this work, three steps were followed. First, a kinetic model was proposed that has the distinct quality of being general, i.e., it properly describes both carrier and channel mechanisms of ion transport. Second, the state equations for steady-state and for impedance experiments were derived, exhibiting the input–output representation pertaining to the model’s structure. With the application of a method based on the similarity transformation approach, it was possible to check that the proposed mechanism is distinguishable, i.e., no other model with a different structure exhibits the same input–output behavior for any input as the original. Additionally, the method allowed us to check whether the proposed model is globally identifiable (i.e., whether there is a single set of fit parameters for the model) when analyzed in terms of its impedance response. Thus, our model does not represent a theoretical interpretation of the experimental impedance but rather constitutes the prerequisite to select this type of experiment in order to obtain optimal kinetic identification of the system. Finally, impedance measurements were performed and the results were fitted to the proposed theoretical model in order to obtain the kinetic parameters of the system. The successful application of this approach is exemplified with results obtained for valinomycin–K+ in lipid bilayers supported onto gold substrates, i.e., an arrangement capable of emulating biological membranes. PMID:19669528

  7. JAVA PathFinder

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter

    2005-01-01

    JPF is an explicit-state software model checker for Java bytecode. Today, JPF is a Swiss army knife for all sorts of runtime-based verification purposes. This basically means JPF is a Java virtual machine that executes your program not just once (like a normal VM), but theoretically in all possible ways, checking for property violations such as deadlocks or unhandled exceptions along all potential execution paths. If it finds an error, JPF reports the whole execution that leads to it. Unlike a normal debugger, JPF keeps track of every step of how it got to the defect.
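
    The idea of executing a program "in all possible ways" and reporting the path to a defect can be caricatured as a search over a state graph. The sketch below is not JPF's API or implementation, merely a toy illustration of explicit-state exploration; the two-counter "program" and all names are invented for the example.

```python
from collections import deque

def explore(initial, successors, is_error):
    """Explicit-state search: visit every reachable state once and
    return the first (shortest) path to an error state, i.e. the
    counterexample trace, or None if the property holds."""
    frontier = deque([(initial, [initial])])
    visited = {initial}
    while frontier:
        state, path = frontier.popleft()
        if is_error(state):
            return path
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

def succ(state):
    # Two "threads", each of which may increment its own counter up
    # to 2; every interleaving is a distinct path through the graph.
    x, y = state
    return ([(x + 1, y)] if x < 2 else []) + ([(x, y + 1)] if y < 2 else [])

# Treat the state (2, 2) as the "defect"; the returned trace is the
# whole execution leading to it, as a model checker would report.
trace = explore((0, 0), succ, lambda s: s == (2, 2))
```

    A real checker like JPF adds state matching, backtracking, and partial-order reduction on top of this basic exploration loop to keep the state space tractable.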

  8. High-speed holocinematographic velocimeter for studying turbulent flow control physics

    NASA Technical Reports Server (NTRS)

    Weinstein, L. M.; Beeler, G. B.; Lindemann, A. M.

    1985-01-01

    Use of a dual view, high speed, holographic movie technique is examined for studying turbulent flow control physics. This approach, which eliminates some of the limitations of previous holographic techniques, is termed a holocinematographic velocimeter (HCV). The data from this system can be used to check theoretical turbulence modeling and numerical simulations, visualize and measure coherent structures in 'non-simple' turbulent flows, and examine the mechanisms operative in various turbulent control/drag reduction concepts. This system shows promise for giving the most complete experimental characterization of turbulent flows yet available.

  9. Efficiency of the strong satisfiability checking procedure for reactive system specifications

    NASA Astrophysics Data System (ADS)

    Shimakawa, Masaya; Hagihara, Shigeki; Yonezaki, Naoki

    2018-04-01

    Reactive systems are those that interact with their environment. To develop reactive systems without defects, it is important to describe behavior specifications in a formal language, such as linear temporal logic, and to verify the specification. Specifically, it is important to check whether specifications satisfy the property called realizability. In previous studies, we have proposed the concept of strong satisfiability as a necessary condition for realizability. Although this property of reactive system specifications is a necessary condition, many practical unrealizable specifications are also strongly unsatisfiable. Moreover, we have previously shown the theoretical complexity of the strong satisfiability problem. In the current study, we investigate the practical efficiency of the strong satisfiability checking procedure and demonstrate that strong satisfiability can be checked more efficiently than realizability.

  10. Investigation of numerical simulation on all-optical flip-flop stability maps of 1550nm vertical-cavity surface-emitting laser

    NASA Astrophysics Data System (ADS)

    Li, Jun; Xia, Qing; Wang, Xiaofa

    2017-10-01

    Based on the extended spin-flip model, the all-optical flip-flop stability maps of the 1550 nm vertical-cavity surface-emitting laser have been studied. Theoretical results show excellent agreement with the reported experimental results for the polarization switching current, which equals 1.95 times the threshold. Furthermore, the polarization bistable region is wide, extending from 1.05 to 1.95 times the threshold. A new method is presented that uses the power difference between the two linear polarization modes as the criterion for trigger degree, and stability maps of all-optical flip-flop operation under different injection parameters are obtained. By alternately injecting set and reset pulses with appropriate parameters, mutual conversion between the two polarization modes is realized, and the feasibility of all-optical flip-flop operation is checked theoretically. The results offer guidance for experimental studies of all-optical buffer technology.

  11. Testing the Grossman model of medical spending determinants with macroeconomic panel data.

    PubMed

    Hartwig, Jochen; Sturm, Jan-Egbert

    2018-02-16

    Michael Grossman's human capital model of the demand for health has been argued to be one of the major achievements in theoretical health economics. Attempts to test this model empirically have been sparse, however, and have produced mixed results. These attempts have so far relied on using (mostly cross-sectional) micro data from household surveys. For the first time in the literature, we bring in macroeconomic panel data for 29 OECD countries over the period 1970-2010 to test the model. To check the robustness of the results for the determinants of medical spending identified by the model, we include additional covariates in an extreme bounds analysis (EBA) framework. The preferred model specifications (including the robust covariates) do not lend much empirical support to the Grossman model. This is in line with the mixed results of earlier studies.

  12. An Emerging Theoretical Model of Music Therapy Student Development.

    PubMed

    Dvorak, Abbey L; Hernandez-Ruiz, Eugenia; Jang, Sekyung; Kim, Borin; Joseph, Megan; Wells, Kori E

    2017-07-01

    Music therapy students negotiate a complex relationship with music and its use in clinical work throughout their education and training. This distinct, pervasive, and evolving relationship suggests a developmental process unique to music therapy. The purpose of this grounded theory study was to create a theoretical model of music therapy students' developmental process, beginning with a study within one large Midwestern university. Participants (N = 15) were music therapy students who completed one 60-minute intensive interview, followed by a 20-minute member check meeting. Recorded interviews were transcribed, analyzed, and coded using open and axial coding. The theoretical model that emerged was a six-step sequential developmental progression that included the following themes: (a) Personal Connection, (b) Turning Point, (c) Adjusting Relationship with Music, (d) Growth and Development, (e) Evolution, and (f) Empowerment. The first three steps are linear; development continues in a cyclical process among the last three steps. As the cycle continues, music therapy students continue to grow and develop their skills, leading to increased empowerment, and more specifically, increased self-efficacy and competence. Further exploration of the model is needed to inform educators' and other key stakeholders' understanding of student needs and concerns as they progress through music therapy degree programs.

  13. Development and initial validation of the Impression Motivation in Sport Questionnaire-Team.

    PubMed

    Payne, Simon Mark; Hudson, Joanne; Akehurst, Sally; Ntoumanis, Nikos

    2013-06-01

    Impression motivation is an important individual difference variable that has been under-researched in sport psychology, partly due to having no appropriate measure. This study was conducted to design a measure of impression motivation in team-sport athletes. Construct validity checks decreased the initial pool of items, factor analysis (n = 310) revealed the structure of the newly developed scale, and exploratory structural equation modeling procedures (n = 406) resulted in a modified scale that retained theoretical integrity and psychometric parsimony. This process produced a 15-item, 4-factor model; the Impression Motivation in Sport Questionnaire-Team (IMSQ-T) is forwarded as a valid measure of the respondent's dispositional strength of motivation to use self-presentation in striving for four distinct interpersonal objectives: self-development, social identity development, avoidance of negative outcomes, and avoidance of damaging impressions. The availability of this measure has contributed to theoretical development, will facilitate research, and offers a tool for use in applied settings.

  14. Analyzing the causation of a railway accident based on a complex network

    NASA Astrophysics Data System (ADS)

    Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin

    2014-02-01

    In this paper, a new model is constructed for the causation analysis of railway accidents based on complex network theory. In the model, the nodes are defined as the various manifest or latent causal factors of an accident. By employing complex network theory, especially its statistical indicators, a railway accident and its key causes can be analyzed from an overall perspective. As a case study, the “7.23” Yongwen railway accident in China is analyzed with this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this accident. In conclusion, the constructed model provides a theoretical basis for railway accident prediction and, hence, for reducing the occurrence of railway accidents.

  15. Comparative study: TQ and Lean Production ownership models in health services

    PubMed Central

    Eiro, Natalia Yuri; Torres-Junior, Alvair Silveira

    2015-01-01

    Objective: to compare the application of Total Quality (TQ) models in the processes of a health service with cases of lean healthcare, including literature from another institution that has also applied this model. Method: qualitative research conducted through a descriptive case study. Results: a critical analysis of the institutions studied made it possible to compare the traditional quality approach observed in one case with the theoretical and practical lean production approach used in the other. Conclusion: the research identified that the lean model was better suited to people who work systemically and generate flow. It also pointed to some potential challenges in the introduction and implementation of lean methods in health care. PMID:26487134

  16. Addressing Dynamic Issues of Program Model Checking

    NASA Technical Reports Server (NTRS)

    Lerda, Flavio; Visser, Willem

    2001-01-01

    Model checking real programs has recently become an active research area. Programs however exhibit two characteristics that make model checking difficult: the complexity of their state and the dynamic nature of many programs. Here we address both these issues within the context of the Java PathFinder (JPF) model checker. Firstly, we will show how the state of a Java program can be encoded efficiently and how this encoding can be exploited to improve model checking. Next we show how to use symmetry reductions to alleviate some of the problems introduced by the dynamic nature of Java programs. Lastly, we show how distributed model checking of a dynamic program can be achieved, and furthermore, how dynamic partitions of the state space can improve model checking. We support all our findings with results from applying these techniques within the JPF model checker.
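    The symmetry-reduction idea can be illustrated in miniature: if threads are identical, permuting them yields equivalent states, so storing one canonical (sorted) representative per equivalence class shrinks the visited set without losing reachability information. The toy transition system below is invented and far simpler than JPF's actual state encoding.

```python
# Toy explicit-state search with a symmetry reduction, in the spirit of
# the techniques described above (the system and the canonicalization
# are illustrative, not JPF's actual encoding).
from collections import deque

# A state is a tuple of per-thread program counters (0..2) for three
# identical threads; each thread may independently advance its counter.
def successors(state):
    for i, pc in enumerate(state):
        if pc < 2:
            yield state[:i] + (pc + 1,) + state[i + 1:]

def canonical(state):
    # Identical threads are interchangeable, so sort the thread vector:
    # all permutations of the same multiset collapse to one representative.
    return tuple(sorted(state))

def explore(initial, reduce_symmetry):
    seen, frontier = set(), deque([initial])
    while frontier:
        s = frontier.popleft()
        key = canonical(s) if reduce_symmetry else s
        if key in seen:
            continue
        seen.add(key)
        frontier.extend(successors(s))
    return len(seen)

full = explore((0, 0, 0), reduce_symmetry=False)    # 3^3 = 27 states
reduced = explore((0, 0, 0), reduce_symmetry=True)  # 10 canonical states
print(full, reduced)  # → 27 10
```

    Even on this tiny example the visited set shrinks from 27 to 10; for many identical threads the saving grows combinatorially, which is why such reductions matter for dynamic programs.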

  17. Cosmic-Ray Lithium Production at the Nova Eruptions Followed by a Type Ia Supernova

    NASA Astrophysics Data System (ADS)

    Kawanaka, Norita; Yanagita, Shohei

    2018-01-01

    Recent measurements of cosmic-ray (CR) light nuclei by AMS-02 have shown that there is an unexpected component of CR lithium whose spectral index is harder than expected from the secondary production scenario. We propose a nearby type Ia supernova following a nova eruption as the origin of the lithium nuclei in CRs. By fitting the CR proton, helium, and lithium fluxes provided by AMS-02 with our theoretical model, we show that this scenario is consistent with the observations. Observational tests that can check our hypothesis are briefly discussed.

  18. Program Model Checking: A Practitioner's Guide

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.

    2008-01-01

    Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.

  19. Verifying Multi-Agent Systems via Unbounded Model Checking

    NASA Technical Reports Server (NTRS)

    Kacprzak, M.; Lomuscio, A.; Lasica, T.; Penczek, W.; Szreter, M.

    2004-01-01

    We present an approach to the problem of verification of epistemic properties in multi-agent systems by means of symbolic model checking. In particular, it is shown how to extend the technique of unbounded model checking from a purely temporal setting to a temporal-epistemic one. In order to achieve this, we base our discussion on interpreted systems semantics, a popular semantics used in multi-agent systems literature. We give details of the technique and show how it can be applied to the well known train, gate and controller problem. Keywords: model checking, unbounded model checking, multi-agent systems
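    In interpreted systems semantics, agent i knows a property in a global state exactly when the property holds in every reachable global state that shares agent i's local state. A minimal sketch of that knowledge check, using a drastically simplified (and invented) version of the train-gate state space, not the paper's symbolic technique:

```python
# Minimal epistemic check over an interpreted system: a global state is a
# tuple of local states, and an agent's indistinguishability relation is
# "same local state". The states below are an invented simplification of
# the train/gate/controller example.
states = {("away", "closed"), ("waiting", "closed"), ("in_tunnel", "open")}

def holds_tunnel_occupied(g):
    return g == ("in_tunnel", "open")

def knows(agent_index, prop, state, states):
    """K_i prop: prop holds in every reachable global state the agent
    cannot distinguish from `state` (same local state for agent i)."""
    return all(prop(g) for g in states if g[agent_index] == state[agent_index])

# The gate (agent 1) observes "open" only when the train is in the tunnel,
# so in ("in_tunnel", "open") it knows the tunnel is occupied.
print(knows(1, holds_tunnel_occupied, ("in_tunnel", "open"), states))  # True
# With local state "closed" the gate cannot tell "away" from "waiting".
print(knows(1, lambda g: g[0] == "waiting", ("waiting", "closed"), states))  # False
```

    Symbolic model checkers like the one in the paper compute the same quantification over indistinguishable states, but over boolean encodings of very large state sets rather than explicit enumeration.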

  20. Using Latent Class Analysis to Model Temperament Types.

    PubMed

    Loken, Eric

    2004-10-01

    Mixture models are appropriate for data that arise from a set of qualitatively different subpopulations. In this study, latent class analysis was applied to observational data from a laboratory assessment of infant temperament at four months of age. The EM algorithm was used to fit the models, and the Bayesian method of posterior predictive checks was used for model selection. Results show at least three types of infant temperament, with patterns consistent with those identified by previous researchers who classified the infants using a theoretically based system. Multiple imputation of group memberships is proposed as an alternative to assigning subjects to the latent class with maximum posterior probability in order to reflect variance due to uncertainty in the parameter estimation. Latent class membership at four months of age predicted longitudinal outcomes at four years of age. The example illustrates issues relevant to all mixture models, including estimation, multi-modality, model selection, and comparisons based on the latent group indicators.
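    A compact illustration of the estimation machinery described above: EM for a two-component one-dimensional Gaussian mixture on synthetic data, followed by posterior sampling of class labels in the spirit of the proposed multiple imputation. The data and two-class setup are invented; the study used richer observational temperament data and Bayesian model selection.

```python
# EM for a 2-component 1-D Gaussian mixture on synthetic data (a toy
# stand-in for the latent class estimation described above).
import numpy as np

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 200)])

# Initial guesses for weights, means, and variances.
w, mu, var = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def normal_pdf(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

for _ in range(100):
    # E-step: posterior responsibility of each class for each point.
    dens = w * normal_pdf(data[:, None], mu, var)      # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities.
    nk = resp.sum(axis=0)
    w = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    var = (resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk

print(np.round(mu, 1))  # recovers means close to the true -2 and 3

# "Multiple imputation" of memberships: sample labels from the posterior
# instead of hard-assigning the argmax class, so downstream analyses
# reflect classification uncertainty.
labels = (rng.random(len(data)) < resp[:, 1]).astype(int)
```

    Repeating the label draw several times and pooling results is what propagates the uncertainty that a single maximum-posterior assignment would hide.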

  1. Toward improved design of check dam systems: A case study in the Loess Plateau, China

    NASA Astrophysics Data System (ADS)

    Pal, Debasish; Galelli, Stefano; Tang, Honglei; Ran, Qihua

    2018-04-01

    Check dams are one of the most common strategies for controlling sediment transport in erosion-prone areas, along with soil and water conservation measures. However, existing mathematical models that simulate sediment production and delivery are often unable to represent how the storage capacity of check dams varies with time. To explicitly account for this process, and to support the design of check dam systems, we developed a modelling framework consisting of two components, namely (1) the spatially distributed Soil Erosion and Sediment Delivery Model (WaTEM/SEDEM), and (2) a network-based model of check dam storage dynamics. The two models are run sequentially, with the second model receiving the initial sediment input to check dams from WaTEM/SEDEM. The framework is first applied to the Shejiagou catchment, a 4.26 km2 area located in the Loess Plateau, China, where we study the effect of the existing check dam system on sediment dynamics. Results show that the deployment of check dams significantly altered the sediment delivery ratio of the catchment. Furthermore, the network-based model reveals a large variability in the life expectancy of check dams and abrupt changes in their filling rates. The application of the framework to six alternative check dam deployment scenarios is then used to illustrate its usefulness for planning purposes, and to derive some insights on the effect of key decision variables, such as the number, size, and siting of check dams. Simulation results suggest that better performance, in terms of life expectancy and sediment delivery ratio, could have been achieved with an alternative deployment strategy.
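    The storage-dynamics component can be caricatured as sediment routed down a chain of dams, each trapping inflow until its capacity is exhausted, after which the surplus passes downstream. The capacities and annual load below are invented, and the real model operates on a catchment network fed by WaTEM/SEDEM rather than a simple chain with a constant load.

```python
# Toy downstream routing of sediment through a chain of check dams; each
# dam traps inflow until its capacity fills (all numbers are invented).
def route_sediment(capacities, annual_load, years):
    """Simulate a chain of dams; return the year each dam fills."""
    remaining = list(capacities)          # unfilled storage per dam
    fill_year = [None] * len(capacities)  # "life expectancy" of each dam
    for year in range(1, years + 1):
        load = annual_load                # sediment entering at the headwater
        for i in range(len(remaining)):
            trapped = min(load, remaining[i])
            remaining[i] -= trapped
            load -= trapped               # surplus passes downstream
            if remaining[i] == 0 and fill_year[i] is None:
                fill_year[i] = year
    return fill_year

# Upstream dams fill first; once full they pass their load on, so the
# filling rate of downstream dams changes abruptly, as the abstract notes.
print(route_sediment(capacities=[30, 30, 60], annual_load=10, years=15))  # → [3, 6, 12]
```

    Even this caricature reproduces the qualitative behavior the abstract reports: staggered life expectancies and abrupt jumps in downstream filling rates when an upstream dam saturates.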

  2. Integral equation and thermodynamic perturbation theory for a two-dimensional model of dimerising fluid

    PubMed Central

    Urbic, Tomaz

    2016-01-01

    In this paper we applied an analytical theory to the two-dimensional dimerising fluid. We applied Wertheim's thermodynamic perturbation theory (TPT) and integral equation theory (IET) for associative liquids to a dimerising model with arbitrary positions of the dimerising points relative to the centers of the particles. The theory was used to study thermodynamic and structural properties. To check the accuracy of the theories, we compared theoretical results with corresponding results obtained by Monte Carlo computer simulations. The theories are accurate for the different patch positions of the model at all values of the temperature and density studied. IET correctly predicts the pair correlation function of the model. Both TPT and IET are in good agreement with the Monte Carlo values of the energy, pressure, chemical potential, compressibility, and ratios of free and bonded particles. PMID:28529396

  3. Learning dynamics by theoretical tools of game theory. Comment on "Move me, astonish me...delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates" by M. Pelowski et al.

    NASA Astrophysics Data System (ADS)

    Burini, Diletta; De Lillo, Silvana

    2017-07-01

    The VIMAP model presented in the survey [5] aims at analyzing the processes that can occur in human perception in front of an artwork. The model combines bottom-up (artwork-derived) processes with top-down mechanisms that describe how individuals adapt or change their own art-processing experience. The cognitive flow consists of seven stages connected to five outcomes, which account for all the main ways of responding to art. Moreover, the model can also identify the specific regions of the brain that are posited to be the main centers of the processes that may coincide with the proposed cognitive checks.

  4. Implementing Model-Check for Employee and Management Satisfaction

    NASA Technical Reports Server (NTRS)

    Jones, Corey; LaPha, Steven

    2013-01-01

    This presentation will discuss methods by which ModelCheck can be implemented to not only improve model quality, but also satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required from the end user. The presenter will demonstrate how to create multiple ModelCheck standards, how to prevent users from evading the system, and how ModelCheck can improve the quality of drawings and models.

  5. Symbolic LTL Compilation for Model Checking: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering, and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
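    For safety fragments of LTL, the formula-to-automaton step can be illustrated with a hand-built monitor. The sketch below checks the invented formula G(request -> X grant) over finite traces (a pending request at the final position is left unresolved under this finite-trace reading); it is far simpler than the symbolic compilation the abstract investigates.

```python
# Hand-built two-state monitor for the safety formula G(request -> X grant):
# state "pending" means a request is awaiting its grant in the next step.
# Formula and traces are invented for illustration.
def violates(trace):
    """True iff some 'request' is not immediately followed by 'grant'."""
    pending = False
    for letter in trace:               # each letter is a set of atomic props
        if pending and "grant" not in letter:
            return True                # monitor enters its rejecting sink
        pending = "request" in letter
    return False

print(violates([{"request"}, {"grant"}, set()]))             # False
print(violates([{"request"}, {"idle"}]))                     # True
print(violates([{"request"}, {"grant", "request"}, set()]))  # True
```

    Full LTL requires Büchi automata over infinite words; monitors like this cover only the safety fragment, which is why general compilation (symbolic or explicit) is the hard part the paper addresses.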

  6. Finding Feasible Abstract Counter-Examples

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking, most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted, the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.
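    One way to test counter-example feasibility, sketched in miniature: replay the abstract trace on the concrete system, keeping at each step only the concrete states that map onto the next abstract state; if the set empties, the counter-example is spurious. The counter system and parity abstraction below are invented and unrelated to JPF's actual implementation.

```python
# Replay an abstract counter-example on a concrete system. `alpha` is the
# abstraction function; the tiny counter system and parity abstraction
# are illustrative only.
def feasible(abstract_path, initial, step, alpha):
    """Is there a concrete run whose abstraction is `abstract_path`?"""
    layer = {s for s in initial if alpha(s) == abstract_path[0]}
    for a in abstract_path[1:]:
        layer = {t for s in layer for t in step(s) if alpha(t) == a}
        if not layer:
            return False   # spurious: the abstraction over-approximated
    return True

# Concrete system: a counter that may add 1 or 3; abstraction: parity.
step = lambda n: {n + 1, n + 3}
alpha = lambda n: n % 2

print(feasible([0, 1, 0], initial={0}, step=step, alpha=alpha))  # True
print(feasible([0, 0], initial={0}, step=step, alpha=alpha))     # False
```

    The second path is spurious because every concrete step flips parity; detecting exactly this kind of mismatch is what makes abstract counter-examples worth a developer's time only after a feasibility check.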

  7. Coupled Kardar-Parisi-Zhang Equations in One Dimension

    NASA Astrophysics Data System (ADS)

    Ferrari, Patrik L.; Sasamoto, Tomohiro; Spohn, Herbert

    2013-11-01

    Over the past years our understanding of the scaling properties of the solutions to the one-dimensional KPZ equation has advanced considerably, both theoretically and experimentally. In our contribution we export these insights to the case of coupled KPZ equations in one dimension. We establish equivalence with nonlinear fluctuating hydrodynamics for multi-component driven stochastic lattice gases. To check the predictions of the theory, we perform Monte Carlo simulations of the two-component AHR model. Its steady state is computed using the matrix product ansatz. Thereby all coefficients appearing in the coupled KPZ equations are deduced from the microscopic model. Time correlations in the steady state are simulated and we confirm not only the scaling exponent, but also the scaling function and the non-universal coefficients.

  8. Controlling the light shift of the CPT resonance by modulation technique

    NASA Astrophysics Data System (ADS)

    Tsygankov, E. A.; Petropavlovsky, S. V.; Vaskovskaya, M. I.; Zibrov, S. A.; Velichansky, V. L.; Yakovlev, V. P.

    2017-12-01

    Motivated by recent developments in atomic frequency standards employing the effect of coherent population trapping (CPT), we propose a theoretical framework for the frequency modulation spectroscopy of the CPT resonances. Under realistic assumptions we provide simple yet non-trivial analytical formulae for the major spectroscopic signals such as the CPT resonance line and the in-phase/quadrature responses. We discuss the influence of the light shift and, in particular, derive a simple expression for the displacement of the resonance as a function of modulation index. The performance of the model is checked against numerical simulations, and the agreement is good to perfect. The obtained results can be used in more general models accounting for light absorption in the thick optical medium.

  9. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... in different states or check processing regions)]. If you make the deposit in person to one of our...] Substitute Checks and Your Rights What Is a Substitute Check? To make check processing faster, federal law...

  10. Application of conditional moment tests to model checking for generalized linear models.

    PubMed

    Pan, Wei

    2002-06-01

    Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimation equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model checking tools.
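    The score test for an omitted covariate, which the abstract notes is a special case of a CMT, can be sketched from scratch: fit the null logistic model, then test the moment condition E[z(y - mu)] = 0 against a chi-square reference. The synthetic data and effect sizes below are invented; no GEE correlation structure is modeled here.

```python
# Score test for an omitted covariate z in logistic regression, built
# from scratch on synthetic data (a special case of the conditional
# moment tests discussed above; data and effects are invented).
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
z = rng.normal(size=n)                      # candidate omitted covariate
p = 1 / (1 + np.exp(-(0.5 * x + 0.8 * z)))  # z truly matters here
y = (rng.random(n) < p).astype(float)

# Fit the null model (intercept + x only) by Newton-Raphson.
Xn = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    mu = 1 / (1 + np.exp(-Xn @ beta))
    W = mu * (1 - mu)
    beta += np.linalg.solve(Xn.T * W @ Xn, Xn.T @ (y - mu))

# Score test: evaluate the full-model score at the null fit.
Xf = np.column_stack([Xn, z])               # full design including z
mu = 1 / (1 + np.exp(-Xn @ beta))
U = Xf.T @ (y - mu)                         # score vector at the null fit
I = Xf.T * (mu * (1 - mu)) @ Xf             # Fisher information
stat = U @ np.linalg.solve(I, U)            # ~ chi-square(1) if z is irrelevant
print(f"score statistic: {stat:.1f}")       # large => evidence z was omitted
```

    Because z genuinely enters the data-generating process here, the statistic lands far in the tail of a chi-square(1); under a correct null it would typically fall below about 3.84 at the 5% level.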

  11. Take the Reins on Model Quality with ModelCHECK and Gatekeeper

    NASA Technical Reports Server (NTRS)

    Jones, Corey

    2012-01-01

    Model quality and consistency have been an issue for us due to the diverse experience levels and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper, including limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.

  12. No breakdown of the radiatively driven wind theory in low-metallicity environments

    NASA Astrophysics Data System (ADS)

    Bouret, J.-C.; Lanz, T.; Hillier, D. J.; Martins, F.; Marcolino, W. L. F.; Depagne, E.

    2015-05-01

    We present a spectroscopic analysis of Hubble Space Telescope/Cosmic Origins Spectrograph observations of three massive stars in the low metallicity dwarf galaxies IC 1613 and WLM. These stars were previously observed with Very Large Telescope (VLT)/X-shooter by Tramper et al., who claimed that their mass-loss rates are higher than expected from theoretical predictions for the underlying metallicity. A comparison of the far ultraviolet (FUV) spectra with those of stars of similar spectral types/luminosity classes in the Galaxy and the Magellanic Clouds provides a direct, model-independent check of the mass-loss-metallicity relation. Then, a quantitative spectroscopic analysis is carried out using the non-LTE (NLTE) stellar atmosphere code CMFGEN. We derive the photospheric and wind characteristics, benefiting from a much better sensitivity of the FUV lines to wind properties than Hα. Iron and CNO abundances are measured, providing an independent check of the stellar metallicity. The spectroscopic analysis indicates that Z/Z⊙ = 1/5, similar to a Small Magellanic Cloud-type environment, and higher than usually quoted for IC 1613 and WLM. The mass-loss rates are smaller than the empirical ones by Tramper et al., and those predicted by the widely used theoretical recipe by Vink et al. On the other hand, we show that the empirical, FUV-based, mass-loss rates are in good agreement with those derived from mass fluxes computed by Lucy. We do not concur with Tramper et al. that there is a breakdown in the mass-loss-metallicity relation.

  13. Effect of wave function on the proton induced L XRP cross sections for 62Sm and 74W

    NASA Astrophysics Data System (ADS)

    Shehla, Kaur, Rajnish; Kumar, Anil; Puri, Sanjiv

    2015-08-01

    The Lk (k = 1, α, β, γ) X-ray production cross sections have been calculated for 74W and 62Sm at incident proton energies ranging from 1 to 5 MeV, using theoretical data sets of different physical parameters, namely, the Li (i = 1-3) sub-shell X-ray emission rates based on the Dirac-Fock (DF) model, the fluorescence and Coster-Kronig yields based on the Dirac-Hartree-Slater (DHS) model, and two sets of proton ionization cross sections based on the DHS model and the ECPSSR theory, in order to assess the influence of the wave function on the XRP cross sections. The calculated cross sections have been compared with the measured cross sections reported in a recent compilation to check the reliability of the calculated values.

  14. Model checking for linear temporal logic: An efficient implementation

    NASA Technical Reports Server (NTRS)

    Sherman, Rivi; Pnueli, Amir

    1990-01-01

    This report provides evidence to support the claim that model checking for linear temporal logic (LTL) is practically efficient. Two implementations of a linear temporal logic model checker are described. One is based on transforming the model checking problem into a satisfiability problem; the other checks an LTL formula for a finite model by computing the cross-product of the finite state transition graph of the program with a structure containing all possible models for the property. An experiment was carried out with a set of mutual exclusion algorithms, for which safety and liveness under fairness were tested.

  15. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    NASA Technical Reports Server (NTRS)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm, and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
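    The construction behind such an algorithm is standard: generate two independent standard normals and combine them so the pair attains the requested means, standard deviations, and correlation. The sketch below (parameter values invented) also mirrors the paper's accuracy check by estimating the sample correlation; it is not the original FORTRAN routine.

```python
# Generate correlated normal pairs from two independent standard normals
# via the conditional decomposition; parameters are arbitrary examples.
import math
import random

def bivariate_normal_pair(m1, m2, s1, s2, rho, rng=random):
    z1 = rng.gauss(0.0, 1.0)
    z2 = rng.gauss(0.0, 1.0)
    x = m1 + s1 * z1
    y = m2 + s2 * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
    return x, y

rng = random.Random(42)
pairs = [bivariate_normal_pair(1.0, -2.0, 2.0, 0.5, 0.8, rng) for _ in range(50_000)]

# Empirical check on the correlation, echoing the paper's accuracy test:
# the sample value should be close to the requested 0.8.
xs, ys = zip(*pairs)
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
cov = sum((a - mx) * (b - my) for a, b in pairs) / len(pairs)
sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / len(xs))
sy = math.sqrt(sum((b - my) ** 2 for b in ys) / len(ys))
corr = cov / (sx * sy)
print(f"sample correlation: {corr:.3f}")
```

    The decomposition is exact in theory, so any residual error in `corr` comes from the underlying generator and finite sampling, which is precisely the error source the paper analyzes.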

  16. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy Disclosure and Notices C Appendix C to Part 229 Banks and... OF FUNDS AND COLLECTION OF CHECKS (REGULATION CC) Pt. 229, App. C Appendix C to Part 229—Model...

  17. Chemometric Methods and Theoretical Molecular Descriptors in Predictive QSAR Modeling of the Environmental Behavior of Organic Pollutants

    NASA Astrophysics Data System (ADS)

    Gramatica, Paola

    This chapter surveys the QSAR modeling approaches (developed by the author's research group) for the validated prediction of environmental properties of organic pollutants. Various chemometric methods, based on different theoretical molecular descriptors, have been applied: explorative techniques (such as PCA for ranking, SOM for similarity analysis), modeling approaches by multiple-linear regression (MLR, in particular OLS), and classification methods (mainly k-NN, CART, CP-ANN). The focus of this review is on the main topics of environmental chemistry and ecotoxicology, related to the physico-chemical properties, the reactivity, and biological activity of chemicals of high environmental concern. Thus, the review deals with atmospheric degradation reactions of VOCs by tropospheric oxidants, persistence and long-range transport of POPs, sorption behavior of pesticides (Koc and leaching), bioconcentration, toxicity (acute aquatic toxicity, mutagenicity of PAHs, estrogen binding activity for endocrine disruptors compounds (EDCs)), and finally persistent bioaccumulative and toxic (PBT) behavior for the screening and prioritization of organic pollutants. Common to all the proposed models is the attention paid to model validation for predictive ability (not only internal, but also external for chemicals not participating in the model development) and checking of the chemical domain of applicability. Adherence to such a policy, requested also by the OECD principles, ensures the production of reliable predicted data, useful also in the new European regulation of chemicals, REACH.
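    The validation emphasis above, predictive ability rather than mere fit, can be illustrated with a leave-one-out cross-validated Q^2 for a small OLS model on synthetic "descriptor" data. The data and coefficients are invented; no real molecular descriptors or QSAR endpoints are used.

```python
# OLS "QSAR-style" model on synthetic descriptor data, scored by
# leave-one-out cross-validated Q^2 rather than the fit R^2 alone
# (data and coefficients are invented for illustration).
import numpy as np

rng = np.random.default_rng(3)
n, k = 40, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])        # descriptors
y = X @ np.array([0.2, 1.0, -0.5, 0.8]) + rng.normal(0, 0.3, n)   # "property"

press = 0.0
for i in range(n):
    mask = np.arange(n) != i
    beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    press += (y[i] - X[i] @ beta) ** 2   # prediction error on the left-out point

q2 = 1 - press / ((y - y.mean()) ** 2).sum()
print(f"Q^2 (leave-one-out): {q2:.2f}")  # near 1 => genuinely predictive
```

    External validation, as required by the OECD principles cited above, goes one step further: the held-out chemicals never enter model development at all, whereas leave-one-out reuses each point for fitting in the other folds.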

  18. Ti:sapphire - A theoretical assessment for its spectroscopy

    NASA Astrophysics Data System (ADS)

    Da Silva, A.; Boschetto, D.; Rax, J. M.; Chériaux, G.

    2017-03-01

    This article presents a theoretical computation of the stimulated emission cross-sections of a broad class of materials (dielectric crystals hosting transition-metal impurity atoms) from their known oscillator strengths. We apply the approach to Ti:sapphire and check it by computing emission cross-section curves for both π and σ polarizations. We also establish a relationship between oscillator strength and radiative lifetime. This approach will allow future parametric studies of the spectroscopic properties of Ti:sapphire.

  19. Theoretical and practical knowledge of Nursing professionals on indirect blood pressure measurement at a coronary care unit

    PubMed Central

    Machado, Juliana Pereira; Veiga, Eugenia Velludo; Ferreira, Paulo Alexandre Camargo; Martins, José Carlos Amado; Daniel, Ana Carolina Queiroz Godoy; Oliveira, Amanda dos Santos; da Silva, Patrícia Costa dos Santos

    2014-01-01

    Objective To determine and analyze the theoretical and practical knowledge of Nursing professionals on indirect blood pressure measurement. Methods This cross-sectional study included 31 professionals of a coronary care unit (86% of the Nursing staff in the unit). Of these, 38.7% were nurses and 61.3% nurse technicians. A validated questionnaire was used for the theoretical evaluation; for the practical assessment, the auscultatory technique was applied in a simulation environment under non-participant observation. Results Regarding theoretical knowledge of the steps for preparing the patient and the environment, 12.9% mentioned the 5-minute rest, 48.4% checked calibration, and 29.0% chose an adequate cuff width. A total of 64.5% of professionals avoided rounding values, and 22.6% mentioned the 6-month deadline for equipment calibration. In the practical assessment, 65% of the steps were followed on average. Gaps in knowledge mainly concerned failure to check the calibration of the device and stethoscope, to measure arm circumference when choosing the cuff size, and to record the arm used for blood pressure measurement. Conclusion Knowledge was poor, with disparities between theory and practice, showing steps performed without proper awareness and important knowledge disregarded during blood pressure measurement. Educational and operational interventions should be applied systematically, with institutional involvement, to ensure safe care with reliable values. PMID:25295455

  20. Regimes of stability and scaling relations for the removal time in the asteroid belt: a simple kinetic model and numerical tests

    NASA Astrophysics Data System (ADS)

    Cubrovic, Mihailo

    2005-02-01

    We report on our theoretical and numerical results concerning the transport mechanisms in the asteroid belt. We first derive a simple kinetic model of chaotic diffusion and show how it gives rise to some simple correlations (but not laws) between the removal time (the time for an asteroid to experience a qualitative change of dynamical behavior and enter a wide chaotic zone) and the Lyapunov time. The correlations are shown to arise in two different regimes, characterized by exponential and power-law scalings. We also show how the so-called “stable chaos” (exponential regime) is related to anomalous diffusion. Finally, we check our results numerically and discuss their possible applications in analyzing the motion of particular asteroids.

  1. Maximum drag reduction asymptotes and the cross-over to the Newtonian plug

    NASA Astrophysics Data System (ADS)

    Benzi, R.; de Angelis, E.; L'Vov, V. S.; Procaccia, I.; Tiberkevich, V.

    2006-03-01

    We employ the full FENE-P model of the hydrodynamics of a dilute polymer solution to derive a theoretical approach to drag reduction in wall-bounded turbulence. We recapture the results of a recent simplified theory which derived the universal maximum drag reduction (MDR) asymptote, and complement that theory with a discussion of the cross-over from the MDR to the Newtonian plug when the drag reduction saturates. The FENE-P model gives rise to a rather complex theory due to the interaction of the velocity field with the polymeric conformation tensor, making analytic estimates quite taxing. To overcome this we develop the theory in a computer-assisted manner, checking at each point the analytic estimates by direct numerical simulations (DNS) of viscoelastic turbulence in a channel.

  2. Scaling with System Size of the Lyapunov Exponents for the Hamiltonian Mean Field Model

    NASA Astrophysics Data System (ADS)

    Manos, Thanos; Ruffo, Stefano

    2011-12-01

The Hamiltonian Mean Field model is a prototype for systems with long-range interactions. It describes the motion of N particles moving on a ring, coupled with an infinite-range potential. The model has a second-order phase transition at the energy density Uc = 3/4 and its dynamics is exactly described by the Vlasov equation in the N→∞ limit. Its chaotic properties have been investigated in the past, but the determination of the scaling with N of the Lyapunov Spectrum (LS) of the model remains a challenging open problem. Here we show that the N^(-1/3) scaling of the Maximal Lyapunov Exponent (MLE), found in previous numerical and analytical studies, extends to the full LS; scaling is "precocious" for the LS, meaning that it becomes manifest for a much smaller number of particles than the one needed to check the scaling for the MLE. Besides that, the N^(-1/3) scaling appears to be valid not only for U > Uc, as suggested by theoretical approaches based on a random matrix approximation, but also below a threshold energy Ut ≈ 0.2. Using a recently proposed method (GALI) devised to rapidly check the chaotic or regular nature of an orbit, we find that Ut is also the energy at which a sharp transition from weak to strong chaos is present in the phase-space of the model. Around this energy the phase of the vector order parameter of the model becomes strongly time dependent, inducing a significant untrapping of particles from a nonlinear resonance.
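
    A standard way to obtain the MLE entering such scaling studies is a two-trajectory (Benettin-style) renormalization scheme applied to the HMF equations of motion dθ_i/dt = p_i, dp_i/dt = −M_x sin θ_i + M_y cos θ_i, with (M_x, M_y) the magnetization. The sketch below is a rough illustration under assumed parameters; the particle numbers and run lengths are far shorter than what a converged N^(-1/3) scaling study would require.

```python
import numpy as np

def hmf_step(theta, p, dt):
    """One leapfrog step for the HMF Hamiltonian
    H = sum_i p_i^2/2 + (1/2N) sum_{i,j} [1 - cos(theta_i - theta_j)]."""
    def force(th):
        mx, my = np.cos(th).mean(), np.sin(th).mean()
        return -mx * np.sin(th) + my * np.cos(th)
    p_half = p + 0.5 * dt * force(theta)
    theta = theta + dt * p_half
    p = p_half + 0.5 * dt * force(theta)
    return theta, p

def mle_benettin(n=100, steps=5000, dt=0.1, d0=1e-7, seed=1):
    """Estimate the maximal Lyapunov exponent by evolving a reference and a
    slightly displaced trajectory, renormalizing their separation each step."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(-np.pi, np.pi, n)
    p = rng.normal(0.0, 1.0, n)
    p -= p.mean()                                  # zero total momentum
    theta2 = theta + d0 / np.sqrt(2 * n)
    p2 = p + d0 / np.sqrt(2 * n)
    acc = 0.0
    for _ in range(steps):
        theta, p = hmf_step(theta, p, dt)
        theta2, p2 = hmf_step(theta2, p2, dt)
        d = np.sqrt(np.sum((theta2 - theta) ** 2) + np.sum((p2 - p) ** 2))
        acc += np.log(d / d0)
        # rescale the separation back to d0 along the current direction
        theta2 = theta + (theta2 - theta) * (d0 / d)
        p2 = p + (p2 - p) * (d0 / d)
    return acc / (steps * dt)

print(round(mle_benettin(n=50, steps=2000), 3))
```

Repeating the estimate over a range of N and fitting log(MLE) against log(N) is how an N^(-1/3) slope would be checked in practice.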

  3. The influence of social anxiety on the body checking behaviors of female college students.

    PubMed

    White, Emily K; Warren, Cortney S

    2014-09-01

    Social anxiety and eating pathology frequently co-occur. However, there is limited research examining the relationship between anxiety and body checking, aside from one study in which social physique anxiety partially mediated the relationship between body checking cognitions and body checking behavior (Haase, Mountford, & Waller, 2007). In an independent sample of 567 college women, we tested the fit of Haase and colleagues' foundational model but did not find evidence of mediation. Thus we tested the fit of an expanded path model that included eating pathology and clinical impairment. In the best-fitting path model (CFI=.991; RMSEA=.083) eating pathology and social physique anxiety positively predicted body checking, and body checking positively predicted clinical impairment. Therefore, women who endorse social physique anxiety may be more likely to engage in body checking behaviors and experience impaired psychosocial functioning. Published by Elsevier Ltd.

  4. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  5. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  6. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  7. Determination of MLC model parameters for Monaco using commercial diode arrays.

    PubMed

    Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian

    2016-07-08

Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in the Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters were delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed it was unclear how to change the MLC parameters to gain agreement. This ambiguity arose due to an insufficient sampling of the test field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements. Gafchromic EBT3 film was used to verify the accuracy of MapCHECK 2 measured dose distributions. It was found that adjustment of the MLC parameters from their default values resulted in improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose. The lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model to both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies in some of the test field dose distributions found with ArcCHECK. Comparison between the optimized model-calculated dose and ArcCHECK-measured dose resulted in global gamma pass rates which ranged from 70.0%-97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates. The lower gamma pass rates were attributed to the ArcCHECK overestimating the dose in-field for the rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK.
Considering ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
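
    For readers unfamiliar with the pass-rate metric used above: a global gamma comparison combines a dose-difference criterion (normalized to the global maximum dose) with a distance-to-agreement criterion. The following is a deliberately simplified 1-D sketch, not the study's MapCHECK/ArcCHECK analysis; the 3%/2 mm criteria and 10% low-dose cutoff are common choices used here for illustration.

```python
import numpy as np

def gamma_pass_rate(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=2.0, cutoff=0.1):
    """Global gamma analysis for 1-D dose profiles.

    dd     : dose-difference criterion as a fraction of the global max dose (3%)
    dta    : distance-to-agreement criterion in mm (2 mm)
    cutoff : exclude reference points below this fraction of the max dose
    """
    d_max = d_ref.max()
    d_norm = dd * d_max
    gammas = []
    for xr, dr in zip(x_ref, d_ref):
        if dr < cutoff * d_max:
            continue
        # gamma is the minimum combined distance in (position, dose) space
        g2 = ((x_eval - xr) / dta) ** 2 + ((d_eval - dr) / d_norm) ** 2
        gammas.append(np.sqrt(g2.min()))
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

# Identical profiles should pass everywhere.
x = np.linspace(-50, 50, 201)          # mm
d = 100 * np.exp(-x**2 / (2 * 15**2))  # arbitrary bell-shaped "field"
print(gamma_pass_rate(x, d, x, d))     # 100.0
```

Shifting the evaluated profile by more than the 2 mm tolerance drops the pass rate in the gradient regions, which is the behavior the study's pass-rate comparisons exploit.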

  8. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset are together called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

  9. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies, and static analysis.

  10. UTP and Temporal Logic Model Checking

    NASA Astrophysics Data System (ADS)

    Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo

In this paper we give an additional perspective to the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state explosion problem through the use of efficient data structures.

  11. 75 FR 28480 - Airworthiness Directives; Airbus Model A300 Series Airplanes; Model A300 B4-600, B4-600R, F4-600R...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... pressurise the hydraulic reservoirs, due to leakage of the Crissair reservoir air pressurisation check valves. * * * The leakage of the check valves was caused by an incorrect spring material. The affected Crissair check valves * * * were then replaced with improved check valves P/N [part number] 2S2794-1 * * *. More...

  12. Interaction of non-radially symmetric camphor particles

    NASA Astrophysics Data System (ADS)

    Ei, Shin-Ichiro; Kitahata, Hiroyuki; Koyano, Yuki; Nagayama, Masaharu

    2018-03-01

In this study, the interaction between two non-radially symmetric camphor particles is theoretically investigated and the equation describing the motion is derived as an ordinary differential system for the locations and the rotations. In particular, cases slightly deformed from radial symmetry are extensively investigated and explicit motions are obtained. For example, it is theoretically shown that elliptically deformed camphor particles interact so that their major axes become parallel. Such predicted motions are also checked by real experiments and numerical simulations.

  13. On the validation of seismic imaging methods: Finite frequency or ray theory?

    DOE PAGES

    Maceira, Monica; Larmat, Carene; Porritt, Robert W.; ...

    2015-01-23

We investigate the merits of the more recently developed finite-frequency approach to tomography against the more traditional and approximate ray theoretical approach for state of the art seismic models developed for western North America. To this end, we employ the spectral element method to assess the agreement between observations on real data and measurements made on synthetic seismograms predicted by the models under consideration. We check for phase delay agreement as well as waveform cross-correlation values. Based on statistical analyses of S wave phase delay measurements, finite frequency shows an improvement over ray theory. Random sampling using cross-correlation values identifies regions where synthetic seismograms computed with ray theory and finite-frequency models differ the most. Our study suggests that finite-frequency approaches to seismic imaging exhibit measurable improvement for pronounced low-velocity anomalies such as mantle plumes.
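
    The delay measurement underlying such observed-versus-synthetic comparisons can be sketched with a cross-correlation peak search; the pulse shapes and sampling below are illustrative stand-ins, not actual seismograms.

```python
import numpy as np

def cc_delay(obs, syn, dt):
    """Time delay (s) of `obs` relative to `syn` from the peak of the
    full cross-correlation; positive means `obs` arrives later."""
    cc = np.correlate(obs, syn, mode="full")
    lag = np.argmax(cc) - (len(syn) - 1)   # convert array index to lag
    return lag * dt

dt = 0.01
t = np.arange(0, 10, dt)
syn = np.exp(-((t - 4.0) / 0.5) ** 2)      # "synthetic" pulse
obs = np.exp(-((t - 4.3) / 0.5) ** 2)      # same pulse, arriving 0.3 s later
print(round(cc_delay(obs, syn, dt), 3))    # 0.3
```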

  14. Life Strain, Social Control, Social Learning, and Delinquency: The Effects of Gender, Age, and Family SES Among Chinese Adolescents.

    PubMed

    Bao, Wan-Ning; Haas, Ain; Xie, Yunping

    2016-09-01

    Very few studies have examined the pathways to delinquency and causal factors for demographic subgroups of adolescents in a different culture. This article explores the effects of gender, age, and family socioeconomic status (SES) in an integrated model of strain, social control, social learning, and delinquency among a sample of Chinese adolescents. ANOVA is used to check for significant differences between categories of demographic groups on the variables in the integrated model, and the differential effects of causal factors in the theoretical path models are examined. Further tests of interaction effects are conducted to compare path coefficients between "high-risk" youths (i.e., male, mid-teen, and low family SES adolescents) and other subgroups. The findings identified similar pathways to delinquency across subgroups and clarified the salience of causal factors for male, mid-teen, and low SES adolescents in a different cultural context. © The Author(s) 2015.

  15. Effect of wave function on the proton induced L XRP cross sections for {sub 62}Sm and {sub 74}W

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shehla,; Kaur, Rajnish; Kumar, Anil

The L{sub k}(k= 1, α, β, γ) X-ray production cross sections have been calculated for {sub 74}W and {sub 62}Sm at different incident proton energies ranging from 1 to 5 MeV using theoretical data sets of different physical parameters, namely, the Li(i=1-3) sub-shell X-ray emission rates based on the Dirac-Fock (DF) model, the fluorescence and Coster-Kronig yields based on the Dirac-Hartree-Slater (DHS) model, and two sets of proton ionization cross sections based on the DHS model and the ECPSSR, in order to assess the influence of the wave function on the XRP cross sections. The calculated cross sections have been compared with the measured cross sections reported in the recent compilation to check the reliability of the calculated values.

  16. Single-polymer dynamics under constraints: scaling theory and computer experiment.

    PubMed

    Milchev, Andrey

    2011-03-16

    The relaxation, diffusion and translocation dynamics of single linear polymer chains in confinement is briefly reviewed with emphasis on the comparison between theoretical scaling predictions and observations from experiment or, most frequently, from computer simulations. Besides cylindrical, spherical and slit-like constraints, related problems such as the chain dynamics in a random medium and the translocation dynamics through a nanopore are also considered. Another particular kind of confinement is imposed by polymer adsorption on attractive surfaces or selective interfaces--a short overview of single-chain dynamics is also contained in this survey. While both theory and numerical experiments consider predominantly coarse-grained models of self-avoiding linear chain molecules with typically Rouse dynamics, we also note some recent studies which examine the impact of hydrodynamic interactions on polymer dynamics in confinement. In all of the aforementioned cases we focus mainly on the consequences of imposed geometric restrictions on single-chain dynamics and try to check our degree of understanding by assessing the agreement between theoretical predictions and observations.

  17. Theoretical Modeling of Electromagnetic Field from Electron Bunches in Periodic Wire Medium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chuchurka, S.; Benediktovitch, A.; Galyamin, S. N.

The interaction of relativistic electrons with periodic conducting structures results in radiation via a number of mechanisms. In the case of crystals one obtains parametric X-ray radiation, whose frequency is determined by the distance between crystallographic planes and the direction of the electron beam. If instead of a crystal one considers a periodic structure of metallic wires with a spacing of the order of mm, it is plausible to expect the emission of radiation of a similar nature (“diffraction response”) at THz frequencies. Additionally, a “long-wave” radiation will occur in this case, with wavelengths much larger than the structure periods. In this contribution, we present different theoretical approaches for describing the electromagnetic radiation field from a prolonged electron bunch propagating in a lattice of metallic wires. The validity of these analytical descriptions is checked by numerical simulations. We discuss possible applications of the aforementioned structure as a source of coherent THz radiation.

  18. Assessment of check-dam groundwater recharge with water-balance calculations

    NASA Astrophysics Data System (ADS)

    Djuma, Hakan; Bruggeman, Adriana; Camera, Corrado; Eliades, Marinos

    2017-04-01

Studies on the enhancement of groundwater recharge by check-dams in arid and semi-arid environments mainly focus on deriving water infiltration rates from the check-dam ponding areas. This is usually achieved by applying simple water balance models, more advanced models (e.g., two dimensional groundwater models) and field tests (e.g., infiltrometer test or soil pit tests). Recharge behind the check-dam can be affected by the build-up of sediment as a result of erosion in the upstream watershed area. This natural process can increase the uncertainty in the estimates of the recharged water volume, especially for water balance calculations. Few water balance field studies of individual check-dams have been presented in the literature and none of them presented associated uncertainties of their estimates. The objectives of this study are i) to assess the effect of a check-dam on groundwater recharge from an ephemeral river; and ii) to assess annual sedimentation at the check-dam during a 4-year period. The study was conducted on a check-dam in the semi-arid island of Cyprus. Field campaigns were carried out to measure water flow, water depth and check-dam topography in order to establish check-dam water height, volume, evaporation, outflow and recharge relations. Topographic surveys were repeated at the end of consecutive hydrological years to estimate the sediment build-up in the reservoir area of the check dam. Also, sediment samples were collected from the check-dam reservoir area for bulk-density analyses. To quantify the groundwater recharge, a water balance model was applied at two locations: at the check-dam and corresponding reservoir area, and at a 4-km stretch of the river bed without check-dam. Results showed that a check-dam with a storage capacity of 25,000 m3 was able to recharge the aquifer, in four years, with a total of 12 million m3 out of the 42 million m3 of measured (or modelled) streamflow. 
Recharge from the analyzed 4-km long river section without check-dam was estimated to be 1 million m3. Upper and lower limits of prediction intervals were computed to assess the uncertainties of the results. The model was rerun with these values and resulted in recharge values of 0.4 m3 as the lower and 38 million m3 as the upper limit. The sediment survey in the check-dam reservoir area showed that the reservoir area was filled with 2,000 to 3,000 tons of sediment after one rainfall season. This amount of sediment corresponds to 0.2 to 2 t h-1 y-1 sediment yield at the watershed level and reduces the check-dam storage capacity by approximately 10%. Results indicate that check-dams are valuable structures for increasing groundwater resources, but special attention should be given to soil erosion occurring in the upstream area and the resulting sediment build-up in the check-dam reservoir area. This study has received funding from the EU FP7 RECARE Project (GA 603498).
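
    The water-balance bookkeeping behind such recharge estimates reduces, per time step, to recharge = inflow − outflow − evaporation − Δstorage. A minimal sketch, with made-up daily volumes for illustration only:

```python
def daily_recharge(inflow, outflow, evaporation, storage):
    """Infer daily recharge (m^3/day) behind a check-dam from a simple
    water balance:  recharge_t = inflow_t - outflow_t - evap_t - dS_t.

    `storage` has one more entry than the flux lists (start-of-day volumes).
    """
    recharge = []
    for t in range(len(inflow)):
        ds = storage[t + 1] - storage[t]   # change in ponded volume
        recharge.append(inflow[t] - outflow[t] - evaporation[t] - ds)
    return recharge

# Illustrative three-day event (all volumes in m^3):
inflow      = [20000, 8000, 1000]
outflow     = [10000,    0,    0]
evaporation = [  100,  120,  110]
storage     = [0, 6000, 9000, 6000]   # pond fills, then drains by recharge
print(daily_recharge(inflow, outflow, evaporation, storage))
# [3900, 4880, 3890]
```

Sediment build-up enters exactly here: it changes the stage-storage relation used to compute `storage`, which is why the study resurveys the reservoir topography each year.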

  19. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formal description method for the protocol. Combined with the model checker SPIN, the method can conveniently verify the properties of the protocol. By applying several model-simplification strategies, this paper models a number of protocols efficiently and reduces the state space of the model. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the described method, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and is applicable to other authentication protocols.
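
    For orientation, the core of an explicit-state model checker such as SPIN is an exhaustive search of the reachable state space for property violations, returning a counterexample path when one is found. The toy sketch below checks a mutual-exclusion safety property on a two-process model; it is a generic illustration, not the paper's PKM protocol model.

```python
from collections import deque

def check_safety(initial, successors, is_bad):
    """Breadth-first exploration of the reachable state space.
    Returns (True, None) if no reachable state violates the property,
    or (False, counterexample_path) otherwise."""
    frontier = deque([(initial, [initial])])
    seen = {initial}
    while frontier:
        state, path = frontier.popleft()
        if is_bad(state):
            return False, path
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return True, None

# Toy two-process protocol: each process is 'idle' or 'critical'; a process
# may enter the critical section only when the other one is idle.
def successors(state):
    a, b = state
    nxt = []
    if a == "idle" and b == "idle":
        nxt.append(("critical", b))   # A enters
        nxt.append((a, "critical"))   # B enters
    if a == "critical":
        nxt.append(("idle", b))       # A leaves
    if b == "critical":
        nxt.append((a, "idle"))       # B leaves
    return nxt

mutex_violated = lambda s: s == ("critical", "critical")
ok, cex = check_safety(("idle", "idle"), successors, mutex_violated)
print(ok)  # True: mutual exclusion holds in this toy model
```

The state-space-reduction strategies the paper describes correspond to shrinking `seen` growth in exactly this kind of search.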

  20. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  1. A voice-actuated wind tunnel model leak checking system

    NASA Technical Reports Server (NTRS)

    Larson, William E.

    1989-01-01

    A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight line fit of the data are plotted on the terminal. This allows the technician to make a decision on the integrity of the orifice being checked. All results of the leak check program are stored in a database file that can be listed on the line printer for record keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.
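
    The leak decision described above (a straight-line fit of the recorded pressure data) can be sketched as follows; the sampling rate, pressure units, and slope threshold are illustrative assumptions, not the system's actual calibration.

```python
import numpy as np

def leaks(times, pressures, max_slope=0.05):
    """Straight-line fit of recorded pressures; a trend steeper than
    `max_slope` (pressure units per second, an assumed threshold) flags a leak."""
    slope, _intercept = np.polyfit(times, pressures, 1)
    return abs(slope) > max_slope

t = np.linspace(0.0, 30.0, 300)          # ~30 s of recorded samples
sealed = np.full_like(t, 14.7)           # orifice holds pressure
leaking = 14.7 - 0.2 * t                 # pressure bleeding off
print(leaks(t, sealed), leaks(t, leaking))  # False True
```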

  2. Use of posterior predictive checks as an inferential tool for investigating individual heterogeneity in animal population vital rates

    PubMed Central

    Chambert, Thierry; Rotella, Jay J; Higgs, Megan D

    2014-01-01

    The investigation of individual heterogeneity in vital rates has recently received growing attention among population ecologists. Individual heterogeneity in wild animal populations has been accounted for and quantified by including individually varying effects in models for mark–recapture data, but the real need for underlying individual effects to account for observed levels of individual variation has recently been questioned by the work of Tuljapurkar et al. (Ecology Letters, 12, 93, 2009) on dynamic heterogeneity. Model-selection approaches based on information criteria or Bayes factors have been used to address this question. Here, we suggest that, in addition to model-selection, model-checking methods can provide additional important insights to tackle this issue, as they allow one to evaluate a model's misfit in terms of ecologically meaningful measures. Specifically, we propose the use of posterior predictive checks to explicitly assess discrepancies between a model and the data, and we explain how to incorporate model checking into the inferential process used to assess the practical implications of ignoring individual heterogeneity. Posterior predictive checking is a straightforward and flexible approach for performing model checks in a Bayesian framework that is based on comparisons of observed data to model-generated replications of the data, where parameter uncertainty is incorporated through use of the posterior distribution. If discrepancy measures are chosen carefully and are relevant to the scientific context, posterior predictive checks can provide important information allowing for more efficient model refinement. We illustrate this approach using analyses of vital rates with long-term mark–recapture data for Weddell seals and emphasize its utility for identifying shortfalls or successes of a model at representing a biological process or pattern of interest. 
We show how posterior predictive checks can be used to strengthen inferences in ecological studies. We demonstrate the application of this method on analyses dealing with the question of individual reproductive heterogeneity in a population of Antarctic pinnipeds. PMID:24834335
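
    As a generic illustration of the procedure (not the seal analysis itself), the sketch below fits a deliberately homogeneous Beta-Binomial model to synthetic data that contain individual heterogeneity, and uses the across-individual variance as the discrepancy measure; an extreme Bayesian p-value then flags the misfit. All data and parameters are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Observed successes out of n=10 trials for 40 "individuals" (synthetic data
# with extra-binomial variation, i.e. individual heterogeneity).
n = 10
p_i = rng.beta(2, 2, size=40)           # hidden individual-level rates
y_obs = rng.binomial(n, p_i)

# Fit the *homogeneous* model y_i ~ Binomial(n, p) with p ~ Beta(1, 1);
# the posterior is conjugate: p | y ~ Beta(1 + sum(y), 1 + sum(n - y)).
post = rng.beta(1 + y_obs.sum(), 1 + (n - y_obs).sum(), size=4000)

# Posterior predictive replicates and a heterogeneity-sensitive discrepancy
# (the variance of counts across individuals).
t_obs = y_obs.var()
t_rep = np.array([rng.binomial(n, p, size=y_obs.size).var() for p in post])

# Bayesian p-value: values near 0 or 1 signal model misfit.
p_value = (t_rep >= t_obs).mean()
print(round(p_value, 3))
```

Because the synthetic data are overdispersed relative to the homogeneous model, the observed variance sits in the tail of the replicated variances and the p-value lands near zero, which is how a posterior predictive check exposes ignored individual heterogeneity.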

  3. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking

    PubMed Central

    Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). 
Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems. PMID:27187178

  4. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    PubMed

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). 
Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems.

  5. Multiwavelength Thermometry at High Temperature: Why It is Advantageous to Work in the Ultraviolet

    NASA Astrophysics Data System (ADS)

    Girard, F.; Battuello, M.; Florio, M.

    2014-07-01

    In principle, multiwavelength radiation thermometry allows one to correctly measure the temperature of surfaces of unknown and varying surface emissivity. Unfortunately, none of the practical realizations proposed in the past proved to be sufficiently reliable because of a strong influence of the errors arising from incorrect modeling of the emissivity and of the limited number of operating wavelengths. The use of array detectors allows a high degree of flexibility both in terms of the number and the spectral position of the working wavelength bands. In the case of applications at high temperatures, i.e., near 2000 °C or above, an analysis of the theoretical measuring principles of multiwavelength thermometry suggests investigating the possible advantages of extending the operating wavelengths toward the ultraviolet region. To this purpose, a simulation program was developed which allows investigation of the effect of different influencing parameters. This paper presents a brief theoretical introduction and a practical analysis of the method. The best choices in terms of the different influencing parameters are derived, and data from the simulation of both real materials and fictitious emissivity curves have been studied and analyzed with different emissivity models to check the robustness of the method.

  6. Model Checking Temporal Logic Formulas Using Sticker Automata

    PubMed Central

    Feng, Changwei; Wu, Huanmei

    2017-01-01

    As an important complex problem, the temporal logic model checking problem is still far from being fully resolved under the circumstance of DNA computing, especially for Computation Tree Logic (CTL), Interval Temporal Logic (ITL), and Projection Temporal Logic (PTL), because there is still a lack of approaches for DNA model checking. To address this challenge, a model checking method is proposed for checking the basic formulas of the above three temporal logic types with DNA molecules. First, one type of single-stranded DNA molecule is employed to encode the Finite State Automaton (FSA) model of the given basic formula so that a sticker automaton is obtained. Meanwhile, other single-stranded DNA molecules are employed to encode the given system model so that the input strings of the sticker automaton are obtained. Next, a series of biochemical reactions are conducted between the above two types of single-stranded DNA molecules, after which it can be decided whether or not the system satisfies the formula. As a result, we have developed a DNA-based approach for checking all the basic formulas of CTL, ITL, and PTL. The simulated results demonstrate the effectiveness of the new method. PMID:29119114
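The acceptance check that the sticker automaton performs biochemically is, abstractly, ordinary finite-automaton execution over the encoded input strings. The sketch below simulates that check in software; the states, alphabet, and the "p until q" property are illustrative choices, not taken from the paper.

```python
# Hypothetical sketch of the acceptance check a sticker automaton
# implements: run a deterministic finite automaton over a word.
def fsa_accepts(transitions, start, accepting, word):
    """Return True iff the DFA accepts `word`."""
    state = start
    for symbol in word:
        key = (state, symbol)
        if key not in transitions:
            return False          # no transition defined: reject
        state = transitions[key]
    return state in accepting

# DFA for the property "p holds until q" over the letters {p, q}:
# accept any run of p's terminated by a single q.
t = {("s0", "p"): "s0", ("s0", "q"): "s1"}
print(fsa_accepts(t, "s0", {"s1"}, "pppq"))  # True
print(fsa_accepts(t, "s0", {"s1"}, "ppqp"))  # False
```

In the DNA setting, the transition table and the word are both encoded as single-stranded molecules, and hybridization reactions carry out this state-by-state walk in parallel.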

  7. Foundations of the Bandera Abstraction Tools

    NASA Technical Reports Server (NTRS)

    Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby

    2003-01-01

    Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.

  8. Full implementation of a distributed hydrological model based on check dam trapped sediment volumes

    NASA Astrophysics Data System (ADS)

    Bussi, Gianbattista; Francés, Félix

    2014-05-01

    Lack of hydrometeorological data is one of the most compelling limitations to the implementation of distributed environmental models. Mediterranean catchments, in particular, are characterised by high spatial variability of meteorological phenomena and soil characteristics, which may prevent transferring model calibrations from a fully gauged catchment to a totally or partially ungauged one. For this reason, new sources of data are required in order to extend the use of distributed models to non-monitored or poorly monitored areas. An important source of information regarding the hydrological and sediment cycle is represented by sediment deposits accumulated at the bottom of reservoirs. Since the 1960s, reservoir sedimentation volumes have been used as proxy data for the estimation of inter-annual total sediment yield rates or, in more recent years, as a reference measure of sediment transport for sediment model calibration and validation. Nevertheless, the possibility of using such data for constraining the calibration of a hydrological model has not been exhaustively investigated so far. In this study, the use of nine check dam reservoir sedimentation volumes for hydrological and sedimentological model calibration and spatio-temporal validation was examined. Check dams are common structures in Mediterranean areas, and are a potential source of spatially distributed information regarding both the hydrological and the sediment cycle. In this case study, the TETIS hydrological and sediment model was implemented in a medium-size Mediterranean catchment (Rambla del Poyo, Spain) by taking advantage of sediment deposits accumulated behind the check dams located in the catchment headwaters. Reservoir trap efficiency was taken into account by coupling the TETIS model with a pond trap efficiency model. The model was calibrated by adjusting some of its parameters in order to reproduce the total sediment volume accumulated behind a check dam. 
Then, the model was spatially validated by obtaining the simulated sedimentation volume at the other eight check dams and comparing it to the observed sedimentation volumes. Lastly, the simulated water discharge at the catchment outlet was compared with observed water discharge records in order to check the behaviour of the hydrological sub-model. Model results provided highly valuable information concerning the spatial distribution of soil erosion and sediment transport. Spatial validation of the sediment sub-model provided very good results at seven check dams out of nine. This study shows that check dams can also be a useful tool for constraining hydrological model calibration, as model results agree with water discharge observations. In fact, the hydrological model validation at a downstream water flow gauge obtained a Nash-Sutcliffe efficiency of 0.8. This technique is applicable to all catchments where check dams are present, and only requires rainfall and temperature data and soil characteristics maps.
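The Nash-Sutcliffe efficiency cited for the hydrological validation is a standard skill score and is simple to compute. The discharge series below are made-up numbers for illustration, not data from the study.

```python
# Nash-Sutcliffe efficiency: 1 = perfect fit; values <= 0 mean the model
# is no better than predicting the mean of the observations.
def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs = [1.2, 3.4, 8.9, 4.1, 2.0]   # observed discharge (illustrative)
sim = [1.0, 3.0, 9.5, 4.4, 2.2]   # simulated discharge (illustrative)
print(round(nash_sutcliffe(obs, sim), 3))  # → 0.981
```

An efficiency of 0.8, as reported in the abstract, is conventionally regarded as a good fit for daily discharge simulation.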

  9. HAARP-Induced Ionospheric Ducts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milikh, Gennady; Vartanyan, Aram

    2011-01-04

    It is well known that strong electron heating by a powerful HF facility can lead to the formation of electron and ion density perturbations that stretch along the magnetic field line. Those density perturbations can serve as ducts for ELF waves, both of natural and artificial origin. This paper presents observations of the plasma density perturbations caused by HF heating of the ionosphere by the HAARP facility. The low-orbit satellite DEMETER was used as a diagnostic tool to measure the electron and ion temperature and density along the satellite orbit overflying close to the magnetic zenith of the HF heater. Those observations will then be checked against the theoretical model of duct formation due to HF heating of the ionosphere. The model is based on the modified SAMI2 code, and is validated by comparison with well documented experiments.

  10. Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Johan, E-mail: anderson.johan@gmail.com; Halpern, Federico D.; Ricci, Paolo

    The turbulence observed in the scrape-off layer of a tokamak is often characterized by intermittent events of a bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It thus appears necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of the tails of the probability distribution functions. The method followed here is to generate statistical information from time traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.

  11. Mock X-ray Observations of Localized LMC Outflows

    NASA Astrophysics Data System (ADS)

    Tomesh, Teague; Bustard, Chad; Zweibel, Ellen

    2018-01-01

    The Milky Way’s nearest neighbor, the Large Magellanic Cloud (LMC), is a perfect testing ground for modeling a variety of astrophysical phenomena. Specifically, the LMC provides a unique opportunity for the study of possible localized outflows driven by star formation and their x-ray signatures. We have developed FLASH simulations of theoretical outflows originating in the LMC that we have used to generate predicted observations of X-ray luminosity. This X-ray emission can be a useful probe of the hot gas in these winds which may couple to the cool gas and drive it from the disk. Future observations of the LMC may provide us with valuable checks on our model. This work is partially supported by the National Science Foundation (NSF) Graduate Research Fellowship Program under grant No. DGE-125625 and NSF grant No. AST-1616037.

  12. Modeling the focusing efficiency of lobster-eye optics for image shifting depending on the soft x-ray wavelength.

    PubMed

    Su, Luning; Li, Wei; Wu, Mingxuan; Su, Yun; Guo, Chongling; Ruan, Ningjuan; Yang, Bingxin; Yan, Feng

    2017-08-01

    Lobster-eye optics is widely applied to space x-ray detection missions and x-ray security checks for its wide field of view and low weight. This paper presents a theoretical model to obtain spatial distribution of focusing efficiency based on lobster-eye optics in a soft x-ray wavelength. The calculations reveal the competition mechanism of contributions to the focusing efficiency between the geometrical parameters of lobster-eye optics and the reflectivity of the iridium film. In addition, the focusing efficiency image depending on x-ray wavelengths further explains the influence of different geometrical parameters of lobster-eye optics and different soft x-ray wavelengths on focusing efficiency. These results could be beneficial to optimize parameters of lobster-eye optics in order to realize maximum focusing efficiency.

  13. Towards Symbolic Model Checking for Multi-Agent Systems via OBDDs

    NASA Technical Reports Server (NTRS)

    Raimondi, Franco; Lomuscio, Alessio

    2004-01-01

    We present an algorithm for model checking temporal-epistemic properties of multi-agent systems, expressed in the formalism of interpreted systems. We first introduce a technique for the translation of interpreted systems into boolean formulae, and then present a model-checking algorithm based on this translation. The algorithm is based on OBDDs, which offer a compact and efficient representation for boolean formulae.
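The engine behind such OBDD-based checkers is a fixpoint computation over sets of states. The sketch below runs the least-fixpoint iteration for the CTL operator EF over an explicit transition relation; a real symbolic checker would represent these state sets as OBDDs rather than Python sets, and the tiny transition system is invented for illustration.

```python
# Fixpoint computation behind symbolic model checking:
# EF(goal) = states from which some path reaches `goal`.
def check_EF(trans, goal):
    reach = set(goal)
    while True:
        # predecessors of the current set under the transition relation
        pre = {s for (s, t) in trans if t in reach}
        new = reach | pre
        if new == reach:
            return reach          # least fixpoint found
        reach = new

trans = {(0, 1), (1, 2), (3, 3)}  # state 3 is a sink loop, off the path
print(sorted(check_EF(trans, {2})))  # [0, 1, 2]
```

Epistemic operators are handled analogously, by taking predecessors under an agent's indistinguishability relation instead of the temporal transition relation.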

  14. An analytic solution for numerical modeling validation in electromagnetics: the resistive sphere

    NASA Astrophysics Data System (ADS)

    Swidinsky, Andrei; Liu, Lifei

    2017-11-01

    We derive the electromagnetic response of a resistive sphere to an electric dipole source buried in a conductive whole space. The solution consists of an infinite series of spherical Bessel functions and associated Legendre polynomials, and follows the well-studied problem of a conductive sphere buried in a resistive whole space in the presence of a magnetic dipole. Our result is particularly useful for controlled-source electromagnetic problems using a grounded electric dipole transmitter and can be used to check numerical methods of calculating the response of resistive targets (such as finite difference, finite volume, finite element and integral equation). While we elect to focus on the resistive sphere in our examples, the expressions in this paper are completely general and allow for arbitrary source frequency, sphere radius, transmitter position, receiver position and sphere/host conductivity contrast so that conductive target responses can also be checked. Commonly used mesh validation techniques consist of comparisons against other numerical codes, but such solutions may not always be reliable or readily available. Alternatively, the response of simple 1-D models can be tested against well-known whole space, half-space and layered earth solutions, but such an approach is inadequate for validating models with curved surfaces. We demonstrate that our theoretical results can be used as a complementary validation tool by comparing analytic electric fields to those calculated through a finite-element analysis; the software implementation of this infinite series solution is made available for direct and immediate application.

  15. Checking of individuality by DNA profiling.

    PubMed

    Brdicka, R; Nürnberg, P

    1993-08-25

    A review of methods of DNA analysis used in forensic medicine for identification, paternity testing, etc. is provided. Among other techniques, DNA fingerprinting using different probes and polymerase chain reaction-based techniques such as amplified sequence polymorphisms and minisatellite variant repeat mapping are thoroughly described and both theoretical and practical aspects are discussed.

  16. An educational partnership in health promotion for pre-registration nurses and further education college students.

    PubMed

    Abbott, Stephen; Thomas, Nicki; Apau, Daniel; Benato, Rosa; Hicks, Siobhan; MacKenzie, Karin

    2012-07-01

    This paper describes a partnership between a university and a college of further education, whereby first-year nursing students administered health checks to college students. Despite many challenges, the experience was positive for both sets of students and has been mainstreamed. Many lessons were learnt about how best to support nursing students to ensure a good quality experience for both student groups. Data gained from the health checks are also presented, and the programme is compared with the brief community placement that previous nursing students had undertaken at this stage of their training. Theoretical underpinnings for the programme are discussed.

  17. PyXRD v0.6.7: a free and open-source program to quantify disordered phyllosilicates using multi-specimen X-ray diffraction profile fitting

    NASA Astrophysics Data System (ADS)

    Dumon, M.; Van Ranst, E.

    2016-01-01

    This paper presents a free and open-source program called PyXRD (short for Python X-ray diffraction) to improve the quantification of complex, poly-phasic mixed-layer phyllosilicate assemblages. The validity of the program was checked by comparing its output with Sybilla v2.2.2, which shares the same mathematical formalism. The novelty of this program is the ab initio incorporation of the multi-specimen method, making it possible to share phases and (a selection of) their parameters across multiple specimens. PyXRD thus allows for modelling multiple specimens side by side, and this approach speeds up the manual refinement process significantly. To check the hypothesis that this multi-specimen set-up - as it effectively reduces the number of parameters and increases the number of observations - can also improve automatic parameter refinements, we calculated X-ray diffraction patterns for four theoretical mineral assemblages. These patterns were then used as input for one refinement employing the multi-specimen set-up and one employing the single-pattern set-ups. For all of the assemblages, PyXRD was able to reproduce or approximate the input parameters with the multi-specimen approach. Diverging solutions only occurred in single-pattern set-ups, which do not contain enough information to discern all minerals present (e.g. patterns of heated samples). Assuming a correct qualitative interpretation was made and a single pattern exists in which all phases are sufficiently discernible, the obtained results indicate a good quantification can often be obtained with just that pattern. However, these results from theoretical experiments cannot automatically be extrapolated to all real-life experiments. In any case, PyXRD has proven to be useful when X-ray diffraction patterns are modelled for complex mineral assemblages containing mixed-layer phyllosilicates with a multi-specimen approach.

  18. Dynamic Analysis and Control of Lightweight Manipulators with Flexible Parallel Link Mechanisms. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Lee, Jeh Won

    1990-01-01

    The objective is the theoretical analysis and the experimental verification of the dynamics and control of a two-link flexible manipulator with a flexible parallel link mechanism. Nonlinear equations of motion of the lightweight manipulator are derived by the Lagrangian method in symbolic form to better understand the structure of the dynamic model. The resulting equations of motion have a structure which is useful for reducing the number of terms calculated, for checking correctness, and for extending the model to higher order. A manipulator with a flexible parallel link mechanism is a constrained dynamic system whose equations are sensitive to numerical integration error. This constrained system is solved using singular value decomposition of the constraint Jacobian matrix. Elastic motion is expressed by the assumed mode method. Mode shape functions of each link are chosen using load-interfaced component mode synthesis. The discrepancies between the analytical model and the experiment are explained using a simplified and a detailed finite element model.

  19. ANSYS duplicate finite-element checker routine

    NASA Technical Reports Server (NTRS)

    Ortega, R.

    1995-01-01

    An ANSYS finite-element code routine to check for duplicated elements within the volume of a three-dimensional (3D) finite-element mesh was developed. The routine developed is used for checking floating elements within a mesh, identically duplicated elements, and intersecting elements with a common face. A space shuttle main engine alternate turbopump development high pressure oxidizer turbopump finite-element model check using the developed subroutine is discussed. Finally, recommendations are provided for duplicate element checking of 3D finite-element models.
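The simplest of the three checks described above, finding identically duplicated elements, amounts to detecting elements that share the same node set regardless of node ordering. The sketch below shows that check in isolation; the element connectivities are invented, and ANSYS itself is not involved.

```python
# Find identically duplicated finite elements: elements whose node sets
# are equal even if the node ordering differs.
def find_duplicates(elements):
    """Return (first_index, dup_index) pairs for duplicated elements."""
    seen = {}
    dups = []
    for idx, nodes in enumerate(elements):
        key = frozenset(nodes)          # order-insensitive node set
        if key in seen:
            dups.append((seen[key], idx))
        else:
            seen[key] = idx
    return dups

mesh = [(1, 2, 3, 4), (5, 6, 7, 8), (4, 3, 2, 1)]  # last duplicates first
print(find_duplicates(mesh))  # [(0, 2)]
```

Checks for floating elements and for intersecting elements with a common face require geometry, not just connectivity, and are correspondingly more involved.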

  20. Nonequilibrium Tricritical Point in a System with Long-Range Interactions

    NASA Astrophysics Data System (ADS)

    Antoniazzi, Andrea; Fanelli, Duccio; Ruffo, Stefano; Yamaguchi, Yoshiyuki Y.

    2007-07-01

    Systems with long-range interactions display a short-time relaxation towards quasistationary states whose lifetime increases with system size. With reference to the Hamiltonian mean field model, we here show that a maximum entropy principle, based on Lynden-Bell’s pioneering idea of “violent relaxation,” predicts the presence of out-of-equilibrium phase transitions separating the relaxation towards homogeneous (zero magnetization) or inhomogeneous (nonzero magnetization) quasistationary states. When varying the initial condition within a family of “water bags” with different initial magnetization and energy, first- and second-order phase transition lines are found that merge at an out-of-equilibrium tricritical point. Metastability is theoretically predicted and numerically checked around the first-order phase transition line.

  1. A projection operator method for the analysis of magnetic neutron form factors

    NASA Astrophysics Data System (ADS)

    Kaprzyk, S.; Van Laar, B.; Maniawski, F.

    1981-03-01

    A set of projection operators in matrix form has been derived on the basis of decomposition of the spin density into a series of fully symmetrized cubic harmonics. This set of projection operators allows a formulation of the Fourier analysis of magnetic form factors in a convenient way. The presented method is capable of checking the validity of various theoretical models used for spin density analysis up to now. The general formalism is worked out in explicit form for the fcc and bcc structures and deals with that part of spin density which is contained within the sphere inscribed in the Wigner-Seitz cell. This projection operator method has been tested on the magnetic form factors of nickel and iron.

  2. Analyzing fragment production in mass-asymmetric reactions as a function of density dependent part of symmetry energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaur, Amandeep; Deepshikha; Vinayak, Karan Singh

    2016-07-15

    We performed a theoretical investigation of different mass-asymmetric reactions to assess the direct impact of the density-dependent part of the symmetry energy on multifragmentation. The simulations are performed for a specific set of reactions having the same system mass and N/Z content, using the isospin-dependent quantum molecular dynamics model to estimate the quantitative dependence of fragment production on the mass-asymmetry factor (τ) for various symmetry energy forms. The dynamics associated with different mass-asymmetric reactions is explored and the direct role of symmetry energy is checked. A comparison with the experimental data (asymmetric reaction) is also presented for different equations of state (symmetry energy forms).

  3. Corrections on the Thermometer Reading in an Air Stream

    NASA Technical Reports Server (NTRS)

    Van Der Maas, H J; Wynia, S

    1940-01-01

    A method is described for checking a correction formula, based partly on theoretical considerations, for adiabatic compression and friction in flight tests, and for determining the value of the constant. It is necessary to apply a threefold correction to each thermometer reading: a correction for adiabatic compression, one for friction, and one for time lag.

  4. Model Checking the Remote Agent Planner

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

    This work tackles the problem of using Model Checking for the purpose of verifying the HSTS (Scheduling Testbed System) planning system. HSTS is the planner and scheduler of the remote agent autonomous control system deployed in Deep Space One (DS1). Model Checking allows for the verification of domain models as well as planning entries. We have chosen the real-time model checker UPPAAL for this work. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a sketch for the mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify.

  5. Random Testing and Model Checking: Building a Common Framework for Nondeterministic Exploration

    NASA Technical Reports Server (NTRS)

    Groce, Alex; Joshi, Rajeev

    2008-01-01

    Two popular forms of dynamic analysis, random testing and explicit-state software model checking, are perhaps best viewed as search strategies for exploring the state spaces introduced by nondeterminism in program inputs. We present an approach that enables this nondeterminism to be expressed in the SPIN model checker's PROMELA language, and then lets users generate either model checkers or random testers from a single harness for a tested C program. Our approach makes it easy to compare model checking and random testing for models with precisely the same input ranges and probabilities and allows us to mix random testing with model checking's exhaustive exploration of non-determinism. The PROMELA language, as intended in its design, serves as a convenient notation for expressing nondeterminism and mixing random choices with nondeterministic choices. We present and discuss a comparison of random testing and model checking. The results derive from using our framework to test a C program with an effectively infinite state space, a module in JPL's next Mars rover mission. More generally, we show how the ability of the SPIN model checker to call C code can be used to extend SPIN's features, and hope to inspire others to use the same methods to implement dynamic analyses that can make use of efficient state storage, matching, and backtracking.
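The central idea above, one harness describing nondeterministic inputs, explored either exhaustively (model-checking style) or by random sampling (testing style), can be illustrated without SPIN. The system under test below is a toy stand-in with an injected bug, not the JPL module.

```python
# One input harness, two search strategies over the same input space.
import itertools
import random

def system_under_test(a, b):
    assert not (a == 3 and b == 7), "injected bug"

CHOICES_A = range(5)
CHOICES_B = range(10)

def exhaustive():
    """Model-checking style: enumerate every input combination."""
    for a, b in itertools.product(CHOICES_A, CHOICES_B):
        try:
            system_under_test(a, b)
        except AssertionError:
            return (a, b)                # counterexample found
    return None

def random_testing(trials, seed=0):
    """Random-testing style: sample inputs; no exhaustiveness guarantee."""
    rng = random.Random(seed)
    for _ in range(trials):
        a, b = rng.choice(CHOICES_A), rng.choice(CHOICES_B)
        try:
            system_under_test(a, b)
        except AssertionError:
            return (a, b)
    return None

print(exhaustive())          # always finds (3, 7)
print(random_testing(1000))  # usually finds it too, with no guarantee
```

Because both strategies draw from the same declared choice sets, their detection rates can be compared fairly, which is exactly the point of expressing the nondeterminism once in a PROMELA harness.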

  6. CheckMATE 2: From the model to the limit

    NASA Astrophysics Data System (ADS)

    Dercks, Daniel; Desai, Nishita; Kim, Jong Soo; Rolbiecki, Krzysztof; Tattersall, Jamie; Weber, Torsten

    2017-12-01

    We present the latest developments to the CheckMATE program that allows models of new physics to be easily tested against the recent LHC data. To achieve this goal, the core of CheckMATE now contains over 60 LHC analyses of which 12 are from the 13 TeV run. The main new feature is that CheckMATE 2 now integrates the Monte Carlo event generation via MadGraph5_aMC@NLO and Pythia 8. This allows users to go directly from a SLHA file or UFO model to the result of whether a model is allowed or not. In addition, the integration of the event generation leads to a significant increase in the speed of the program. Many other improvements have also been made, including the possibility to now combine signal regions to give a total likelihood for a model.

  7. Criticality of the random field Ising model in and out of equilibrium: A nonperturbative functional renormalization group description

    NASA Astrophysics Data System (ADS)

    Balog, Ivan; Tarjus, Gilles; Tissier, Matthieu

    2018-03-01

    We show that, contrary to previous suggestions based on computer simulations or erroneous theoretical treatments, the critical points of the random-field Ising model out of equilibrium, when quasistatically changing the applied source at zero temperature, and in equilibrium are not in the same universality class below some critical dimension dD R≈5.1 . We demonstrate this by implementing a nonperturbative functional renormalization group for the associated dynamical field theory. Above dD R, the avalanches, which characterize the evolution of the system at zero temperature, become irrelevant at large distance, and hysteresis and equilibrium critical points are then controlled by the same fixed point. We explain how to use computer simulation and finite-size scaling to check the correspondence between in and out of equilibrium criticality in a far less ambiguous way than done so far.

  8. Spin wave Feynman diagram vertex computation package

    NASA Astrophysics Data System (ADS)

    Price, Alexander; Javernick, Philip; Datta, Trinanjan

    Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involve a laborious by hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and have another means to check the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions of a nearest neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model where non collinear terms contribute to the vertex interactions.

  9. Coverage Metrics for Model Checking

    NASA Technical Reports Server (NTRS)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
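A minimal example of the kind of metric involved: when exhaustive exploration is infeasible, one can at least report the fraction of the reachable state space a budget-limited search visited. The transition system below is invented for illustration.

```python
# State-coverage metric for partial exploration: fraction of reachable
# states visited by a bounded breadth-first search.
from collections import deque

def bounded_bfs(start, succ, max_states):
    seen = {start}
    q = deque([start])
    while q and len(seen) < max_states:
        for t in succ(q.popleft()):
            if t not in seen:
                seen.add(t)
                q.append(t)
    return seen

succ = lambda s: [(s + 1) % 100, (s * 2) % 100]   # toy transition relation
full = bounded_bfs(0, succ, 10**6)    # effectively exhaustive here
partial = bounded_bfs(0, succ, 25)    # budget-limited run
print(len(partial) / len(full))       # coverage achieved by the budget
```

Structural metrics borrowed from testing (statement, branch, or condition coverage of the program under analysis) play the same role when the state space itself cannot be enumerated.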

  10. Interaction of a weak shock wave with a discontinuous heavy-gas cylinder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Xiansheng; Yang, Dangguo; Wu, Junqiang

    2015-06-15

    The interaction between a cylindrical inhomogeneity and a weak planar shock wave is investigated experimentally and numerically, and special attention is given to the wave patterns and vortex dynamics in this scenario. A soap-film technique is realized to generate a well-controlled discontinuous cylinder (SF₆ surrounded by air) with no supports or wires in the shock-tube experiment. The symmetric evolving interfaces and few disturbance waves are observed in high-speed schlieren photography. Numerical simulations are also carried out for a detailed analysis. The refracted shock wave inside the cylinder is perturbed by the diffracted shock waves and divided into three branches. When these shock branches collide, shock focusing occurs. A nonlinear model is then proposed to elucidate the effects of the wave patterns on the evolution of the cylinder. A distinct vortex pair gradually develops during the shock-cylinder interaction. The numerical results show that a low-pressure region appears at the vortex core. Subsequently, the ambient fluid is entrained into the vortices, which are expanding at the same time. Based on the relation between the vortex motion and the circulation, several theoretical models of circulation in the literature are then checked against the experimental and numerical results. Most of these theoretical circulation models provide a reasonably good prediction of the vortex motion in the present configuration.

  11. A new car-following model for autonomous vehicles flow with mean expected velocity field

    NASA Astrophysics Data System (ADS)

    Wen-Xing, Zhu; Li-Dong, Zhang

    2018-02-01

    Due to the development of modern scientific technology, autonomous vehicles may be able to connect with each other and share the information collected by each vehicle. An improved forward-considering car-following model with a mean expected velocity field was proposed to describe autonomous vehicle flow behavior. The new model has three key parameters: adjustable sensitivity, strength factor, and mean expected velocity field size. Two lemmas and one theorem were proven as criteria for judging the stability of homogeneous autonomous vehicle flow. Theoretical results show that greater parameter values mean larger stability regions. A series of numerical simulations was carried out to check the stability and the fundamental diagram of autonomous flow. From the numerical simulation results, the profiles, hysteresis loops, and density waves of the autonomous vehicle flow were exhibited. The results show that with increased sensitivity, strength factor, or field size the traffic jam was suppressed effectively, in good accordance with the theoretical results. Moreover, the fundamental diagrams corresponding to the three parameters were obtained. They demonstrate that these parameters play almost the same role in traffic flux: before the critical density, the larger the parameter, the greater the flux; after the critical density, the tendency is the opposite. In general, the three parameters have a great influence on the stability and jam state of the autonomous vehicle flow.
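The numerical-simulation check described above can be sketched with a plain optimal-velocity car-following model, the textbook member of the model class the paper extends. The optimal-velocity function, ring length, and sensitivity below are generic choices, not the paper's mean-expected-velocity-field model.

```python
# Minimal optimal-velocity car-following simulation on a ring road.
import math

def optimal_velocity(gap):
    # classic tanh-shaped optimal-velocity function
    return math.tanh(gap - 2.0) + math.tanh(2.0)

def simulate(n_cars=10, road=40.0, a=2.0, dt=0.05, steps=2000):
    x = [road * i / n_cars for i in range(n_cars)]  # positions on the ring
    v = [0.0] * n_cars
    for _ in range(steps):
        gaps = [(x[(i + 1) % n_cars] - x[i]) % road for i in range(n_cars)]
        # dv/dt = a * (V(gap) - v): relax toward the optimal velocity
        v = [vi + a * (optimal_velocity(g) - vi) * dt
             for vi, g in zip(v, gaps)]
        x = [(xi + vi * dt) % road for xi, vi in zip(x, v)]
    return v

final_v = simulate()
# With uniform initial spacing the flow settles to the homogeneous
# steady state v = V(road / n_cars) for every car.
print(all(abs(vi - optimal_velocity(4.0)) < 1e-3 for vi in final_v))  # True
```

Stability analyses of the kind cited in the abstract perturb this homogeneous state and ask whether the perturbation grows (jam formation) or decays, as a function of the sensitivity a and the other model parameters.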

  12. Posterior Predictive Model Checking in Bayesian Networks

    ERIC Educational Resources Information Center

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  13. Analyzing the cost of screening selectee and non-selectee baggage.

    PubMed

    Virta, Julie L; Jacobson, Sheldon H; Kobza, John E

    2003-10-01

    Determining how to effectively operate security devices is as important to overall system performance as developing more sensitive security devices. In light of recent federal mandates for 100% screening of all checked baggage, this research studies the trade-offs between screening only selectee checked baggage and screening both selectee and non-selectee checked baggage for a single baggage screening security device deployed at an airport. This trade-off is represented using a cost model that incorporates the cost of the baggage screening security device, the volume of checked baggage processed through the device, and the outcomes that occur when the device is used. The cost model captures the cost of deploying, maintaining, and operating a single baggage screening security device over a one-year period. The study concludes that as excess baggage screening capacity is used to screen non-selectee checked bags, the expected annual cost increases, the expected annual cost per checked bag screened decreases, and the expected annual cost per expected number of threats detected in the checked bags screened increases. These results indicate that the marginal increase in security per dollar spent is significantly lower when non-selectee checked bags are screened than when only selectee checked bags are screened.
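
    The structure of such a cost trade-off can be sketched as follows; the function, parameter names and all numbers are illustrative assumptions, not the paper's cost data or notation:

```python
def screening_costs(streams, p_detect, device_cost, cost_per_bag):
    """Expected annual cost figures for one baggage-screening device.

    streams: list of (n_bags, p_threat) pairs, one per baggage class
    (e.g. selectee vs. non-selectee). Returns (total annual cost,
    cost per bag screened, cost per expected detected threat)."""
    n_total = sum(n for n, _ in streams)
    annual_cost = device_cost + cost_per_bag * n_total     # deploy/maintain + operate
    detected = sum(n * p * p_detect for n, p in streams)   # expected threats detected
    return annual_cost, annual_cost / n_total, annual_cost / detected
```

    Adding lower-risk non-selectee bags amortises the fixed device cost over more bags (cost per bag falls) while diluting the average threat rate (cost per detected threat rises), which is the direction of the trade-off the study reports.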

  14. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new analyzer that has an ambient pressure or “soft vacuum” reaction chamber prior to initial use. Additionally...

  15. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new analyzer that has an ambient pressure or “soft vacuum” reaction chamber prior to initial use. Additionally...

  16. Ethylenediammonium dication: H-bonded complexes with terephthalate, chloroacetate, phosphite, selenite and sulfamate anions. Detailed vibrational spectroscopic and theoretical studies of ethylenediammonium terephthalate

    NASA Astrophysics Data System (ADS)

    Marchewka, M. K.; Drozd, M.

    2012-12-01

    Crystalline complexes between the ethylenediammonium dication and terephthalate, chloroacetate, phosphite, selenite and sulfamate anions were obtained by slow evaporation from aqueous solution. Room-temperature powder infrared and Raman measurements were carried out. For ethylenediammonium terephthalate, theoretical calculations of the structure were performed in two ways: ab initio HF and semiempirical PM3. In this case the PM3 method gave the more accurate structure (closer to the X-ray results). Additional PM3 calculations of the vibrational spectra were performed. On the basis of the theoretical approach and earlier vibrational studies of similar compounds, vibrational assignments for the observed bands have been proposed. All compounds were checked for second harmonic generation (SHG).

  17. A complete VLBI delay model for deforming radio telescopes: the Effelsberg case

    NASA Astrophysics Data System (ADS)

    Artz, T.; Springer, A.; Nothnagel, A.

    2014-12-01

    Deformations of radio telescopes used in geodetic and astrometric very long baseline interferometry (VLBI) observations belong to the class of systematic error sources which require correction in data analysis. In this paper we present a model for all path length variations in the geometrical optics of radio telescopes which are due to gravitational deformation. The Effelsberg 100 m radio telescope of the Max Planck Institute for Radio Astronomy, Bonn, Germany, has been surveyed by various terrestrial methods. Thus, all necessary information needed to model the path length variations is available. Additionally, a ray tracing program has been developed which uses the parameters of the measured deformations as input to produce an independent check of the theoretical model. In this program, as well as in the theoretical model, the illumination function plays an important role because it serves as the weighting function for the individual path lengths depending on the distance from the optical axis. For the Effelsberg telescope, the biggest contribution to the total path length variations is the bending of the main beam located along the elevation axis, which partly carries the weight of the paraboloid at its vertex. The difference in total path length is almost 100 mm when comparing observations at 90° and at 0° elevation angle. The impact of the path length corrections is validated in a global VLBI analysis. The application of the correction model leads to a change in the vertical position of mm. This is more than the maximum path length, but the effect can be explained by the shape of the correction function.

  18. Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking

    NASA Technical Reports Server (NTRS)

    Turgeon, Gregory; Price, Petra

    2010-01-01

    A feasibility study was performed on a representative aerospace system to determine the following: (1) the benefits and limitations of using SCADE, a commercially available tool for model checking, in comparison with a proprietary tool that was studied previously [1]; and (2) metrics for performing the model checking and for assessing the findings. This study was performed independently of the development task by a group unfamiliar with the system, providing a fresh, external perspective free from development bias.

  19. MPST Software: grl_pef_check

    NASA Technical Reports Server (NTRS)

    Call, Jared A.; Kwok, John H.; Fisher, Forest W.

    2013-01-01

    This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory; see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process, to reduce the possibility of errors. This tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling to streamline downstream processes. This script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on giving users the information they need to evaluate a sequence quickly and efficiently, while still keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content prior to investing any more effort into the build. There are dozens of items in the modeling run that need to be checked, which is a time-consuming and error-prone task. Currently, no other software exists that provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.

  20. Economic inequality and mobility in kinetic models for social sciences

    NASA Astrophysics Data System (ADS)

    Letizia Bertotti, Maria; Modanese, Giovanni

    2016-10-01

    Statistical evaluations of the economic mobility of a society are more difficult than measurements of the income distribution, because they require following the evolution of individuals' incomes for at least one or two generations. In micro-to-macro theoretical models of economic exchanges based on kinetic equations, the income distribution depends only on the asymptotic equilibrium solutions, while mobility estimates also involve the detailed structure of the transition probabilities of the model, and are thus an important tool for assessing its validity. Empirical data show a remarkably general negative correlation between economic inequality and mobility, whose explanation is still unclear. It is therefore particularly interesting to study this correlation in analytical models. In previous work we investigated the behavior of the Gini inequality index in kinetic models as a function of several parameters which define the binary interactions and the taxation and redistribution processes: saving propensity, taxation rates gap, tax evasion rate, welfare means-testing, etc. Here, we check the correlation of mobility with inequality by analyzing the dependence of mobility on the same parameters. According to several numerical solutions, the correlation is confirmed to be negative.
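
    The Gini inequality index referred to above can be computed from a sampled income distribution; this is the standard estimator, shown here only to make the quantity concrete (it is not the paper's kinetic model itself):

```python
import numpy as np

def gini(incomes):
    """Gini index of an income sample: 0 for perfect equality, (n-1)/n when
    one individual holds all income. Uses the standard closed form over the
    sorted sample, equivalent to the normalised mean absolute difference."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    ranks = np.arange(1, n + 1)
    return 2.0 * np.sum(ranks * x) / (n * np.sum(x)) - (n + 1) / n
```

    In the kinetic models above, this index is evaluated on the equilibrium income distribution, while mobility requires tracking individual trajectories through the transition probabilities.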

  1. Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking

    NASA Technical Reports Server (NTRS)

    Cavada, Roberto; Pecheur, Charles

    2003-01-01

    This document reports on the activities carried out during a four-week visit by Roberto Cavada to the NASA Ames Research Center. The main goal was to test the practical applicability of the proposed framework, in which a diagnosability problem is reduced to a Symbolic Model Checking problem. Section 2 contains a brief explanation of the major techniques currently used in Symbolic Model Checking, and how these techniques can be tuned to obtain good performance when using Model Checking tools. Diagnosability is performed on large and structured models of real plants. Section 3 describes how these plants are modeled, and how models can be simplified to improve the performance of Symbolic Model Checkers. Section 4 reports scalability results. Three test cases are briefly presented, and several parameters and techniques have been applied to those test cases in order to produce comparison tables. Furthermore, a comparison between several Model Checkers is reported. Section 5 summarizes the application of diagnosability verification to a real application. Several properties have been tested, and the results have been highlighted. Finally, Section 6 draws some conclusions and outlines future lines of research.

  2. Simulation of RIRS in soft cadavers: a novel training model by the Cadaveric Research On Endourology Training (CRET) Study Group.

    PubMed

    Huri, Emre; Skolarikos, Andreas; Tatar, İlkan; Binbay, Murat; Sofikerim, Mustafa; Yuruk, Emrah; Karakan, Tolga; Sargon, Mustafa; Demiryurek, Deniz; Miano, Roberto; Bagcioglu, Murat; Ezer, Mehmet; Cracco, Cecilia Maria; Scoffone, Cesare Marco

    2016-05-01

    The aim of the current study was to evaluate the use of fresh-frozen together with embalmed cadavers as initial training models for flexible ureteroscopy (fURS) in a group of urologists inexperienced in retrograde intrarenal surgery (RIRS). Twelve urologists involved in a cadaveric fURS training course were enrolled in this prospective study. All the participants were inexperienced in fURS. Theoretical lectures and step-by-step tips-and-tricks video presentations on fURS were used to incorporate the technical background of the procedure into the hands-on training course and to standardize the operating steps of the procedure. An 8-item survey was administered to the participants at the start and at the end of the course. Pre- and post-training scores were similar for each question. All the participants successfully completed the hands-on training tasks. The mean pre-training duration [3.56 ± 2.0 min (range 1.21-7.46)] was significantly higher than the mean post-training duration [1.76 ± 1.54 min (range 1.00-6.34)] (p = 0.008). At the end of the day, the trainers checked the integrity of the collecting system both by endoscopy and by fluoroscopy and could not detect any injury of the upper ureteral wall or pelvicalyceal structures. The functionality of the scopes was also checked, and no scope injury (including a reduction in deflection capacity) was noted. The fURS simulation training model using soft human cadavers has the unique advantage of closely mimicking living human tissues. This similarity makes this model one of the best, if not the perfect, simulators for effective endourologic training.

  3. "Are They Just Checking Our Obesity or What?" The Healthism Discourse and Rural Young Women

    ERIC Educational Resources Information Center

    Lee, Jessica; Macdonald, Doune

    2010-01-01

    This paper makes use of critical discourse analysis and Bourdieu's theoretical framework to explore rural young women's meanings of health and fitness and how the healthism discourse is perpetuated through their experiences in school physical education (PE). The young women's own meanings are explored alongside interview data from their school PE…

  4. New muonic-atom test of vacuum polarization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixit, M.S.; Carter, A.L.; Hincks, E.P.

    1975-12-15

    In order to check the discrepancy between calculation and experiment in muonic atoms, we have remeasured the 5g-4f transitions in Pb and the 5g-4f and the 4f-3d transitions in Ba. Our new results show no discrepancy and confirm recent theoretical calculations of vacuum polarization to within 0.5%. (AIP)

  5. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques for addressing the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning over concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite-state models of software, and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework, and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.

  6. Incremental checking of Master Data Management model based on contextual graphs

    NASA Astrophysics Data System (ADS)

    Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan

    2015-10-01

    The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed within the scope of a Model Driven Engineering (MDE) approach, which is used to develop a Master Data Management (MDM) field represented by XML Schema models. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data-model views. Description logics specify the constraints that the models have to satisfy. An experiment with the approach is presented through an application developed in the ArgoUML IDE.

  7. Model Checking - My 27-Year Quest to Overcome the State Explosion Problem

    NASA Technical Reports Server (NTRS)

    Clarke, Ed

    2009-01-01

    Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s, in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine whether the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state explosion problem.
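
    The exhaustive search with counterexample generation described above can be illustrated with a minimal explicit-state safety checker. This is a generic sketch of the idea, not Clarke's algorithms; the function names and the toy transition system are invented for illustration:

```python
from collections import deque

def check_safety(initial, successors, is_bad):
    """Explicit-state safety check: exhaustively explore the reachable states
    by breadth-first search. Returns None if no bad state is reachable (the
    property holds), otherwise a counterexample trace from an initial state
    to the first bad state found."""
    frontier = deque((s, (s,)) for s in initial)
    visited = set(initial)
    while frontier:
        state, trace = frontier.popleft()
        if is_bad(state):
            return trace                   # counterexample execution trace
        for nxt in successors(state):
            if nxt not in visited:         # each state is expanded once
                visited.add(nxt)
                frontier.append((nxt, trace + (nxt,)))
    return None                            # specification holds
```

    Real model checkers replace the explicit `visited` set with symbolic representations (e.g. BDDs) precisely to combat the state-explosion problem the talk discusses.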

  8. A Multidimensional Item Response Model: Constrained Latent Class Analysis Using the Gibbs Sampler and Posterior Predictive Checks.

    ERIC Educational Resources Information Center

    Hoijtink, Herbert; Molenaar, Ivo W.

    1997-01-01

    This paper shows that a certain class of constrained latent class models may be interpreted as a special case of nonparametric multidimensional item response models. Parameters of this latent class model are estimated using an application of the Gibbs sampler, and model fit is investigated using posterior predictive checks. (SLD)

  9. Construct validity and reliability of the Single Checking Administration of Medications Scale.

    PubMed

    O'Connell, Beverly; Hawkins, Mary; Ockerby, Cherene

    2013-06-01

    Research indicates that single checking of medications is as safe as double checking; however, many nurses are averse to independently checking medications. To assist with the introduction and use of single checking, a measure of nurses' attitudes, the thirteen-item Single Checking Administration of Medications Scale (SCAMS) was developed. We examined the psychometric properties of the SCAMS. Secondary analyses were conducted on data collected from 503 nurses across a large Australian health-care service. Analyses using exploratory and confirmatory factor analyses supported by structural equation modelling resulted in a valid twelve-item SCAMS containing two reliable subscales, the nine-item Attitudes towards single checking and three-item Advantages of single checking subscales. The SCAMS is recommended as a valid and reliable measure for monitoring nurses' attitudes to single checking prior to introducing single checking medications and after its implementation. © 2013 Wiley Publishing Asia Pty Ltd.

  10. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of the model checking approach. A novel method of modeling and simulating biological systems with the model checking approach is proposed, based on hybrid functional Petri nets with extension (HFPNe) as a framework dealing with both discrete and continuous events. First, we construct a quantitative VPC fate model with 1761 components using HFPNe. Second, we apply two major biological fate determination rules, Rule I and Rule II, to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to understand with a discrete model, because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986.
Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to the high coverage of predicted fate patterns (except for the genotype of the lin-15ko; lin-12ko double mutant). More insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and better understandings of biological systems and observation data that may be hard to capture with the qualitative one.

  11. The Priority Inversion Problem and Real-Time Symbolic Model Checking

    DTIC Science & Technology

    1993-04-23

    real time systems unpredictable in subtle ways. This makes it more difficult to implement and debug such systems. Our work discusses this problem and presents one possible solution. The solution is formalized and verified using temporal logic model checking techniques. In order to perform the verification, the BDD-based symbolic model checking algorithm given in previous works was extended to handle real-time properties using the bounded until operator. We believe that this algorithm, which is based on discrete time, is able to handle many real-time properties

  12. Design of Installing Check Dam Using RAMMS Model in Seorak National Park of South Korea

    NASA Astrophysics Data System (ADS)

    Jun, K.; Tak, W.; JUN, B. H.; Lee, H. J.; KIM, S. D.

    2016-12-01

    As more than 64% of the land in South Korea is mountainous, many regions are exposed to the danger of landslides and debris flow. It is therefore important to understand the behavior of debris flow in mountainous terrain, and various methods and models based on mathematical concepts are being developed and presented. The purpose of this study is to investigate regions that experienced debris flow due to Typhoon Ewiniar and to perform numerical modeling for the design and layout of check dams to reduce the damage caused by debris flow. For the numerical modeling, on-site measurement of the research area was conducted, including topographic investigation, a survey of the bridges downstream, and precision LiDAR 3D scanning to compose the basic data for the numerical model. The numerical simulation in this study was performed using the RAMMS (Rapid Mass Movements Simulation) model for the analysis of the debris flow. The model was applied to check dam configurations installed in the upstream, midstream, and downstream sections. Considering the reduction effect on the debris flow, the expansion of the debris flow, and the influence on the bridges downstream, a proper location for the check dam was designated. 
The numerical results showed that when the check dam was installed in the downstream section, 50 m above the bridge, the reduction effect on the debris flow was higher than when the check dam was installed in other sections. Key words: Debris flow, LiDAR, Check dam, RAMMS. Acknowledgements: This research was supported by a grant [MPSS-NH-2014-74] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.

  13. Model building strategy for logistic regression: purposeful selection.

    PubMed

    Zhang, Zhongheng

    2016-03-01

    Logistic regression is one of the most commonly used models to account for confounders in the medical literature. This article introduces how to perform the purposeful selection model building strategy with R. I stress the use of the likelihood ratio test to see whether deleting a variable will have a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjustment of the remaining covariates. Interactions should be checked to disentangle complex relationships between covariates and their synergistic effect on the response variable. The model should be checked for goodness-of-fit (GOF), in other words, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for logistic regression models.
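
    The likelihood ratio test for variable deletion can be sketched from scratch as follows. This is a Python illustration of the same idea the article carries out in R, not the article's code; the fit uses Newton-Raphson, and the p-value formula assumes exactly one dropped variable (1 degree of freedom), for which the chi-square tail is `erfc(sqrt(G/2))`:

```python
import numpy as np
from math import erfc, sqrt

def fit_logistic(X, y, iters=50):
    """Maximum-likelihood logistic regression via Newton-Raphson.
    Adds an intercept column; returns (coefficients, maximised log-likelihood)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - p))
    p = np.clip(1.0 / (1.0 + np.exp(-X @ beta)), 1e-12, 1 - 1e-12)
    return beta, np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def lr_test(X_full, X_reduced, y):
    """Likelihood ratio test G = 2*(ll_full - ll_reduced) for dropping the
    one column absent from X_reduced (so df = 1 is assumed here)."""
    _, ll_full = fit_logistic(X_full, y)
    _, ll_red = fit_logistic(X_reduced, y)
    G = max(2.0 * (ll_full - ll_red), 0.0)
    return G, erfc(sqrt(G / 2.0))   # chi-square(1) upper-tail p-value
```

    A small p-value means the deleted variable significantly worsens model fit and should be kept, which is the decision rule purposeful selection applies at each step.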

  14. HiVy automated translation of stateflow designs for model checking verification

    NASA Technical Reports Server (NTRS)

    Pingree, Paula

    2003-01-01

    The HiVy tool set enables model checking of finite state machine designs. This is achieved by translating state-chart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as the tool set's intermediate format.

  15. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation

    PubMed Central

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses, since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm, or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm's spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why in this study we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Delayed-Quarantined-Recovered (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results validate the simulation results well, which fully supports the proposed framework. PMID:26713449

  16. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation.

    PubMed

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses, since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm, or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm's spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why in this study we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Delayed-Quarantined-Recovered (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results validate the simulation results well, which fully supports the proposed framework.
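
    A compartmental worm model of this kind can be simulated as a Continuous Time Markov Chain with the Gillespie algorithm. The sketch below uses a plain SEIR chain with illustrative rate constants, not the paper's SEIDQR(S/I) model or quarantine strategy:

```python
import random

def seir_gillespie(n, beta, sigma, gamma, t_max, seed=1):
    """Gillespie (exact CTMC) simulation of SEIR worm propagation among n hosts.
    Compartments: S susceptible, E exposed, I infectious, R recovered.
    beta: infection rate, sigma: latency-end rate, gamma: recovery rate
    (all illustrative). Returns the final (S, E, I, R) counts."""
    rng = random.Random(seed)
    S, E, I, R = n - 1, 0, 1, 0          # start with one infectious host
    t = 0.0
    while t < t_max and (E > 0 or I > 0):
        rates = [beta * S * I / n,       # S -> E (a host gets infected)
                 sigma * E,              # E -> I (latency ends)
                 gamma * I]              # I -> R (host patched/recovered)
        total = sum(rates)
        t += rng.expovariate(total)      # exponential waiting time
        u = rng.random() * total         # pick the next event by its rate
        if u < rates[0]:
            S, E = S - 1, E + 1
        elif u < rates[0] + rates[1]:
            E, I = E - 1, I + 1
        else:
            I, R = I - 1, R + 1
    return S, E, I, R
```

    Running many such trajectories approximates the CTMC distribution against which temporal-logic properties (e.g. "the infectious population eventually reaches zero") can be checked probabilistically.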

  17. Arbitrary-order corrections for finite-time drift and diffusion coefficients

    NASA Astrophysics Data System (ADS)

    Anteneodo, C.; Riera, R.

    2009-09-01

    We address a standard class of diffusion processes with linear drift and quadratic diffusion coefficients. These contributions to the dynamic equations can be drawn directly from data time series. However, real data are constrained to finite sampling rates, and it is therefore crucial to establish a suitable mathematical description of the required finite-time corrections. Based on Itô-Taylor expansions, we present the exact corrections to the finite-time drift and diffusion coefficients. These results allow one to reconstruct the real hidden coefficients from the empirical estimates. We also derive higher-order finite-time expressions for the third and fourth conditional moments that furnish extra theoretical checks for this class of diffusion models. The analytical predictions are compared with the numerical outcomes of representative artificial time series.
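
    The finite-time effect and its correction can be illustrated for the simplest member of this class: linear drift with constant diffusion (an Ornstein-Uhlenbeck process). The estimator below is a naive sketch of the usual conditional-moment recipe, not the paper's general Itô-Taylor result, and only the linear-drift slope is corrected (the inversion is exact for that case):

```python
import numpy as np

def km_coefficients(x, dt):
    """Naive finite-time drift/diffusion estimates from a time series x:
    D1(x) ~ <dx>/dt and D2(x) ~ <dx^2>/(2 dt). Here the drift slope of a
    linear drift D1 = -a*x is fitted globally by least squares through
    the origin, and the diffusion is taken as constant."""
    dx = np.diff(x)
    xm = x[:-1]
    a_est = -np.sum(xm * dx) / (np.sum(xm ** 2) * dt)   # biased for finite dt
    d2_est = np.mean(dx ** 2) / (2.0 * dt)
    return a_est, d2_est

def corrected_drift_slope(a_est, dt):
    """Invert the exact finite-time relation for a linear drift: the naive
    estimator converges to (1 - exp(-a*dt))/dt, so the hidden slope is
    recovered as a = -ln(1 - a_est*dt)/dt."""
    return -np.log(1.0 - a_est * dt) / dt
```

    At coarse sampling the naive slope underestimates the true drift coefficient substantially; applying the correction recovers the hidden value, which is the point of the paper's exact finite-time expressions.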

  18. Immediate Effects of Body Checking Behaviour on Negative and Positive Emotions in Women with Eating Disorders: An Ecological Momentary Assessment Approach.

    PubMed

    Kraus, Nicole; Lindenberg, Julia; Zeeck, Almut; Kosfelder, Joachim; Vocks, Silja

    2015-09-01

    Cognitive-behavioural models of eating disorders state that body checking arises in response to negative emotions in order to reduce the aversive emotional state and is therefore negatively reinforced. This study empirically tests this assumption. For a seven-day period, women with eating disorders (n = 26) and healthy controls (n = 29) were provided with a handheld computer for assessing body checking strategies as they occurred, as well as negative and positive emotions. Serving as a control condition, randomized computer-emitted acoustic signals prompted reports on body checking and emotions. There was no difference in the intensity of negative emotions before body checking and in control situations across groups. However, from pre- to post-body checking, an increase in negative emotions was found. This effect was more pronounced in women with eating disorders compared with healthy controls. Results are contradictory to the assumptions of the cognitive-behavioural model, as body checking does not seem to reduce negative emotions. Copyright © 2015 John Wiley & Sons, Ltd and Eating Disorders Association.

  19. Gauge-transformation properties of cosmological observables and its application to the light-cone average

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jaiyul; Durrer, Ruth, E-mail: jyoo@physik.uzh.ch, E-mail: ruth.durrer@unige.ch

    Theoretical descriptions of observable quantities in cosmological perturbation theory should be independent of coordinate systems. This statement is often referred to as gauge-invariance of observable quantities, and the sanity of their theoretical description is verified by checking this gauge-invariance. We argue that cosmological observables are invariant scalars under diffeomorphisms and their theoretical description is gauge-invariant only at linear order in perturbations. Beyond linear order, they are usually not gauge-invariant, and we provide the general law for the gauge-transformation that the perturbation part of an observable does obey. We apply this finding to derive the second-order expression for the observational light-cone average in cosmology and demonstrate that our expression is indeed invariant under diffeomorphisms.

  20. Ethylenediammonium dication: H-bonded complexes with terephthalate, chloroacetate, phosphite, selenite and sulfamate anions. Detailed vibrational spectroscopic and theoretical studies of ethylenediammonium terephthalate.

    PubMed

    Marchewka, M K; Drozd, M

    2012-12-01

    Crystalline complexes between ethylenediammonium dication and terephthalate, chloroacetate, phosphite, selenite and sulfamate anions were obtained by slow evaporation from water solution method. Room temperature powder infrared and Raman measurements were carried out. For ethylenediammonium terephthalate theoretical calculations of structure were performed by two ways: ab-initio HF and semiempirical PM3. In this case the PM3 method gave more accurate structure (closer to X-ray results). The additional PM3 calculations of vibrational spectra were performed. On the basis theoretical approach and earlier vibrational studies of similar compounds the vibrational assignments for observed bands have been proposed. All compounds were checked for second harmonic generation (SHG). Copyright © 2012 Elsevier B.V. All rights reserved.

  1. The AB Doradus system revisited: The dynamical mass of AB Dor A/C

    NASA Astrophysics Data System (ADS)

    Azulay, R.; Guirado, J. C.; Marcaide, J. M.; Martí-Vidal, I.; Ros, E.; Tognelli, E.; Jauncey, D. L.; Lestrade, J.-F.; Reynolds, J. E.

    2017-10-01

    Context. The study of pre-main-sequence (PMS) stars with model-independent measurements of their masses is essential to check the validity of theoretical models of stellar evolution. The well-known PMS binary AB Dor A/C is an important benchmark for this task, since it displays intense and compact radio emission, which makes possible the application of high-precision astrometric techniques to this system. Aims: We aim to revisit the dynamical masses of the components of AB Dor A/C to refine earlier comparisons between the measurements of stellar parameters and the predictions of stellar models. Methods: We observed in phase-reference mode the binary AB Dor A/C, 0.2'' separation, with the Australian Long Baseline Array at 8.4 GHz. The astrometric information resulting from our observations was analyzed along with previously reported VLBI, optical (Hipparcos), and infrared measurements. Results: The main star AB Dor A is clearly detected in all the VLBI observations, which allowed us to analyze the orbital motion of the system and to obtain model-independent dynamical masses of 0.90 ± 0.08 M⊙ and 0.090 ± 0.008 M⊙, for AB Dor A and AB Dor C, respectively. Comparisons with PMS stellar evolution models favor an age of 40-50 Myr for AB Dor A and of 25-120 Myr for AB Dor C. Conclusions: We show that the orbital motion of the AB Dor A/C system is remarkably well determined, leading to precise estimates of the dynamical masses. Comparison of our results with the predictions of evolutionary models supports the observational evidence that theoretical models tend to slightly underestimate the masses of low-mass stars.

  2. Model Diagnostics for Bayesian Networks

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2006-01-01

    Bayesian networks are frequently used in educational assessments, primarily for learning about students' knowledge and skills. There is, however, little published work on assessing the fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess the fit of simple Bayesian networks. A…
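    Posterior predictive model checking compares replicated data drawn from the fitted model against the observed data. A minimal, self-contained sketch with a conjugate beta-binomial model (a toy stand-in, far simpler than a Bayesian network; all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# observed data: 30 correct responses out of 50 items
y_obs, n = 30, 50

# Beta(1, 1) prior is conjugate: posterior is Beta(1 + y, 1 + n - y)
post_p = rng.beta(1 + y_obs, 1 + n - y_obs, size=5000)

# replicate the data under the fitted model (posterior predictive draws)
y_rep = rng.binomial(n, post_p)

# posterior predictive p-value for the test statistic T(y) = y;
# values near 0 or 1 flag misfit, mid-range values are consistent with the model
ppp = np.mean(y_rep >= y_obs)
print(ppp)
```

    For a Bayesian network the same recipe applies, except that the test statistic is chosen to probe structure (for example, item-pair associations) and the posterior draws typically come from MCMC rather than a closed form.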

  3. The dopamine D2/D3 receptor agonist quinpirole increases checking-like behaviour in an operant observing response task with uncertain reinforcement: A novel possible model of OCD?

    PubMed Central

    Eagle, Dawn M.; Noschang, Cristie; d’Angelo, Laure-Sophie Camilla; Noble, Christie A.; Day, Jacob O.; Dongelmans, Marie Louise; Theobald, David E.; Mar, Adam C.; Urcelay, Gonzalo P.; Morein-Zamir, Sharon; Robbins, Trevor W.

    2014-01-01

    Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an ‘observing’ lever for information about the location of an ‘active’ lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. 
PMID:24406720

  4. Perpetual Model Validation

    DTIC Science & Technology

    2017-03-01

    considered using indirect models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to tackle the... deterioration, for example, no longer corresponds to the model used during verification time. Finally, the research looked at ways to combine hybrid systems

  5. Anomalous cross-modulation between microwave beams

    NASA Astrophysics Data System (ADS)

    Ranfagni, Anedio; Mugnai, Daniela; Petrucci, Andrea; Mignani, Roberto; Cacciari, Ilaria

    2018-06-01

    An anomalous effect in the near field of crossing microwave beams, which consists of an unexpected transfer of modulation from one beam to the other, has found a plausible interpretation within the framework of a locally broken Lorentz invariance. A theoretical approach of this kind deserves to be reconsidered also in the light of further experimental work, including a counter-check of the phenomenon.

  6. Variable generalized Chaplygin gas in a 5D cosmology

    NASA Astrophysics Data System (ADS)

    Salti, Mustafa; Aydogdu, Oktay; Yanar, Hilmi; Sogut, Kenan

    2018-03-01

    We construct the variable generalized Chaplygin gas (VGCG) defining a unified dark matter-energy scenario and investigate its essential cosmological properties in a universe governed by the Kaluza-Klein (KK) theory. A possible theoretical basis for the VGCG in the KK cosmology is argued. Also, we check the validity of thermodynamical laws and reimplement dynamics of tachyons in the KK universe.

  7. Principals' Perceptions of Teacher Attrition in Indiana Catholic Schools, Checking for Agreement with Ingersoll's Theoretical Framework on Teacher Attrition in Private Schools

    ERIC Educational Resources Information Center

    Brettnacher, Joseph A.

    2012-01-01

    Problem. Some Catholic schools report high teacher attrition rates. Understanding reasons for teacher attrition and responding to those issues is one of the many responsibilities of principals. However, it is unclear what Catholic principals understand about teacher attrition. Ingersoll's extensive research on teacher attrition has provided a…

  8. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision 
Diagrams; and Data-flow based Model Analysis.

  9. From RHIC to LHC: Lessons on the QGP

    NASA Astrophysics Data System (ADS)

    Heinz, Ulrich

    2011-10-01

    Recent data from heavy-ion collisions at RHIC and LHC, together with significant advances in theory, have allowed us to make significant first steps in proceeding from a qualitative understanding of high energy collision dynamics to a quantitative characterization of the transport properties of the hot and dense QCD matter created in these collisions. The almost perfectly liquid nature of the Quark-Gluon Plasma (QGP) created at RHIC has recently also been confirmed at the much higher LHC energies, and we can now constrain the specific QGP shear viscosity (η / s) QGP to within a factor of 2.5 of its conjectured lower quantum bound. Viscous hydrodynamics, coupled to a microscopic hadron cascade at late times, has proven to be an extremely successful and highly predictive model for the QGP evolution at RHIC and LHC. The experimental discovery of higher order harmonic flow coefficients and their theoretically predicted differential sensitivity to shear viscosity promises additional gains in precision by about a factor 5 in (η / s) QGP for the very near future. The observed modification of jets and suppression of high-pT hadrons confirms the picture of the QGP as a strongly coupled colored liquid, and recent LHC data yield strong constraints on parton energy loss models, putting significant strain on some theoretical approaches, tuned to RHIC data, that are based on leading-order perturbative QCD. Thermal photon radiation provides important cross-checks on the early stages of dynamical evolution models and constrains the initial QGP temperature, but the recently measured strong photon elliptic flow challenges our present understanding of photon emission rates in the hadronic phase. Recent progress in developing a complete theoretical model for all stages of the QGP fireball expansion, from strong fluctuating gluon fields at its beginning to final hadronic freeze-out, and remaining challenges will be discussed. 
Work supported by DOE (grants DE-SC0004286 and DE-SC0004104 (JET Collaboration)).

  10. On quality control procedures for solar radiation and meteorological measures, from subhourly to monthly average time periods

    NASA Astrophysics Data System (ADS)

    Espinar, B.; Blanc, P.; Wald, L.; Hoyer-Klick, C.; Schroedter-Homscheidt, M.; Wanderer, T.

    2012-04-01

    Meteorological data measured by ground stations are often a key element in the development and validation of methods exploiting satellite images. These data are considered as a reference against which satellite-derived estimates are compared. Long-term radiation and meteorological measurements are available from a large number of measuring stations. However, close examination of the data often reveals a lack of quality, often for extended periods of time. This lack of quality has, in many cases, led to the rejection of large amounts of available data. Data quality must therefore be checked before use in order to guarantee the inputs for the methods used in modelling, monitoring, forecasting, etc. To control their quality, data should be subjected to several conditions or tests; data not flagged by any test are then released as plausible. In this work, a bibliographical survey of quality control tests was performed for the common meteorological variables (ambient temperature, relative humidity and wind speed) and for the usual solar radiometric variables (the horizontal global and diffuse components of solar radiation and the beam normal component). The tests have been grouped according to variable and averaging period (sub-hourly, hourly, daily and monthly). They may be classified as follows: • Range checks: tests that verify values are within a specific range. There are two types of range checks, those based on extrema and those based on rare observations. • Step checks: tests aimed at detecting unrealistic jumps or stagnation in the time series. • Consistency checks: tests that verify the relationship between two or more time series. The gathered quality tests are applicable at all latitudes, as they have not been optimized regionally or seasonally, with the aim of keeping them generic.
    They have been applied to ground measurements in several geographic locations, which resulted in the detection of some control tests that are no longer adequate, for various reasons. After modification of some tests, based on our experience, an updated and classified set of quality control tests is now presented, reflecting advances in measurement technology. This set of quality tests allows radiation and meteorological data to be screened for plausibility before being used as inputs in theoretical or empirical methods for scientific research. The research leading to these results has partly received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 262892 (ENDORSE project).
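    The three test families above can be sketched in a few lines. The following Python toy plants three faults in short irradiance series and flags them; the numeric limits are illustrative, not the thresholds adopted in this work:

```python
import numpy as np

# hourly global and diffuse horizontal irradiance (W/m^2), with three planted faults
ghi = np.array([0., 50., 210., 400., 1800., 400., 400., 400., 180., 20.])
dif = np.array([0., 40., 120., 200.,  250., 210., 200., 500., 120., 15.])

# 1) range check: values must lie within physically possible limits
range_flag = (ghi < 0) | (ghi > 1500)

# 2) step check: unrealistic jump between consecutive samples
step_flag = np.zeros_like(range_flag)
step_flag[1:] = np.abs(np.diff(ghi)) > 800

# 3) consistency check: diffuse cannot exceed global (beyond a small tolerance)
cons_flag = dif > ghi * 1.05 + 5

flagged = range_flag | step_flag | cons_flag
print(np.flatnonzero(flagged))  # → [4 5 7]
```

    Sample 4 fails the range check (and, with sample 5, the step check, since the jump into and out of the spike exceeds the limit), while sample 7 fails the global/diffuse consistency check.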

  11. Model analysis of check dam impacts on long-term sediment and water budgets in southeast Arizona, USA

    USGS Publications Warehouse

    Norman, Laura M.; Niraula, Rewati

    2016-01-01

    The objective of this study was to evaluate the effect of check dam infrastructure on soil and water conservation at the catchment scale using the Soil and Water Assessment Tool (SWAT). This paired watershed study includes a watershed treated with over 2000 check dams and a Control watershed with none, in the West Turkey Creek watershed, Southeast Arizona, USA. SWAT was calibrated for streamflow using discharge documented during the summer of 2013 at the Control site. Model results highlight the need to eliminate lateral flow from SWAT models of aridland environments, the urgency of standardizing geospatial soils data, and the care with which modelers must document altered parameters when presenting findings. Performance was assessed using the percent bias (PBIAS), with values of ±2.34%. The calibrated model was then used to examine the impacts of check dams in the Treated watershed. Approximately 630 tons of sediment is estimated to be stored behind check dams in the Treated watershed over the 3-year simulation, improving water quality for fish habitat. A minimum precipitation event of 15 mm was necessary to instigate the detachment of soil, sediments, or rock from the study area, which occurred 2% of the time. The resulting watershed model is useful as a predictive framework and decision-support tool to consider long-term impacts of restoration and the potential for future restoration.
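    The PBIAS statistic used to assess calibration is a one-liner. A small sketch, using one common sign convention (positive values indicate that the model underestimates on average) and hypothetical discharge values, not the West Turkey Creek data:

```python
import numpy as np

def pbias(obs, sim):
    """Percent bias between observed and simulated series.

    Positive values indicate the model underestimates on average,
    negative values indicate overestimation (one common convention).
    """
    obs, sim = np.asarray(obs, dtype=float), np.asarray(sim, dtype=float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# hypothetical daily discharge values (illustrative only)
obs = [1.2, 0.8, 2.5, 3.1, 0.4]
sim = [1.1, 0.9, 2.4, 3.0, 0.5]
print(round(pbias(obs, sim), 2))  # → 1.25
```

    A PBIAS of ±2.34%, as reported above, means the simulated volume differs from the observed volume by only a few percent over the calibration period.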

  12. Universal Free School Breakfast: A Qualitative Model for Breakfast Behaviors

    PubMed Central

    Harvey-Golding, Louise; Donkin, Lynn Margaret; Blackledge, John; Defeyter, Margaret Anne

    2015-01-01

    In recent years, the provision of school breakfast has increased significantly in the UK. However, research examining the effectiveness of school breakfast is still within relative stages of infancy, and findings to date have been rather mixed. Moreover, previous evaluations of school breakfast schemes have been predominantly quantitative in their methodologies. Currently, there are few qualitative studies examining the subjective perceptions and experiences of stakeholders, and hence an absence of knowledge regarding the sociocultural impacts of school breakfast. The purpose of this study was to investigate the beliefs, views and attitudes, and breakfast consumption behaviors, among key stakeholders, served by a council-wide universal free school breakfast initiative, within the North West of England, UK. A sample of children, parents, and school staff were recruited from three primary schools, participating in the universal free school breakfast scheme, to partake in semi-structured interviews and small focus groups. A Grounded Theory analysis of the data collected identified a theoretical model of breakfast behaviors, underpinned by the subjective perceptions and experiences of these key stakeholders. The model comprises three domains relating to breakfast behaviors, and the internal and external factors that are perceived to influence breakfast behaviors, among children, parents, and school staff. Findings were validated using triangulation methods, member checks, and inter-rater reliability measures. In presenting this theoretically grounded model for breakfast behaviors, this paper provides a unique qualitative insight into the breakfast consumption behaviors and barriers to breakfast consumption, within a socioeconomically deprived community, participating in a universal free school breakfast intervention program. PMID:26125017

  13. New Results in Software Model Checking and Analysis

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  14. Psychometric evaluation of Persian Nomophobia Questionnaire: Differential item functioning and measurement invariance across gender.

    PubMed

    Lin, Chung-Ying; Griffiths, Mark D; Pakpour, Amir H

    2018-03-01

    Background and aims Research examining problematic mobile phone use has increased markedly over the past 5 years and has been related to "no mobile phone phobia" (so-called nomophobia). The 20-item Nomophobia Questionnaire (NMP-Q) is the only instrument that assesses nomophobia with an underlying theoretical structure and robust psychometric testing. This study aimed to confirm the construct validity of the Persian NMP-Q using Rasch and confirmatory factor analysis (CFA) models. Methods After ensuring the linguistic validity, Rasch models were used to examine the unidimensionality of each Persian NMP-Q factor among 3,216 Iranian adolescents and CFAs were used to confirm its four-factor structure. Differential item functioning (DIF) and multigroup CFA were used to examine whether males and females interpreted the NMP-Q similarly, including item content and NMP-Q structure. Results Each factor was unidimensional according to the Rasch findings, and the four-factor structure was supported by CFA. Two items did not quite fit the Rasch models (Item 14: "I would be nervous because I could not know if someone had tried to get a hold of me;" Item 9: "If I could not check my smartphone for a while, I would feel a desire to check it"). No DIF items were found across gender and measurement invariance was supported in multigroup CFA across gender. Conclusions Due to the satisfactory psychometric properties, it is concluded that the Persian NMP-Q can be used to assess nomophobia among adolescents. Moreover, NMP-Q users may compare its scores between genders in the knowledge that there are no score differences contributed by different understandings of NMP-Q items.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaks, D; Fletcher, R; Salamon, S

    Purpose: To develop an online framework that tracks a patient's plan from initial simulation to treatment and that helps automate elements of the physics plan checks usually performed in the record and verify (RV) system and treatment planning system. Methods: We have developed PlanTracker, an online plan tracking system that automatically imports new patient tasks and follows them through treatment planning, physics checks, therapy check, and chart rounds. A survey was designed to collect information about the amount of time spent by medical physicists on non-physics-related tasks. We then assessed these non-physics tasks for automation. Using these surveys, we directed our PlanTracker software development towards the automation of intra-plan physics review. We then conducted a systematic evaluation of PlanTracker's accuracy by generating test plans in the RV system software designed to mimic real plans, in order to test its efficacy in catching errors both real and theoretical. Results: PlanTracker has proven to be an effective improvement to the clinical workflow in a radiotherapy clinic. We present data indicating that roughly 1/3 of the physics plan check can be automated and the workflow optimized, and show the functionality of PlanTracker. When the full system is in clinical use we will present data on improvement of time use in comparison to survey data prior to PlanTracker implementation. Conclusion: We have developed a framework for plan tracking and automatic checks in radiation therapy. We anticipate using PlanTracker as a basis for further development in clinical/research software. We hope that by eliminating the simplest and most time-consuming checks, medical physicists may be able to spend their time on plan quality and other physics tasks rather than on arithmetic and logic checks. We see this development as part of a broader initiative to advance the clinical/research informatics infrastructure surrounding the radiotherapy clinic.
    This research project has been financially supported by Varian Medical Systems, Palo Alto, CA, through a Varian MRA.

  16. Alternative Interpretation of Low-Energy Nuclear Reaction Processes with Deuterated Metals Based on the Bose-Einstein Condensation Mechanism

    NASA Astrophysics Data System (ADS)

    Kim, Yeong E.; Passell, Thomas O.

    2006-02-01

    Recently, a generalization of the Bose-Einstein condensation (BEC) mechanism has been made to a ground-state mixture of two different species of positively charged bosons in harmonic traps. The theory has been used to describe (D + Li) reactions in the low energy nuclear reaction (LENR) processes in condensed matter and predicts that the (D + Li) reaction rates can be larger than (D + D) reaction rates by as much as a factor of ~50, implying that (D + Li) reactions may be occurring in addition to the (D + D) reactions. A survey of the existing data from LENR experiments is carried out to check the validity of the theoretical prediction. We conclude that there is compelling experimental evidence supporting the theoretical prediction. New experimental tests of the theoretical prediction are suggested.

  17. N-Sulfinylimine compounds, R-NSO: a chemistry family with strong temperament

    NASA Astrophysics Data System (ADS)

    Romano, R. M.; Della Védova, C. O.

    2000-04-01

    In this review, an update on the structural properties and theoretical studies of N-sulfinylimine compounds (R-NSO) is reported. These properties were deduced using several experimental techniques: gas-electron diffraction (GED), X-ray diffraction, 17O NMR, ultraviolet-visible absorption spectroscopy (UV-Vis), FTIR (including matrix studies of molecular randomisation) and Raman (including pre-resonant Raman spectra). The data are compared with those obtained by theoretical calculations. With these tools, the excited-state geometries of these kinds of compounds were calculated using time-dependent theory. The existence of a pre-resonant Raman effect in R-NSO compounds was reported recently. The configuration of R-NSO compounds was checked across this series, confirming the existence of only one, syn, configuration. This finding is corroborated by theoretical calculations. The method of preparation is also summarised.

  18. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

    approach for applying model checking to unbounded systems is to extract finite state models from them using conservative abstraction techniques. ...model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite

  19. COED Transactions, Vol. X, No. 6, June 1978. Concentric-Tube Heat Exchanger Analysis and Data Reduction.

    ERIC Educational Resources Information Center

    Marcovitz, Alan B., Ed.

    Four computer programs written in FORTRAN and BASIC develop theoretical predictions and data reduction for a junior-senior level heat exchanger experiment. Programs may be used at the terminal in the laboratory to check progress of the experiment or may be used in the batch mode for interpretation of final information for a formal report. Several…

  20. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  1. Clauser-Horne-Shimony-Holt versus three-party pseudo-telepathy: on the optimal number of samples in device-independent quantum private query

    NASA Astrophysics Data System (ADS)

    Basak, Jyotirmoy; Maitra, Subhamoy

    2018-04-01

    In device-independent (DI) paradigm, the trustful assumptions over the devices are removed and CHSH test is performed to check the functionality of the devices toward certifying the security of the protocol. The existing DI protocols consider infinite number of samples from theoretical point of view, though this is not practically implementable. For finite sample analysis of the existing DI protocols, we may also consider strategies for checking device independence other than the CHSH test. In this direction, here we present a comparative analysis between CHSH and three-party Pseudo-telepathy game for the quantum private query protocol in DI paradigm that appeared in Maitra et al. (Phys Rev A 95:042344, 2017) very recently.

  2. Numerical simulation of fiber interaction in short-fiber injection-molded composite using different cavity geometries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thi, Thanh Binh Nguyen, E-mail: nttbinh@kit.ac.jp; Yokoyama, Atsushi, E-mail: yokoyama@kit.ac.jp; Hamanaka, Senji

    The theoretical fiber-interaction model for calculating the fiber orientation in injection-molded short fiber/thermoplastic composite parts was proposed. The proposed model included fiber dynamics simulation in order to obtain an equation for the global interaction coefficient and accurately estimate fiber interactions at all orientation states. The steps to derive the equation for this coefficient in a short fiber suspension, as a function of the fiber aspect ratio, volume fraction and general shear rate, are delineated. Simultaneously, the high-resolution 3D X-ray computed tomography system XVA-160α was used to observe the fiber distribution of short-glass-fiber-reinforced polyamide specimens using different cavity geometries. The fiber orientation tensor components are then calculated. Experimental orientation measurements of short-glass-fiber-reinforced polyamide are used to check the ability of the present theory to predict orientation. The experiments and predictions show quantitative agreement and confirm the basic understanding of fiber orientation in injection-molded composites.

  3. Evolution of Nano-structured Quasicrystals from Amorphous alloys

    NASA Astrophysics Data System (ADS)

    Xing, L. Q.; Kelton, K. F.

    2002-03-01

    Ta has a significant effect on the precipitation of quasicrystals in (Zr_1-xTa_x)_64Cu_18Ni_8Al_10 amorphous alloys. The amorphous alloy made without Ta forms precipitates of the tetragonal Zr_2Cu primary phase upon annealing. The addition of a small amount of Ta ( ~ 3 at%) to the alloy initiates the precipitation of primary icosahedral quasicrystal phases. Moreover, as the Ta concentration increases, the size of the precipitates decreases dramatically. To study the effect of Ta in this alloy system and to understand the mechanism for the precipitation of nano-structured quasicrystals, we have investigated the crystallization characteristics of alloys made with different Ta concentrations using DSC, checked the structures of the annealed samples with TEM and X-ray diffraction, and analyzed the kinetics of the crystallization processes. The kinetic parameters and the measured crystal size distribution will be compared with theoretical predictions from the conventional nucleation and growth model and from a new model for nucleation that couples the long-range diffusion flux with the interfacial attachment processes.

  4. Analytical and experimental investigation on transmission loss of clamped double panels: implication of boundary effects.

    PubMed

    Xin, F X; Lu, T J

    2009-03-01

    The air-borne sound insulation performance of a rectangular double-panel partition, clamp mounted on an infinite acoustic rigid baffle, is investigated both analytically and experimentally and compared with that of a simply supported one. With the clamped (or simply supported) boundary accounted for by the method of modal functions, a double series solution for the sound transmission loss (STL) of the structure is obtained by employing the weighted residual (Galerkin) method. Experimental measurements with aluminum double-panel partitions having an air cavity are subsequently carried out to validate the theoretical model for both types of boundary condition, and good overall agreement is achieved. A consistency check of the two different models (based separately on clamped modal functions and simply supported modal functions) is performed by extending the panel dimensions to infinity, where no boundaries exist. Significant discrepancies between the two boundary conditions are demonstrated in terms of STL versus frequency plots as well as panel deflection mode shapes.
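    For orientation, the classical field-incidence mass law for a single panel gives a standard first approximation of STL (it is not the clamped double-panel model of the paper):

```python
import math

def mass_law_stl(surface_density, frequency):
    """Field-incidence mass law for a single panel:
    TL ~= 20*log10(m*f) - 47 dB, with surface density m in kg/m^2 and
    frequency f in Hz. A textbook approximation, not the paper's
    clamped double-panel model."""
    return 20.0 * math.log10(surface_density * frequency) - 47.0

# A 10 kg/m^2 panel at 1 kHz transmits with roughly 33 dB loss;
# doubling the mass or the frequency adds about 6 dB.
tl = mass_law_stl(10.0, 1000.0)
```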

  5. Probing relativistic effects in the central engine of AGN

    NASA Astrophysics Data System (ADS)

    Sanfrutos, M.; Miniutti, G.

    2017-03-01

    Active Galactic Nuclei (AGN) are perfect laboratories for checking General Relativity (GR) effects, using eclipses by Broad Line Region (BLR) clouds to probe the innermost regions of the accretion disk. A new relativistic X-ray spectral model for X-ray eclipses is introduced. First we present the different observables involved in X-ray eclipses, including the size of the X-ray-emitting region, the emissivity index, the cloud's column density, ionization, size and velocity, the black hole spin, and the system's inclination. Then we highlight some theoretical predictions for the observables using XMM-Newton simulations, finding that absorption varies with photon energy range and is maximal when the approaching side of the X-ray-emitting region is covered. Finally, we fit our relativistic model to actual XMM-Newton data from a long observation of the NLS1 galaxy SWIFT J2127.4+5654, and compare our results with a previous work in which we addressed the BLR cloud eclipse from a non-relativistic perspective.

  6. Numerical simulation of fiber interaction in short-fiber injection-molded composite using different cavity geometries

    NASA Astrophysics Data System (ADS)

    Thi, Thanh Binh Nguyen; Yokoyama, Atsushi; Hamanaka, Senji; Yamashita, Katsuhisa; Nonomura, Chisato

    2016-03-01

    A theoretical fiber-interaction model for calculating the fiber orientation in injection-molded short-fiber/thermoplastic composite parts was proposed. The model incorporates fiber dynamics simulation in order to obtain an equation for the global interaction coefficient and accurately estimate fiber interactions at all orientation states. The steps to derive the equation for this coefficient in a short-fiber suspension, as a function of the fiber aspect ratio, volume fraction, and general shear rate, are delineated. In parallel, the high-resolution 3D X-ray computed tomography system XVA-160α was used to observe the fiber distribution in short-glass-fiber-reinforced polyamide specimens molded with different cavity geometries, from which the fiber orientation tensor components were calculated. Experimental orientation measurements of the short-glass-fiber-reinforced polyamide were used to check the ability of the present theory to predict orientation. The experiments and predictions show quantitative agreement and confirm the basic understanding of fiber orientation in injection-molded composites.
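    The orientation tensor mentioned above can be illustrated with a minimal planar sketch (the paper works in 3D; angles and fiber counts here are hypothetical):

```python
import math

def orientation_tensor(angles):
    """Second-order fiber orientation tensor a_ij = <p_i p_j> for
    planar fibers, each with direction p = (cos t, sin t).
    Fully aligned fibers give a11 -> 1; a uniform in-plane
    distribution gives a11 = a22 = 0.5 and a12 = 0."""
    n = len(angles)
    a11 = sum(math.cos(t) ** 2 for t in angles) / n
    a22 = sum(math.sin(t) ** 2 for t in angles) / n
    a12 = sum(math.cos(t) * math.sin(t) for t in angles) / n
    return (a11, a12, a22)

# Perfectly aligned along x: a11 = 1.
aligned = orientation_tensor([0.0] * 100)
# Evenly distributed over [0, pi): a11 ~ a22 ~ 0.5.
rand = orientation_tensor([math.pi * i / 100 for i in range(100)])
```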

  7. Experimental and theoretical investigation of radiation and dynamics properties in laser-produced carbon plasmas

    NASA Astrophysics Data System (ADS)

    Min, Qi; Su, Maogen; Wang, Bo; Cao, Shiquan; Sun, Duixiong; Dong, Chenzhong

    2018-05-01

    The radiation and dynamics properties of a laser-produced carbon plasma in vacuum were studied experimentally with the aid of a spatio-temporally resolved emission spectroscopy technique. In addition, a radiation hydrodynamics model based on the fluid dynamic equations and the radiative transfer equation was presented, and the charge states were calculated within a time-dependent collisional-radiative model. The detailed temporal and spatial evolution of plasma parameters, such as velocity, electron temperature, charge state distribution, energy level population, and various atomic processes, has been analyzed. At the same time, the effects of different atomic processes on the charge state distribution were examined. Finally, the validity of assuming local thermodynamic equilibrium (LTE) in the carbon plasma expansion was checked, and the results clearly indicate that the assumption was valid only at the initial (<80 ns) stage of plasma expansion. At longer delay times, it was not applicable near the plasma boundary because of the sharp drop in plasma temperature and electron density.
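    An LTE validity check of this kind is often phrased via the McWhirter criterion, a common rule of thumb for collision-dominated plasmas (the numbers below are illustrative, not values from this experiment):

```python
def mcwhirter_min_density(t_kelvin, de_ev):
    """McWhirter criterion: minimum electron density (cm^-3) for a
    collision-dominated (LTE-plausible) plasma,
    n_e >= 1.6e12 * sqrt(T) * dE**3,
    with T in kelvin and the largest transition gap dE in eV."""
    return 1.6e12 * t_kelvin ** 0.5 * de_ev ** 3

# Illustrative numbers only: a ~2 eV transition in a ~1 eV (~11600 K)
# plasma, checked against a hypothetical measured density.
n_min = mcwhirter_min_density(11600.0, 2.0)
ne_measured = 1e17            # hypothetical value, cm^-3
lte_plausible = ne_measured >= n_min
```

    The criterion is necessary but not sufficient, which is consistent with LTE failing late in the expansion as density and temperature drop.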

  8. Temporal Precedence Checking for Switched Models and its Application to a Parallel Landing Protocol

    NASA Technical Reports Server (NTRS)

    Duggirala, Parasara Sridhar; Wang, Le; Mitra, Sayan; Viswanathan, Mahesh; Munoz, Cesar A.

    2014-01-01

    This paper presents an algorithm for checking temporal precedence properties of nonlinear switched systems. This class of properties subsumes bounded safety and captures requirements about visiting a sequence of predicates within given time intervals. The algorithm handles nonlinear predicates that arise from dynamics-based predictions used in alerting protocols for state-of-the-art transportation systems. It is sound and complete for nonlinear switched systems that robustly satisfy the given property. The algorithm is implemented in the Compare Execute Check Engine (C2E2) using validated simulations. As a case study, a simplified model of an alerting system for closely spaced parallel runways is considered. The proposed approach is applied to this model to check safety properties of the alerting logic under different operating conditions, such as initial velocities, bank angles, aircraft longitudinal separation, and runway separation.

  9. The method of a joint intraday security check system based on cloud computing

    NASA Astrophysics Data System (ADS)

    Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng

    2017-01-01

    The intraday security check is the core application in the dispatching control system. Existing security check calculations use only the dispatch center's local model and data as the functional margin. This paper introduces the design and implementation of an all-grid intraday joint security check system based on cloud computing. To reduce the effect of subarea bad data on the all-grid security check, a new power flow algorithm based on comparison and adjustment with the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.

  10. 76 FR 35344 - Airworthiness Directives; Costruzioni Aeronautiche Tecnam srl Model P2006T Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    ... retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on the nose landing... specified products. The MCAI states: During Landing Gear retraction/extension ground checks performed on the... airworthiness information (MCAI) states: During Landing Gear retraction/extension ground checks performed on the...

  11. Instantaneous charge state of uranium projectiles in fully ionized plasmas from energy loss experiments

    NASA Astrophysics Data System (ADS)

    Morales, Roberto; Barriga-Carrasco, Manuel D.; Casas, David

    2017-04-01

    The instantaneous charge state of uranium ions traveling through a fully ionized hydrogen plasma has been theoretically studied and compared with one of the first energy loss experiments in plasmas, carried out at GSI-Darmstadt by Hoffmann et al. in the 1990s. For this purpose, two different methods to estimate the instantaneous charge state of the projectile have been employed: (1) rate equations using ionization and recombination cross sections and (2) equilibrium charge state formulas for plasmas. The equilibrium charge state has also been obtained from these ionization and recombination cross sections and compared with the former equilibrium formulas. The equilibrium charge state of projectiles in plasmas is not always reached; it depends mainly on the projectile velocity and the plasma density. Therefore, a non-equilibrium, or instantaneous, description of the projectile charge is necessary. The charge state of projectile ions cannot be measured except after exiting the target, and experimental data remain very scarce. Thus, the validity of our charge state model is checked by comparing its theoretical predictions with an energy loss experiment, as the energy loss has a generally quadratic dependence on the projectile charge state. The dielectric formalism has been used to calculate the plasma stopping power, including the Brandt-Kitagawa (BK) model to describe the charge distribution of the projectile. In this charge distribution, the instantaneous number of bound electrons, instead of the equilibrium number, has been taken into account. Comparing our theoretical predictions with experiments shows the necessity of including the instantaneous charge state and the BK charge distribution for a correct energy loss estimation. The results also show that the initial charge state strongly influences the estimated energy loss of the uranium ions.
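    Method (1), rate equations for the charge-state populations, can be sketched as a simple Euler integration (the rates below are placeholders, not the ionization/recombination cross sections used in the paper):

```python
def evolve_charge_states(pops, ion_rate, rec_rate, dt, steps):
    """Euler integration of the charge-state rate equations
    dN_q/dt = I_(q-1) N_(q-1) - (I_q + R_q) N_q + R_(q+1) N_(q+1),
    where I_q / R_q are the ionization / recombination rates out of
    state q. All rates here are illustrative placeholders."""
    n = len(pops)
    pops = list(pops)
    for _ in range(steps):
        new = []
        for q in range(n):
            gain = (ion_rate[q - 1] * pops[q - 1] if q > 0 else 0.0) \
                 + (rec_rate[q + 1] * pops[q + 1] if q < n - 1 else 0.0)
            loss = (ion_rate[q] + rec_rate[q]) * pops[q]
            new.append(pops[q] + dt * (gain - loss))
        pops = new
    return pops

# Three charge states, ionization dominating: the population shifts
# upward while the total stays conserved.
final = evolve_charge_states(
    [1.0, 0.0, 0.0],
    ion_rate=[2.0, 2.0, 0.0],   # no ionization out of the top state
    rec_rate=[0.0, 0.5, 0.5],   # no recombination out of the bottom state
    dt=0.01, steps=2000)
```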

  12. Design and performance investigation of LDPC-coded upstream transmission systems in IM/DD OFDM-PONs

    NASA Astrophysics Data System (ADS)

    Gong, Xiaoxue; Guo, Lei; Wu, Jingjing; Ning, Zhaolong

    2016-12-01

    In Intensity-Modulation Direct-Detection (IM/DD) Orthogonal Frequency Division Multiplexing Passive Optical Networks (OFDM-PONs), aside from the Subcarrier-to-Subcarrier Intermixing Interference (SSII) induced by square-law detection, the use of the same laser frequency for data transmission from different Optical Network Units (ONUs) results in ONU-to-ONU Beating Interference (OOBI) at the receiver. To mitigate these interferences, we design a Low-Density Parity Check (LDPC)-coded and spectrum-efficient upstream transmission system. A theoretical channel model is also derived in order to analyze the detrimental factors influencing system performance. Simulation results demonstrate that the receiver sensitivity is improved by 3.4 dB and 2.5 dB under QPSK and 8QAM, respectively, after 100 km of Standard Single-Mode Fiber (SSMF) transmission. Furthermore, the spectrum efficiency can be improved by about 50%.
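    The parity-check principle underlying LDPC coding can be sketched with a toy parity-check matrix (hypothetical and far smaller than any code used in such a system):

```python
def ldpc_syndrome(H, codeword):
    """Syndrome of a candidate codeword under parity-check matrix H
    (all arithmetic mod 2). An all-zero syndrome means every parity
    check is satisfied; any nonzero entry flags an error."""
    return [sum(h * c for h, c in zip(row, codeword)) % 2 for row in H]

# A tiny 3x6 parity-check matrix; each row is one check equation.
H = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]
valid = ldpc_syndrome(H, [1, 0, 1, 1, 1, 0])    # satisfies all checks
flipped = ldpc_syndrome(H, [1, 0, 1, 1, 1, 1])  # one flipped bit detected
```

    Real LDPC decoders use iterative belief propagation over such sparse checks rather than a single syndrome test.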

  13. Numerical and Experimental Case Study of Blasting Works Effect

    NASA Astrophysics Data System (ADS)

    Papán, Daniel; Valašková, Veronika; Drusa, Marian

    2016-10-01

    This article introduces a theoretical and experimental case study of dynamic monitoring of the geological environment above a highway tunnel under construction. The monitored structure in this case is a very important water supply pipeline, which crosses the tunnel and is made of steel tubes with a diameter of 800 mm. The basic dynamic parameters were monitored during blasting works, compared with FEM (Finite Element Method) calculations, and checked against the Slovak standard limits. A FEM model calibrated on the experimental measurement data was created and used to obtain more realistic results in further predictions and in time and space extrapolations. This case study was requested by the general contractor and by the owner of the water pipeline, and it served as a public safety evaluation of risks during tunnel construction.

  14. Dynamical properties of water in living cells

    NASA Astrophysics Data System (ADS)

    Piazza, Irina; Cupane, Antonio; Barbier, Emmanuel L.; Rome, Claire; Collomb, Nora; Ollivier, Jacques; Gonzalez, Miguel A.; Natali, Francesca

    2018-02-01

    With the aim of studying the effect of water dynamics on the properties of biological systems, in this paper we present a quasi-elastic neutron scattering study of three different types of living cells, differing in both their morphological and tumor properties. The measured scattering signal, which essentially originates from the hydrogen atoms present in the investigated systems, has been analyzed with a global fitting strategy based on an optimized theoretical model that considers various classes of hydrogen atoms and allows disentangling diffusive and rotational motions. The approach has been carefully validated by checking the reliability of the calculated parameters and their 99% confidence intervals. We demonstrate that quasi-elastic neutron scattering is a suitable experimental technique to characterize the dynamics of intracellular water on the angstrom/picosecond space/time scale and to investigate the effect of water dynamics on cellular biodiversity.

  15. Inspection of aeronautical mechanical parts with a pan-tilt-zoom camera: an approach guided by the computer-aided design model

    NASA Astrophysics Data System (ADS)

    Viana, Ilisio; Orteu, Jean-José; Cornille, Nicolas; Bugarin, Florian

    2015-11-01

    We focus on the quality control of mechanical parts in an aeronautical context, using a single pan-tilt-zoom (PTZ) camera and a computer-aided design (CAD) model of the mechanical part. We use the CAD model to create a theoretical image of the element to be checked, which is then matched with the sensed image of the element to be inspected using a graph-theory-based approach. The matching is carried out in two stages. First, the two images are used to create two attributed graphs representing the primitives (ellipses and line segments) in the images. In the second stage, the graphs are matched using a similarity function built from the primitive parameters. The similarity scores of the matching are injected into the edges of a bipartite graph. A best-match-search procedure in the bipartite graph guarantees the uniqueness of the match solution. The method achieves promising performance in tests with synthetic data including missing elements, displaced elements, size changes, and combinations of these cases. The results open good prospects for using the method with realistic data.
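    The best-match search over similarity scores can be sketched as an exhaustive one-to-one assignment (the scores are hypothetical; a production system would use a polynomial-time method such as the Hungarian algorithm):

```python
from itertools import permutations

def best_one_to_one_match(sim):
    """Exhaustive search for the one-to-one assignment of model
    primitives to image primitives maximizing total similarity.
    sim[i][j] is the similarity of model primitive i to image
    primitive j (square matrix). Fine for small graphs only."""
    n = len(sim)
    best_perm, best_score = None, float("-inf")
    for perm in permutations(range(n)):
        score = sum(sim[i][perm[i]] for i in range(n))
        if score > best_score:
            best_perm, best_score = perm, score
    return best_perm, best_score

# Hypothetical similarity scores between 3 CAD and 3 image primitives:
# the diagonal pairing wins because each primitive most resembles its
# own counterpart.
sim = [[0.9, 0.1, 0.2],
       [0.2, 0.8, 0.3],
       [0.1, 0.3, 0.7]]
match, score = best_one_to_one_match(sim)
```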

  16. An Algorithm for Interactive Modeling of Space-Transportation Engine Simulations: A Constraint Satisfaction Approach

    NASA Technical Reports Server (NTRS)

    Mitra, Debasis; Thomas, Ajai; Hemminger, Joseph; Sakowski, Barbara

    2001-01-01

    In this research we have developed an algorithm for constraint processing that utilizes relational algebraic operators. Van Beek and others have previously investigated this type of constraint processing from within a relational algebraic framework, producing some unique results. Apart from providing new theoretical angles, this approach also offers the opportunity to use existing efficient implementations of relational database management systems as the underlying data structures for any relevant algorithm. Our algorithm enhances that framework. The algorithm is quite general in its current form. Weak heuristics (like forward checking) developed within the constraint-satisfaction problem (CSP) area can also be easily plugged into this algorithm for further efficiency gains. The algorithm as developed here is targeted toward a component-oriented modeling problem that we are currently working on, namely, the problem of interactive modeling for batch-simulation of engineering systems (IMBSES). However, it could be adopted for many other CSPs as well. The research addresses the algorithm and many aspects of the IMBSES problem that we are currently handling.
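    The forward-checking heuristic mentioned above can be sketched as follows (a generic CSP illustration, not the relational-algebraic algorithm of the paper):

```python
def forward_check(domains, var, value, constraints):
    """Forward checking: after tentatively assigning `value` to `var`,
    prune values inconsistent with that assignment from the domains of
    the other variables. constraints[(x, y)] is a predicate over a pair
    of values. Returns the pruned domains, or None if a domain empties
    (a dead end, so the search should backtrack immediately)."""
    pruned = {v: set(d) for v, d in domains.items()}
    pruned[var] = {value}
    for other in pruned:
        if other == var or (var, other) not in constraints:
            continue
        pred = constraints[(var, other)]
        pruned[other] = {w for w in pruned[other] if pred(value, w)}
        if not pruned[other]:
            return None
    return pruned

# Toy graph-coloring step: X and Y must differ, so assigning X = "red"
# prunes "red" from Y's domain; if Y had only "red", we detect the
# dead end before ever branching on Y.
constraints = {("X", "Y"): lambda a, b: a != b}
ok = forward_check({"X": {"red", "green"}, "Y": {"red", "green"}},
                   "X", "red", constraints)
dead = forward_check({"X": {"red"}, "Y": {"red"}}, "X", "red", constraints)
```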

  17. Sediment trapping efficiency of adjustable check dam in laboratory and field experiment

    NASA Astrophysics Data System (ADS)

    Wang, Chiang; Chen, Su-Chin; Lu, Sheng-Jui

    2014-05-01

    Check dams are constructed in mountain areas to block debris flows, but they fill up after several events and lose their trapping function. For this reason, the main facility in our research is an adjustable steel slit check dam, which has the advantages of fast construction and easy removal or adjustment: transverse beams can be removed to drain sediment off and maintain channel continuity. We constructed an adjustable steel slit check dam on the Landow torrent at the Huisun Experimental Forest Station as the prototype for comparison with a laboratory model. In the laboratory experiments, Froude number similarity was used to design the dam model. The comparisons focused on the modes of sediment trapping and removal, the sediment discharge, and the trapping rate of the slit check dam. Different ways of removing the transverse beams produced different modes of sediment removal and differences in removal rate and particle size distribution. The sediment discharge of the check dam with beams is about 40%-80% of that of the check dam without beams; moreover, the spacing of the beams is a considerable factor in the sediment discharge. In the field experiment, time-lapse photography was used to record the adjustable steel slit check dam on the Landow torrent. Typhoon Soulik produced 600 mm of rainfall in eight hours and induced a debris flow in the Landow torrent. The time-lapse images demonstrated that, after several sediment transport events, the adjustable steel slit check dam was buried by debris flow. The laboratory and field experiments show that: (1) the adjustable check dam can trap boulders, stop woody debris flows, and flush out fine sediment to supply the needs of the downstream river; (2) the sediment-trapping efficiency of the adjustable check dam with transverse beams is significantly improved; and (3) the check dam without transverse beams can release sediment and maintain ecosystem continuity.

  18. The dopamine D2/D3 receptor agonist quinpirole increases checking-like behaviour in an operant observing response task with uncertain reinforcement: a novel possible model of OCD.

    PubMed

    Eagle, Dawn M; Noschang, Cristie; d'Angelo, Laure-Sophie Camilla; Noble, Christie A; Day, Jacob O; Dongelmans, Marie Louise; Theobald, David E; Mar, Adam C; Urcelay, Gonzalo P; Morein-Zamir, Sharon; Robbins, Trevor W

    2014-05-01

    Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (a dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (the observing response task (ORT)) to further examine the cognitive processes underpinning checking behaviour and to clarify how and why checking develops. We investigated (i) how quinpirole increases checking, (ii) the dependence of these effects on D2/3 receptor function (following treatment with the D2/3 receptor antagonist sulpiride) and (iii) the effects of reward uncertainty. In the ORT, rats pressed an 'observing' lever for information about the location of an 'active' lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of the long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle- and quinpirole-treated rats (VEH and QNP, respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for the treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial.
Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballard, Sarah; Charbonneau, David; Fressin, Francois

    We present the validation and characterization of Kepler-61b: a 2.15 R⊕ planet orbiting near the inner edge of the habitable zone of a low-mass star. Our characterization of the host star Kepler-61 is based upon a comparison with a set of spectroscopically similar stars with directly measured radii and temperatures. We apply a stellar prior drawn from the weighted mean of these properties, in tandem with the Kepler photometry, to infer a planetary radius for Kepler-61b of 2.15 ± 0.13 R⊕ and an equilibrium temperature of 273 ± 13 K (given its period of 59.87756 ± 0.00020 days and assuming a planetary albedo of 0.3). The technique of leveraging the physical properties of nearby "proxy" stars allows for an independent check on stellar characterization via the traditional measurements with stellar spectra and evolutionary models. In this case, such a check had implications for the putative habitability of Kepler-61b: the planet is 10% warmer and larger than inferred from K-band spectral characterization. From the Kepler photometry, we estimate a stellar rotation period of 36 days, which implies a stellar age of >1 Gyr. We summarize the evidence for the planetary nature of the Kepler-61 transit signal, which we conclude is 30,000 times more likely to be due to a planet than a blend scenario. Finally, we discuss possible compositions for Kepler-61b with a comparison to theoretical models as well as to known exoplanets with similar radii and dynamically measured masses.
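    An equilibrium temperature of the kind quoted above follows from the standard relation T_eq = T_* sqrt(R_*/2a) (1 - A)^(1/4); a sketch with illustrative (not fitted) stellar parameters:

```python
def equilibrium_temperature(t_star, r_star_over_a, albedo):
    """Planetary equilibrium temperature for a circular orbit:
    T_eq = T_star * sqrt(R_star / (2 * a)) * (1 - A)**0.25,
    where r_star_over_a is the stellar radius divided by the orbital
    distance and A is the Bond albedo."""
    return t_star * (r_star_over_a / 2.0) ** 0.5 * (1.0 - albedo) ** 0.25

# Illustrative inputs only (not the fitted Kepler-61 values): a cool
# dwarf with T_star = 4000 K and R_star/a = 0.01 at albedo 0.3 lands
# near the water freezing point.
t_eq = equilibrium_temperature(4000.0, 0.01, 0.3)
```

    The scaling shows why a 10% revision of the stellar temperature and radius shifts the inferred planetary temperature by a comparable fraction.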

  20. An experimental method to verify soil conservation by check dams on the Loess Plateau, China.

    PubMed

    Xu, X Z; Zhang, H W; Wang, G Q; Chen, S C; Dang, W Q

    2009-12-01

    A successful experiment with a physical model requires the necessary similarity conditions to be met. This study presents an experimental method with a semi-scale physical model, used to monitor and verify soil conservation by check dams in a small watershed on the Loess Plateau of China. During the experiments, the model-prototype ratio of geomorphic variables was kept constant under each rainfall event. Consequently, the experimental data can be used to verify soil erosion processes in the field and to predict soil loss in a model watershed with check dams, and hence the amount of soil loss in a catchment. The study sets out four similarity criteria: watershed geometry, grain size and bare land, Froude number (Fr) for the rainfall event, and soil erosion in the downscaled models. The efficacy of the proposed method was confirmed using these criteria in two different downscaled model experiments. The B-Model, a large-scale model, simulates the watershed prototype. The two small-scale models, D(a) and D(b), have different erosion rates but are the same size; they simulate the hydraulic processes in the B-Model. The experimental results show that when the soil loss in the small-scale models was converted by multiplying by the soil-loss scale number, it was very close to that of the B-Model. Thus, with a semi-scale physical model, experiments can verify and predict soil loss in a small watershed with a check dam system on the Loess Plateau, China.
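    Froude-number similarity, one of the similarity criteria used here, fixes how velocities scale between model and prototype; a minimal sketch:

```python
def froude_scaled_velocity(v_prototype, length_ratio):
    """Froude-number similarity: keeping Fr = v / sqrt(g * L) equal in
    model and prototype means velocity scales with the square root of
    the model/prototype length ratio."""
    return v_prototype * length_ratio ** 0.5

# A 1:4 scale model (length_ratio = 0.25) of a 2 m/s prototype flow
# must be run at 1 m/s to match the Froude number.
v_model = froude_scaled_velocity(2.0, 0.25)
```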

  1. 75 FR 27406 - Airworthiness Directives; Bombardier, Inc. Model BD-100-1A10 (Challenger 300) Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-17

    ... BD- 100 Time Limits/Maintenance Checks. The actions described in this service information are... Challenger 300 BD-100 Time Limits/Maintenance Checks. (1) For the new tasks identified in Bombardier TR 5-2... Requirements,'' in Part 2 of Chapter 5 of Bombardier Challenger 300 BD-100 Time Limits/ Maintenance Checks...

  2. 75 FR 66655 - Airworthiness Directives; PILATUS Aircraft Ltd. Model PC-7 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... December 3, 2010 (the effective date of this AD), check the airplane maintenance records to determine if... of the airplane. Do this check following paragraph 3.A. of Pilatus Aircraft Ltd. PC-7 Service... maintenance records check required in paragraph (f)(1) of this AD or it is unclear whether or not the left and...

  3. 77 FR 20520 - Airworthiness Directives; Bombardier, Inc. Model BD-100-1A10 (Challenger 300) Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-05

    ... Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual. For this task, the initial compliance..., of Part 2, of the Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual, the general.../Maintenance Checks Manual, provided that the relevant information in the general revision is identical to that...

  4. "I share, therefore I am": personality traits, life satisfaction, and Facebook check-ins.

    PubMed

    Wang, Shaojung Sharon

    2013-12-01

    This study explored whether agreeableness, extraversion, and openness influence self-disclosure behavior, which in turn affects the intensity of checking in on Facebook. A complete path from extraversion to Facebook check-in through self-disclosure and sharing was found. The indirect effect from sharing to check-in intensity through life satisfaction was particularly salient. The central component of check-in is for users to selectively disclose a specific location, which has implications for demonstrating their social lives, lifestyles, and tastes, enabling a selective and optimized self-image. Implications for the hyperpersonal model and the warranting principle are discussed.

  5. Thermal Theory of Combustion and Explosion. 3; Theory of Normal Flame Propagation

    NASA Technical Reports Server (NTRS)

    Semenov, N. N.

    1942-01-01

    The technical memorandum covers experimental data on flame propagation, the velocity of flame propagation, analysis of the old theoretical views of flame propagation, confirmation of the theory for simple reactions (theory of combustion of explosive substances and in particular nitroglycol), and check of the theory by example of a chain oxidizing reaction (theory of flame propagation in carbon monoxide, air and carbon monoxide - oxygen mixtures).

  6. Chaos and Order in Weakly Coupled Systems of Nonlinear Oscillators

    NASA Astrophysics Data System (ADS)

    Bruhn, B.

    1987-01-01

    In this paper we consider perturbations of two-degree-of-freedom Hamiltonian systems which contain periodic and heteroclinic orbits. The Melnikov-Keener condition is used to prove the existence of horseshoes in the dynamics. The same condition is applied to prove a high degree of order in the motion of the swinging Atwood's machine. For some selected parameter values the theoretical predictions are checked by numerical calculations.

  7. Efficient Translation of LTL Formulae into Buchi Automata

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Lerda, Flavio

    2001-01-01

    Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL) and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL-to-Büchi-automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit-state model checker under development at the NASA Ames Research Center.

  8. Concrete Model Checking with Abstract Matching and Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Pelanek, Radek; Visser, Willem

    2005-01-01

    We propose an abstraction-based model checking method which relies on refinement of an under-approximation of the feasible behaviors of the system under analysis. The method preserves errors to safety properties, since all analyzed behaviors are feasible by definition. The method does not require an abstract transition relation to be generated; instead, it executes the concrete transitions while storing abstract versions of the concrete states, as specified by a set of abstraction predicates. For each explored transition, the method checks, with the help of a theorem prover, whether there is any loss of precision introduced by the abstraction. The results of these checks are used to decide termination or to refine the abstraction by generating new abstraction predicates. If the (possibly infinite) concrete system under analysis has a finite bisimulation quotient, then the method is guaranteed to eventually explore an equivalent finite bisimilar structure. We illustrate the application of the approach to checking concurrent programs. We also show how a lightweight variant can be used for efficient software testing.
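    The core idea of executing concrete transitions while matching on abstract states can be sketched as follows (a toy illustration; the theorem-prover precision checks and predicate refinement of the actual method are omitted):

```python
def explore_with_abstract_matching(initial, successors, predicates):
    """Depth-first exploration that executes concrete transitions but
    matches on abstract states: a state is expanded only if the truth
    vector of the abstraction predicates has not been seen before.
    A sketch of the under-approximation idea only."""
    def abstract(state):
        return tuple(pred(state) for pred in predicates)

    seen, stack, visited = set(), [initial], []
    while stack:
        state = stack.pop()
        signature = abstract(state)
        if signature in seen:
            continue                # abstractly matched: do not re-expand
        seen.add(signature)
        visited.append(state)
        stack.extend(successors(state))
    return visited

# Counter 0 -> 1 -> 2 -> ... abstracted by parity and "x >= 2": the
# unbounded concrete chain collapses to one concrete representative
# per abstract class, so the exploration terminates.
visited = explore_with_abstract_matching(
    0,
    lambda x: [x + 1],
    [lambda x: x % 2 == 0, lambda x: x >= 2])
```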

  9. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    NASA Technical Reports Server (NTRS)

    Gamble, Ed

    2012-01-01

    Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  10. Logic Model Checking of Unintended Acceleration Claims in the 2005 Toyota Camry Electronic Throttle Control System

    NASA Technical Reports Server (NTRS)

    Gamble, Ed; Holzmann, Gerard

    2011-01-01

    Part of the US DOT investigation of Toyota SUA involved analysis of the throttle control software. JPL LaRS applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses

  11. Model selection and assessment for multi-species occupancy models

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.

    2016-01-01

    While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.

  12. 76 FR 53348 - Airworthiness Directives; BAE SYSTEMS (Operations) Limited Model BAe 146 Airplanes and Model Avro...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... Maintenance Manual (AMM) includes chapters 05-10 ``Time Limits'', 05-15 ``Critical Design Configuration... 05, ``Time Limits/Maintenance Checks,'' of BAe 146 Series/AVRO 146-RJ Series Aircraft Maintenance... Chapter 05, ``Time Limits/ Maintenance Checks,'' of the BAE SYSTEMS (Operations) Limited BAe 146 Series...

  13. 78 FR 40063 - Airworthiness Directives; Erickson Air-Crane Incorporated Helicopters (Type Certificate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-03

    ... Sikorsky Model S-64E helicopters. The AD requires repetitive checks of the Blade Inspection Method (BIM... and check procedures for BIM blades installed on the Model S-64F helicopters. Several blade spars with a crack emanating from corrosion pits and other damage have been found because of BIM pressure...

  14. A Simulated Annealing based Optimization Algorithm for Automatic Variogram Model Fitting

    NASA Astrophysics Data System (ADS)

    Soltani-Mohammadi, Saeed; Safa, Mohammad

    2016-09-01

    Fitting a theoretical model to an experimental variogram is an important issue in geostatistical studies because if the variogram model parameters are tainted with uncertainty, the latter will spread into the results of estimations and simulations. Although the most popular fitting method is fitting by eye, in some cases the automatic fitting method, which combines geostatistical principles with optimization techniques, is used to: 1) provide a basic model to improve fitting by eye, 2) fit a model to a large number of experimental variograms in a short time, and 3) incorporate the variogram-related uncertainty in the model fitting. Effort has been made in this paper to improve the quality of the fitted model by improving the popular objective function (weighted least squares) in the automatic fitting. Also, since the variogram model function (γ) and the number of structures (m) also affect the model quality, a program has been provided in MATLAB that can present optimum nested variogram models using the simulated annealing method. Finally, to select the most desirable model from among the single- and multi-structured fitted models, the cross-validation method has been used, and the best model has been introduced to the user as the output. In order to check the capability of the proposed objective function and the procedure, three case studies are presented.
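    A minimal sketch of the annealing loop is below. It is an assumed illustration, not the authors' MATLAB program: the spherical model, the weighted-least-squares objective with pair counts as weights, and all tuning constants (`t0`, `cool`, the ±10% perturbation) are choices made for the example.

    ```python
    import math
    import random

    def spherical(h, c0, c, a):
        """Spherical variogram model: nugget c0, partial sill c, range a."""
        if h >= a:
            return c0 + c
        r = h / a
        return c0 + c * (1.5 * r - 0.5 * r ** 3)

    def wls(params, lags, gammas, npairs):
        """Weighted least squares between model and experimental variogram,
        weighting each lag by its number of data pairs."""
        c0, c, a = params
        return sum(n * (g - spherical(h, c0, c, a)) ** 2
                   for h, g, n in zip(lags, gammas, npairs))

    def fit_sa(lags, gammas, npairs, start, steps=5000, t0=1.0, cool=0.999):
        """Simulated annealing over (c0, c, a); returns the best parameters seen."""
        random.seed(0)                       # reproducible for the sketch
        cur = list(start)
        cur_obj = wls(cur, lags, gammas, npairs)
        best, best_obj = cur[:], cur_obj
        t = t0
        for _ in range(steps):
            # multiplicative +/-10% perturbation, kept strictly positive
            cand = [max(1e-6, p * (1 + random.uniform(-0.1, 0.1))) for p in cur]
            obj = wls(cand, lags, gammas, npairs)
            # accept improvements always, worse moves with Boltzmann probability
            if obj < cur_obj or random.random() < math.exp((cur_obj - obj) / t):
                cur, cur_obj = cand, obj
                if obj < best_obj:
                    best, best_obj = cand[:], obj
            t *= cool
        return best, best_obj

    # synthetic experimental variogram from known parameters (noise-free here)
    true = (0.1, 0.9, 30.0)
    lags = [5.0 * k for k in range(1, 13)]
    gammas = [spherical(h, *true) for h in lags]
    npairs = [100] * len(lags)
    start = (0.5, 0.5, 10.0)
    best, best_obj = fit_sa(lags, gammas, npairs, start)
    ```

    Because the best-so-far solution is tracked separately from the current one, the returned objective can never be worse than the starting objective, even though the annealing chain itself sometimes accepts uphill moves.
    
    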

  15. Slicing AADL Specifications for Model Checking

    NASA Technical Reports Server (NTRS)

    Odenbrett, Maximilian; Nguyen, Viet Yen; Noll, Thomas

    2010-01-01

    To combat the state-space explosion problem in model checking larger systems, abstraction techniques can be employed. Here, methods that operate on the system specification before constructing its state space are preferable to those that try to minimize the resulting transition system as they generally reduce peak memory requirements. We sketch a slicing algorithm for system specifications written in (a variant of) the Architecture Analysis and Design Language (AADL). Given a specification and a property to be verified, it automatically removes those parts of the specification that are irrelevant for model checking the property, thus reducing the size of the corresponding transition system. The applicability and effectiveness of our approach is demonstrated by analyzing the state-space reduction for an example, employing a translator from AADL to Promela, the input language of the SPIN model checker.
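    The core of property-directed slicing can be sketched as a backward dependency closure: keep only the elements the property transitively depends on, and drop the rest before building the state space. The sketch below is illustrative only; the element names are invented and the flat dependency map is far simpler than AADL's component and connection structure.

    ```python
    def slice_spec(deps, property_vars):
        """Backward closure: keep only elements the property transitively reads.
        deps maps each element to the elements it depends on."""
        keep, stack = set(), list(property_vars)
        while stack:
            v = stack.pop()
            if v not in keep:
                keep.add(v)
                stack.extend(deps.get(v, ()))
        return keep

    deps = {
        "alarm":  ["sensor", "threshold"],
        "sensor": ["adc"],
        "logger": ["alarm", "clock"],   # nothing the property reads depends on it
        "clock":  [],
    }
    # property to verify mentions only "alarm"; "logger" and "clock" are sliced away
    relevant = slice_spec(deps, ["alarm"])
    ```

    In the toy example the transition system only needs to track four of the six elements, which is exactly the kind of reduction that shrinks the Promela model handed to SPIN.
    
    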

  16. Synthetic biology between challenges and risks: suggestions for a model of governance and a regulatory framework, based on fundamental rights.

    PubMed

    Colussi, Ilaria Anna

    2013-01-01

    This paper deals with the emerging field of synthetic biology, its challenges and risks, and tries to design a model for the governance and regulation of the field. The model is called "prudent vigilance" (inspired by the report about synthetic biology drafted by the U.S. Presidential Commission on Bioethics, 2010), and it entails (a) an ongoing and periodically revised process of assessment and management of all the risks and concerns, and (b) the adoption of policies - taken through "hard law" and "soft law" sources - that are based on the principle of proportionality (between benefits and risks) and on a reasonable balancing of the different interests and rights at stake, and are oriented by a constitutional frame, which is represented by the protection of fundamental human rights emerging in the field of synthetic biology (right to life, right to health, dignity, freedom of scientific research, right to environment). After the theoretical explanation of the model, its operability is "checked" by considering its application with reference to one specific risk brought up by synthetic biology - the biosecurity risk, i.e. the risk of bioterrorism.

  17. Experimental studies by complementary terahertz techniques and semi-classical calculations of N2-broadening coefficients of CH3(35)Cl

    NASA Astrophysics Data System (ADS)

    Guinet, M.; Rohart, F.; Buldyreva, J.; Gupta, V.; Eliet, S.; Motiyenko, R. A.; Margulès, L.; Cuisset, A.; Hindle, F.; Mouret, G.

    2012-07-01

    Room-temperature N2-broadening coefficients of methyl chloride rotational lines are measured over a large interval of quantum numbers (6≤J≤50, 0≤K≤18) by a submillimeter frequency-multiplication chain (J≤31) and a terahertz photomixing continuous-wave spectrometer (J≥31). In order to check the accuracy of both techniques, the measurements of identical lines are compared for J=31. The pressure broadening coefficients are deduced from line fits using mainly a Voigt profile model. The excellent signal-to-noise ratio of the frequency-multiplication scheme highlights some speed dependence effect on the line shape. Theoretical values of these coefficients are calculated by a semi-classical approach with exact trajectories. An intermolecular potential including atom-atom interactions is used for the first time. It is shown that, contrary to the previous theoretical predictions, the contributions of short-range forces are important for all values of the rotational quantum numbers. Additional testing of modifications required in the semi-classical formalism for a correct application of the cumulant expansion is also performed. It is stated that the use of the cumulant average on the rotational states of the perturbing molecule leads, for high J and small K values, to slightly higher line-broadening coefficients, as expected for the relatively strong interacting CH3Cl-N2 system. The excellent agreement between the theoretical and the experimental results ensures the reliability of these data.

  18. 75 FR 42585 - Airworthiness Directives; Empresa Brasileira de Aeronautica S.A. (EMBRAER) Model ERJ 170 and ERJ...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... (Low Stage Bleed Check Valve) specified in Section 1 of the EMBRAER 170 Maintenance Review Board Report...-11-02-002 (Low Stage Bleed Check Valve), specified in Section 1 of the EMBRAER 170 Maintenance Review... Task 36-11-02-002 (Low Stage Bleed Check Valve) specified in Section 1 of the EMBRAER 170 Maintenance...

  19. 75 FR 9816 - Airworthiness Directives; Empresa Brasileira de Aeronautica S.A. (EMBRAER) Model ERJ 170 and ERJ...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... maintenance plan to include repetitive functional tests of the low-stage check valve. For certain other... program to include maintenance Task Number 36-11-02- 002 (Low Stage Bleed Check Valve), specified in... Check Valve) in Section 1 of the EMBRAER 170 Maintenance Review Board Report MRB-1621. Issued in Renton...

  20. 75 FR 39811 - Airworthiness Directives; The Boeing Company Model 777 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-13

    ... Service Bulletin 777-57A0064, dated March 26, 2009, it is not necessary to perform the torque check on the... instructions in Boeing Alert Service Bulletin 777-57A0064, dated March 26, 2009, a torque check is redundant... are less than those for the torque check. Boeing notes that it plans to issue a new revision to this...

  1. Checking Dimensionality in Item Response Models with Principal Component Analysis on Standardized Residuals

    ERIC Educational Resources Information Center

    Chou, Yeh-Tai; Wang, Wen-Chung

    2010-01-01

    Dimensionality is an important assumption in item response theory (IRT). Principal component analysis on standardized residuals has been used to check dimensionality, especially under the family of Rasch models. It has been suggested that an eigenvalue greater than 1.5 for the first eigenvalue signifies a violation of unidimensionality when there…
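    The check described above can be sketched directly: standardize the residuals, form their correlation matrix, and compare the first eigenvalue against the 1.5 rule of thumb. This is an assumed illustration on synthetic noise residuals (approximating the unidimensional null case), not the authors' procedure; power iteration stands in for a full eigendecomposition.

    ```python
    import random

    def standardize(col):
        """Z-standardize a column (population standard deviation)."""
        m = sum(col) / len(col)
        sd = (sum((x - m) ** 2 for x in col) / len(col)) ** 0.5
        return [(x - m) / sd for x in col]

    def first_eigenvalue(mat, iters=500):
        """Dominant eigenvalue of a symmetric PSD matrix by power iteration."""
        n = len(mat)
        v = [1.0] * n
        lam = 1.0
        for _ in range(iters):
            w = [sum(mat[i][j] * v[j] for j in range(n)) for i in range(n)]
            lam = max(abs(x) for x in w)
            v = [x / lam for x in w]
        return lam

    random.seed(1)
    n_persons, n_items = 200, 6
    # synthetic residuals: pure noise, i.e. roughly the unidimensional null case
    resid = [[random.gauss(0, 1) for _ in range(n_items)] for _ in range(n_persons)]
    cols = [standardize([row[j] for row in resid]) for j in range(n_items)]
    corr = [[sum(a * b for a, b in zip(cols[i], cols[j])) / n_persons
             for j in range(n_items)] for i in range(n_items)]
    lam1 = first_eigenvalue(corr)
    flag_multidimensional = lam1 > 1.5   # the rule-of-thumb cutoff discussed above
    ```

    For any correlation matrix the largest eigenvalue is at least 1 (the trace equals the number of items), so values well above 1.5 indicate residual structure beyond the fitted dimension.
    
    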

  2. Stress analysis of 27% scale model of AH-64 main rotor hub

    NASA Technical Reports Server (NTRS)

    Hodges, R. V.

    1985-01-01

    Stress analysis of an AH-64 27% scale model rotor hub was performed. Component loads and stresses were calculated based upon blade root loads and motions. The static and fatigue analysis indicates positive margins of safety in all components checked. Using the format developed here, the hub can be stress checked for future application.

  3. Comparison of three controllers applied to helicopter vibration

    NASA Technical Reports Server (NTRS)

    Leyland, Jane A.

    1992-01-01

    A comparison was made of the applicability and suitability of the deterministic controller, the cautious controller, and the dual controller for the reduction of helicopter vibration by using higher harmonic blade pitch control. A randomly generated linear plant model was assumed and the performance index was defined to be a quadratic output metric of this linear plant. A computer code, designed to check out and evaluate these controllers, was implemented and used to accomplish this comparison. The effects of random measurement noise, the initial estimate of the plant matrix, and the plant matrix propagation rate were determined for each of the controllers. With few exceptions, the deterministic controller yielded the greatest vibration reduction (as characterized by the quadratic output metric) and operated with the greatest reliability. Theoretical limitations of these controllers were defined and appropriate candidate alternative methods, including one method particularly suitable to the cockpit, were identified.

  4. The relationship between familial resemblance and sexual attraction: an update on Westermarck, Freud, and the incest taboo.

    PubMed

    Lieberman, Debra; Fessler, Daniel M T; Smith, Adam

    2011-09-01

    Foundational principles of evolutionary theory predict that inbreeding avoidance mechanisms should exist in all species--including humans--in which close genetic relatives interact during periods of sexual maturity. Voluminous empirical evidence, derived from diverse taxa, supports this prediction. Despite such results, Fraley and Marks claim to provide evidence that humans are sexually attracted to close genetic relatives and that such attraction is held in check by cultural taboos. Here, the authors show that Fraley and Marks, in their search for an alternate explanation of inbreeding avoidance, misapply theoretical constructs from evolutionary biology and social psychology, leading to an incorrect interpretation of their results. The authors propose that Fraley and Marks's central findings can be explained in ways consistent with existing evolutionary models of inbreeding avoidance. The authors conclude that appropriate application of relevant theory and stringent experimental design can generate fruitful investigations into sexual attraction, inbreeding avoidance, and incest taboos.

  5. Plasma-induced magnetic responses during nonlinear dynamics of magnetic islands due to resonant magnetic perturbations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nishimura, Seiya, E-mail: n-seiya@kobe-kosen.ac.jp

    Resonant magnetic perturbations (RMPs) produce magnetic islands in toroidal plasmas. Self-healing (annihilation) of RMP-induced magnetic islands has been observed in helical systems, where a possible mechanism of the self-healing is shielding of RMP penetration by plasma flows, which is well known in tokamaks. Thus, the fundamental physics of RMP shielding is commonly investigated in both tokamaks and helical systems. In order to check this mechanism, detailed information on magnetic island phases is necessary. In experiments, measurement of radial magnetic responses is relatively easy. In this study, based on a theoretical model of rotating magnetic islands, the behavior of radial magnetic fields during the self-healing is investigated. It is confirmed that flips of radial magnetic fields are typically observed during the self-healing. Such behavior of radial magnetic responses is also observed in LHD experiments.

  6. Model Checking a Self-Stabilizing Distributed Clock Synchronization Protocol for Arbitrary Digraphs

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2011-01-01

    This report presents the mechanical verification of a self-stabilizing distributed clock synchronization protocol for arbitrary digraphs in the absence of faults. This protocol does not rely on assumptions about the initial state of the system, other than the presence of at least one node, and no central clock or a centrally generated signal, pulse, or message is used. The system under study is an arbitrary, non-partitioned digraph ranging from fully connected to 1-connected networks of nodes while allowing for differences in the network elements. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the node is that the interactions with other nodes are restricted to defined links and interfaces. This protocol deterministically converges within a time bound that is a linear function of the self-stabilization period.
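    As a heavily simplified illustration of the setting only (this is NOT the report's protocol, which is mechanically verified, anonymous, and bounds convergence time linearly in the self-stabilization period), the toy model below has each node on an arbitrary digraph synchronously adopt the largest clock value heard from its in-neighbors and then tick; on a strongly connected digraph all clocks agree after at most diameter-many rounds, regardless of the initial state.

    ```python
    def synchronize(edges, clocks, rounds):
        """Toy rounds: every node adopts the max clock among itself and its
        in-neighbors, then ticks. edges maps a node to the nodes it sends to."""
        clocks = dict(clocks)
        for _ in range(rounds):
            incoming = {n: [] for n in clocks}
            for src, dsts in edges.items():
                for d in dsts:
                    incoming[d].append(clocks[src])
            clocks = {n: max([clocks[n]] + incoming[n]) + 1 for n in clocks}
        return clocks

    # 1-connected ring digraph with arbitrary (unsynchronized) initial clocks
    ring = {"a": ["b"], "b": ["c"], "c": ["a"]}
    final = synchronize(ring, {"a": 0, "b": 5, "c": 2}, rounds=3)
    ```

    No central clock or pulse appears anywhere: agreement emerges purely from the restricted link interactions, which is the property the report verifies for the real protocol.
    
    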

  7. Coil motion effects in watt balances: a theoretical check

    NASA Astrophysics Data System (ADS)

    Li, Shisong; Schlamminger, Stephan; Haddad, Darine; Seifert, Frank; Chao, Leon; Pratt, Jon R.

    2016-04-01

    A watt balance is a precision apparatus for the measurement of the Planck constant that has been proposed as a primary method for realizing the unit of mass in a revised International System of Units. In contrast to an ampere balance, which was historically used to realize the unit of current in terms of the kilogram, the watt balance relates electrical and mechanical units through a virtual power measurement and has far greater precision. However, because the virtual power measurement requires the execution of a prescribed motion of a coil in a fixed magnetic field, systematic errors introduced by horizontal and rotational deviations of the coil from its prescribed path will compromise the accuracy. We model these potential errors using an analysis that accounts for the fringing field in the magnet, creating a framework for assessing the impact of this class of errors on the uncertainty of watt balance results.
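    The virtual power measurement can be made concrete with a short numeric check (all values below are illustrative placeholders, not parameters of any actual apparatus): the weighing mode balances m·g = (BL)·I and the velocity mode measures U = (BL)·v, so the hard-to-characterize geometric factor BL cancels and U·I = m·g·v.

    ```python
    # illustrative values only; no real apparatus parameters are used
    g = 9.80665    # m/s^2, standard gravity
    BL = 400.0     # T*m, flux integral of the coil (assumed)
    m = 1.0        # kg, test mass
    v = 0.002      # m/s, coil velocity in the moving (velocity) mode

    I = m * g / BL           # weighing mode: current balancing the weight, m*g = BL*I
    U = BL * v               # velocity mode: induced voltage, U = BL*v

    virtual_power = U * I          # electrical side (never dissipated simultaneously)
    mechanical_power = m * g * v   # mechanical side
    ```

    The deviations the paper analyzes break exactly this cancellation: if the coil strays horizontally or rotates, the effective BL in the two modes is no longer identical and a systematic error enters the equality.
    
    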

  8. Predictability in cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one needs also a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists. It is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighbourhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighbourhood three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case.
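    The quantity driving the stationary distribution, the number of zero-one borders, is easy to compute, and the shape of the exponential formula can be sketched directly. The decay constant `beta` below is a placeholder, not a value from the paper, and the ring topology stands in for the 1-D neighbourhood-three case.

    ```python
    import math
    from itertools import product

    def borders(config):
        """Number of zero-one (01 or 10) adjacencies on a circular configuration."""
        n = len(config)
        return sum(config[i] != config[(i + 1) % n] for i in range(n))

    def stationary_weights(n, beta=1.0):
        """Exponential-in-borders weights over all length-n binary configurations,
        normalized to a probability distribution."""
        configs = list(product((0, 1), repeat=n))
        w = [math.exp(-beta * borders(c)) for c in configs]
        z = sum(w)
        return {c: wi / z for c, wi in zip(configs, w)}

    dist = stationary_weights(4)
    # homogeneous configurations (all zeros, all ones) have zero borders,
    # hence maximal stationary weight under this family of formulas
    ```

    Checking an empirical distribution against this family amounts to estimating configuration frequencies from long simulation runs and comparing them to `stationary_weights`, which is essentially the experiment the paper performs for the synchronous automaton.
    
    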

  9. Flow measurements in a water tunnel using a holocinematographic velocimeter

    NASA Technical Reports Server (NTRS)

    Weinstein, Leonard M.; Beeler, George B.

    1987-01-01

    Dual-view holographic movies were used to examine complex flows with full three-space and time resolution. This approach, which tracks the movement of small tracer particles in water, is termed holocinematographic velocimetry (HCV). A small prototype of a new water tunnel was used to demonstrate proof-of-concept for the HCV. After utilizing a conventional flow visualization apparatus with a laser light sheet to illuminate tracer particles to evaluate flow quality of the prototype tunnel, a simplified version of the HCV was employed to demonstrate the capabilities of the approach. Results indicate that a full-scale version of the water tunnel and a high performance version of the HCV should be able to check theoretical and numerical modeling of complex flows and examine the mechanisms operative in turbulent and vortex flow control concepts, providing an entirely unique instrument capable, for the first time, of simultaneous three-space and time measurements in turbulent flow.

  10. Pay attention to your manipulation checks! Reward impact on cardiac reactivity is moderated by task context.

    PubMed

    Richter, Michael

    2010-05-01

    Two experiments assessed the moderating impact of task context on the relationship between reward and cardiovascular response. Randomly assigned to the cells of a 2 (task context: reward vs. demand) x 2 (reward value: low vs. high) between-persons design, participants performed either a memory task with an unclear performance standard (Experiment 1) or a visual scanning task with an unfixed performance standard (Experiment 2). Before performing the task--where participants could earn either a low or a high reward--participants responded to questions about either task reward or task demand. In accordance with the theoretical predictions derived from Wright's (1996) integrative model, reactivity of pre-ejection period increased with reward value if participants had rated aspects of task reward before performing the task. If they had rated task demand, pre-ejection period did not differ as a function of reward. Copyright 2010 Elsevier B.V. All rights reserved.

  11. Do alcohol compliance checks decrease underage sales at neighboring establishments?

    PubMed

    Erickson, Darin J; Smolenski, Derek J; Toomey, Traci L; Carlin, Bradley P; Wagenaar, Alexander C

    2013-11-01

    Underage alcohol compliance checks conducted by law enforcement agencies can reduce the likelihood of illegal alcohol sales at checked alcohol establishments, and theory suggests that an alcohol establishment that is checked may warn nearby establishments that compliance checks are being conducted in the area. In this study, we examined whether the effects of compliance checks diffuse to neighboring establishments. We used data from the Complying with the Minimum Drinking Age trial, which included more than 2,000 compliance checks conducted at more than 900 alcohol establishments. The primary outcome was the sale of alcohol to a pseudo-underage buyer without the need for age identification. A multilevel logistic regression was used to model the effect of a compliance check at each establishment as well as the effect of compliance checks at neighboring establishments within 500 m (stratified into four equal-radius concentric rings), after buyer, license, establishment, and community-level variables were controlled for. We observed a decrease in the likelihood of establishments selling alcohol to underage youth after they had been checked by law enforcement, but these effects quickly decayed over time. Establishments that had a close neighbor (within 125 m) checked in the past 90 days were also less likely to sell alcohol to young-appearing buyers. The spatial effect of compliance checks on other establishments decayed rapidly with increasing distance. Results confirm the hypothesis that the effects of police compliance checks do spill over to neighboring establishments. These findings have implications for the development of an optimal schedule of police compliance checks.
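    The spatial stratification used above, four equal-radius concentric rings within 500 m, can be sketched as a simple binning function (a minimal illustration; the function name and return convention are invented, and the real analysis feeds these strata into a multilevel logistic regression):

    ```python
    def ring_index(distance_m, max_m=500.0, n_rings=4):
        """Assign a neighboring establishment to one of four equal-radius
        concentric rings (0 = within 125 m); None if beyond max_m."""
        if distance_m >= max_m:
            return None
        return int(distance_m // (max_m / n_rings))
    ```

    Under this scheme the "close neighbor" effect reported in the study corresponds to ring 0, the 0-125 m band, with the effect decaying across rings 1-3.
    
    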

  12. Observational analysis on inflammatory reaction to talc pleurodesis: Small and large animal model series review

    PubMed Central

    Vannucci, Jacopo; Bellezza, Guido; Matricardi, Alberto; Moretti, Giulia; Bufalari, Antonello; Cagini, Lucio; Puma, Francesco; Daddi, Niccolò

    2018-01-01

    Talc pleurodesis has been associated with pleuropulmonary damage, particularly long-term damage due to its inert nature. The present model series review aimed to assess the safety of this procedure by examining inflammatory stimulus, biocompatibility and tissue reaction following talc pleurodesis. Talc slurry was performed in rabbits: 200 mg/kg checked at postoperative day 14 (five models), 200 mg/kg checked at postoperative day 28 (five models), 40 mg/kg checked at postoperative day 14 (five models), and 40 mg/kg checked at postoperative day 28 (five models). Talc poudrage was performed in pigs: 55 mg/kg checked at postoperative day 60 (18 models). Tissue inspection and data collection followed the surgical pathology approach currently used in clinical practice. As this was an observational study, no statistical analysis was performed. In the rabbit model (Oryctolagus cuniculus), the extent of adhesions ranged between 0 and 30%, and between 0 and 10%, following 14 and 28 days, respectively. No intraparenchymal granuloma was observed, whereas pleural granulomas were extensively encountered following both talc dosages, with more evidence of visceral pleura granulomas following 200 mg/kg compared with 40 mg/kg. Severe florid inflammation was observed in 2/10 cases following 40 mg/kg. Parathymic and pericardium granulomas and mediastinal lymphadenopathy were evidenced at 28 days. At 60 days, findings ranging from rare adhesions to extended pleurodesis were observed in the pig model (Sus scrofa domesticus). Pleural granulomas were ubiquitous on the visceral and parietal pleurae. Severe spotted inflammation among the adhesions was recorded in 15/18 pigs. Intraparenchymal granulomas were observed in 9/18 lungs. Talc produced unpredictable pleurodesis in both animal models, with enduring pleural inflammation whether it was performed via slurry or poudrage. Furthermore, talc appeared to have triggered extended pleural damage, intraparenchymal nodules (porcine poudrage) and mediastinal migration (rabbit slurry). PMID:29403549

  13. THE COSMIC RAY EQUATOR FROM DATA OF THE SECOND SOVIET EARTH SATELLITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savenko, I.A.; Shavrin, P.I.; Nesterov, V.Ye.

    1962-11-01

    Determination of the geographical position of the line of minimum intensity of primary cosmic radiation (the cosmic ray equator) makes it possible to study the structure of the geomagnetic field and to check theoretical and empirical approximations to this field. The minima of cosmic radiation intensity were determined by the second Soviet spaceship for 22 latitude curves obtained from various crossings in the region of the geographical equator. (W.D.M.)

  14. Field Tests of Optical Instruments

    DTIC Science & Technology

    1947-03-15

    Bureau of Ordnance, Washington, D.C. Results of a large-scale field test of optical instruments are described. The tests were instituted to check the correctness of theoretical considerations and of laboratory tests which have been used in the selection and design of such instruments. Field conditions approximated as far as possible those

  15. Developing an approach for teaching and learning about Lewis structures

    NASA Astrophysics Data System (ADS)

    Kaufmann, Ilana; Hamza, Karim M.; Rundgren, Carl-Johan; Eriksson, Lars

    2017-08-01

    This study explores first-year university students' reasoning as they learn to draw Lewis structures. We also present a theoretical account of the formal procedure commonly taught for drawing these structures. Students' discussions during problem-solving activities were video recorded and detailed analyses of the discussions were made through the use of practical epistemology analysis (PEA). Our results show that the formal procedure was central for drawing Lewis structures, but its use varied depending on situational aspects. Commonly, the use of individual steps of the formal procedure was contingent on experiences of chemical structures, and other information such as the characteristics of the problem given. The analysis revealed a number of patterns in how students constructed, checked and modified the structure in relation to the formal procedure and the situational aspects. We suggest that explicitly teaching the formal procedure as a process of constructing, checking and modifying might be helpful for students learning to draw Lewis structures. By doing so, the students may learn to check the accuracy of the generated structure not only in relation to the octet rule and formal charge, but also to other experiences that are not explicitly included in the formal procedure.

  16. Analyzing Phylogenetic Trees with Timed and Probabilistic Model Checking: The Lactose Persistence Case Study.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2014-12-01

    Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.
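    As a minimal illustration of the basic idea (a qualitative reachability check only; the paper's point is precisely that timed and probabilistic extensions go beyond this), a phylogenetic tree can be treated as a transition system and interrogated with a CTL-style EF operator. All names below are invented, not taken from the case study.

    ```python
    def ef_holds(tree, node, pred):
        """CTL-style EF: pred holds at `node` or at some descendant."""
        return pred(node) or any(ef_holds(tree, c, pred) for c in tree.get(node, ()))

    # invented tree: an ancestral node and two derived populations
    tree = {"root": ["anc1", "anc2"], "anc2": ["pop1", "pop2"]}
    persistent = {"pop2"}                      # population carrying the phenotype
    found = ef_holds(tree, "root", lambda n: n in persistent)
    ```

    Adding time and probability replaces the boolean `pred` and `any` with clock constraints and transition probabilities, which is what dedicated probabilistic model checkers provide.
    
    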

  17. Analyzing phylogenetic trees with timed and probabilistic model checking: the lactose persistence case study.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2014-10-23

    Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.

  18. Model Checking for Verification of Interactive Health IT Systems

    PubMed Central

    Butler, Keith A.; Mercer, Eric; Bahrami, Ali; Tao, Cui

    2015-01-01

    Rigorous methods for design and verification of health IT systems have lagged far behind their proliferation. The inherent technical complexity of healthcare, combined with the added complexity of health information technology, makes their resulting behavior unpredictable and introduces serious risk. We propose to mitigate this risk by formalizing the relationship between HIT and the conceptual work that increasingly typifies modern care. We introduce new techniques for modeling clinical workflows and the conceptual products within them that allow established, powerful model checking technology to be applied to interactive health IT systems. The new capability can evaluate the workflows of a new HIT system performed by clinicians and computers to improve safety and reliability. We demonstrate the method on a patient contact system, showing that model checking is effective for interactive systems and that much of it can be automated. PMID:26958166

  19. A theoretical study of alpha star populations in loaded nuclear emulsions

    USGS Publications Warehouse

    Senftle, F.E.; Farley, T.A.; Stieff, L.R.

    1954-01-01

    This theoretical study of the alpha star populations in loaded emulsions was undertaken in an effort to find a quantitative method for the analysis of less than microgram amounts of thorium in the presence of larger amounts of uranium. Analytical expressions for each type of star from each of the significantly contributing members of the uranium and thorium series as well as summation formulas for the whole series have been computed. The analysis for thorium may be made by determining the abundance of five-branched stars in a loaded nuclear emulsion and comparing of observed and predicted star populations. The comparison may also be used to check the half-lives of several members of the uranium and thorium series. ?? 1954.

  20. Novice to expert practice via postprofessional athletic training education: a grounded theory.

    PubMed

    Neibert, Peter J

    2009-01-01

    To discover the theoretic constructs that confirm, disconfirm, or extend the principles and their applications appropriate for National Athletic Trainers' Association (NATA)-accredited postprofessional athletic training education programs. Interviews at the 2003 NATA Annual Meeting & Clinical Symposia. Qualitative study using grounded theory procedures. Thirteen interviews were conducted with postprofessional graduates. Participants were purposefully selected based on theoretic sampling and availability. The transcribed interviews were analyzed using open coding, axial coding, and selective coding procedures. Member checks, reflective journaling, and triangulation were used to ensure trustworthiness. The participants' comments confirmed and extended the current principles of postprofessional athletic training education programs and offered additional suggestions for more effective practical applications. The emergence of this central category of novice to expert practice is a paramount finding. The tightly woven fabric of the 10 processes, when interlaced with one another, provides a strong tapestry supporting novice to expert practice via postprofessional athletic training education. The emergence of this theoretic position pushes postprofessional graduate athletic training education forward to the future for further investigation into the theoretic constructs of novice to expert practice.

  1. Verification and Planning Based on Coinductive Logic Programming

    NASA Technical Reports Server (NTRS)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations, and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution.
    Implementations of co-SLD resolution as well as preliminary implementations of the planning and verification applications have been developed [4]. Co-LP and Model Checking: The vast majority of properties to be verified can be classified into safety properties and liveness properties. It is well known within model checking that safety properties can be verified by reachability analysis, i.e., if a counter-example to the property exists, it can be finitely determined by enumerating all the reachable states of the Kripke structure.
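The co-SLD success rule, where a call succeeds on unifying with one of its ancestor calls, amounts to accepting cycles in proofs over rational infinite terms. A rough sketch of this greatest-fixed-point behaviour (in Python rather than a logic-programming system, with a hypothetical cyclic-stream encoding):

```python
# Sketch of the co-SLD success rule: a call that coincides with an ancestor
# call succeeds coinductively. Here we "verify" that every element of a
# cyclic (rational infinite) stream is even -- a property over an infinite
# structure that an inductive traversal could never finish checking.
# The stream encoding is a hypothetical illustration, not the cited system.

def all_even(stream, node, ancestors=()):
    if node in ancestors:            # coinductive hypothesis: ancestor call
        return True                  # => greatest-fixed-point success
    value, nxt = stream[node]
    if value % 2 != 0:
        return False
    return all_even(stream, nxt, ancestors + (node,))

# Cyclic stream a -> b -> c -> a, i.e. the infinite list [0, 2, 4, 0, 2, 4, ...]
stream = {"a": (0, "b"), "b": (2, "c"), "c": (4, "a")}
print(all_even(stream, "a"))   # True
```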

  2. Bayesian model checking: A comparison of tests

    NASA Astrophysics Data System (ADS)

    Lucy, L. B.

    2018-06-01

    Two procedures for checking Bayesian models are compared using a simple test problem based on the local Hubble expansion. Over four orders of magnitude, p-values derived from a global goodness-of-fit criterion for posterior probability density functions agree closely with posterior predictive p-values. The former can therefore serve as an effective proxy for the difficult-to-calculate posterior predictive p-values.
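A posterior predictive p-value, one of the two procedures compared, can be sketched on a toy Gaussian mean-estimation problem (an assumed setup for illustration, not the paper's local-Hubble-expansion test problem):

```python
# Posterior predictive check for a unit-variance Gaussian with unknown mean:
# compare the observed discrepancy with discrepancies of data replicated
# from the posterior. Toy problem, illustration only.
import random

random.seed(1)
data = [random.gauss(0.3, 1.0) for _ in range(50)]
n = len(data)

def discrepancy(xs, mu):
    """Chi-square-like discrepancy for unit-variance data."""
    return sum((x - mu) ** 2 for x in xs)

exceed, draws = 0, 2000
for _ in range(draws):
    # Posterior of the mean under a flat prior: Normal(mean(data), 1/n).
    mu = random.gauss(sum(data) / n, 1.0 / n ** 0.5)
    rep = [random.gauss(mu, 1.0) for _ in range(n)]
    if discrepancy(rep, mu) >= discrepancy(data, mu):
        exceed += 1

p_value = exceed / draws
print(round(p_value, 2))   # a moderate value: no evidence of misfit
```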

  3. Secure open cloud in data transmission using reference pattern and identity with enhanced remote privacy checking

    NASA Astrophysics Data System (ADS)

    Vijay Singh, Ran; Agilandeeswari, L.

    2017-11-01

    To handle the large amounts of client data held in an open cloud, many security issues need to be addressed. A client's private data should not be revealed to other group members without the data owner's valid permission. Clients are also sometimes prevented from accessing open cloud servers due to various restrictions. To overcome these security issues and restrictions related to storage, data sharing in an inter-domain network, and privacy checking, we propose a model based on identity-based cryptography for data transmission, an intermediate entity that holds the client's reference and identity and controls data transmission in the open cloud environment, and an extended remote privacy checking technique operating at the administrator side. Acting on the data owner's authority, the proposed model offers secure cryptographic data transmission and remote privacy checking in private, public, or instructed modes. The hardness of the computational Diffie-Hellman assumption underlying the key exchange makes this model more secure than existing models used in public cloud environments.
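The computational Diffie-Hellman assumption invoked here is the hardness of recovering g^(ab) from g^a and g^b. The exchange itself is a short sketch (toy modulus for illustration; real deployments use far larger primes or elliptic curves):

```python
# Minimal Diffie-Hellman key exchange. The modulus is a small prime chosen
# for illustration only; it offers no real security.
p = 0xFFFFFFFB   # prime 2**32 - 5, hypothetical toy parameter
g = 5            # generator

a_secret, b_secret = 1234567, 7654321   # private exponents
A = pow(g, a_secret, p)                 # data owner's public value g^a mod p
B = pow(g, b_secret, p)                 # client's public value g^b mod p

shared_owner  = pow(B, a_secret, p)     # (g^b)^a mod p
shared_client = pow(A, b_secret, p)     # (g^a)^b mod p
print(shared_owner == shared_client)    # both sides derive the same key
```

An eavesdropper sees p, g, A, and B; under the CDH assumption, computing the shared value from those alone is infeasible at realistic parameter sizes.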

  4. Monte Carlo simulations of lattice models for single polymer systems

    NASA Astrophysics Data System (ADS)

    Hsu, Hsiao-Ping

    2014-10-01

    Single linear polymer chains in dilute solutions under good solvent conditions are studied by Monte Carlo simulations with the pruned-enriched Rosenbluth method up to the chain length N ∼ O(10^4). Based on the standard simple cubic lattice model (SCLM) with fixed bond length and the bond fluctuation model (BFM) with bond lengths in a range between 2 and √10, we investigate the conformations of polymer chains described by self-avoiding walks on the simple cubic lattice, and by random walks and non-reversible random walks in the absence of excluded volume interactions. In addition to flexible chains, we also extend our study to semiflexible chains for different stiffness controlled by a bending potential. The persistence lengths of chains extracted from the orientational correlations are estimated for all cases. We show that chains based on the BFM are more flexible than those based on the SCLM for a fixed bending energy. The microscopic differences between these two lattice models are discussed and the theoretical predictions of scaling laws given in the literature are checked and verified. Our simulations clarify that a different mapping ratio between the coarse-grained models and the atomistically realistic description of polymers is required in a coarse-graining approach due to the different crossovers to the asymptotic behavior.
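The self-avoiding-walk ensemble studied here can be sampled, very inefficiently, by naive rejection on the simple cubic lattice; the paper's pruned-enriched Rosenbluth method is vastly more efficient, so the following is illustration only:

```python
# Naive rejection sampling of self-avoiding walks on the simple cubic
# lattice: grow a random walk and discard it on first self-intersection.
import random

MOVES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def saw(n, rng):
    """Try to grow an n-step self-avoiding walk; return the endpoint or None."""
    pos, visited = (0, 0, 0), {(0, 0, 0)}
    for _ in range(n):
        dx, dy, dz = rng.choice(MOVES)
        pos = (pos[0] + dx, pos[1] + dy, pos[2] + dz)
        if pos in visited:          # self-intersection: reject the walk
            return None
        visited.add(pos)
    return pos

rng = random.Random(42)
ends = [w for w in (saw(20, rng) for _ in range(5000)) if w is not None]
r2 = sum(x * x + y * y + z * z for x, y, z in ends) / len(ends)
print(r2 > 20)   # excluded volume swells the chain beyond the random-walk <R^2> = n
```

The rejection rate grows exponentially with chain length, which is exactly why chain-growth methods such as PERM are needed to reach N of order 10^4.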

  5. Magnetic interactions in a quasi-one-dimensional antiferromagnet Cu(H{sub 2}O){sub 2}(en)SO{sub 4}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sýkora, Rudolf, E-mail: rudolf.sykora@vsb.cz; Legut, Dominik

    A theoretical ab-initio investigation of exchange interaction between Cu atoms in an insulating antiferromagnet Cu(H{sub 2}O){sub 2}(en)SO{sub 4}, en = C{sub 2}H{sub 8}N{sub 2}, is reported. While the previous experimental studies described the system's magnetism to be quasi-two-dimensional, our results, based on a mapping of the system onto an effective Heisenberg model, rather support a quasi-one-dimensional character with the exchange coupling between the Cu atoms being propagated mainly along a zigzag line lying in the crystal's bc plane and connecting the Cu atoms through the N atoms. Further, the direction of magnetic moments on the Cu atoms is suggested to be nearly along the crystal's a axis. A check of the change in the exchange constants induced either by external pressure or by various values of U in the GGA + U approximation is made. Finally, based on experimental values of the positions of broad maxima in the magnetic-susceptibility and specific-heat curves, and using theoretical expressions available in the literature, the relevant value of the U parameter and the related expected value of the electronic gap are estimated to be about 5 eV and 2 eV, respectively.

  6. Standard deviation of the mean and other time series properties of voltages measured with a digital lock-in amplifier

    NASA Astrophysics Data System (ADS)

    Witt, Thomas J.; Fletcher, N. E.

    2010-10-01

    We investigate some statistical properties of ac voltages from a white noise source measured with a digital lock-in amplifier equipped with finite impulse response output filters which introduce correlations between successive voltage values. The main goal of this work is to propose simple solutions to account for correlations when calculating the standard deviation of the mean (SDM) for a sequence of measurement data acquired using such an instrument. The problem is treated by time series analysis based on a moving average model of the filtering process. Theoretical expressions are derived for the power spectral density (PSD), the autocorrelation function, the equivalent noise bandwidth and the Allan variance; all are related to the SDM. At most three parameters suffice to specify any of the above quantities: the filter time constant, the time between successive measurements (both set by the lock-in operator) and the PSD of the white noise input, h0. Our white noise source is a resistor so that the PSD is easily calculated; there are no free parameters. Theoretical expressions are checked against their respective sample estimates and, with the exception of two of the bandwidth estimates, agreement to within 11% or better is found.
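The correction for filter-induced correlations can be sketched for a toy boxcar FIR filter: the naive SDM is scaled using the theoretical autocorrelation of the resulting moving-average process. The filter and noise parameters below are illustrative, not those of a real lock-in amplifier:

```python
# White noise passed through a finite-impulse-response (moving-average)
# filter yields correlated samples; the naive SDM must be inflated by a
# factor derived from the filter's autocorrelation. Toy 4-tap boxcar.
import random, statistics

random.seed(7)
b = [0.25, 0.25, 0.25, 0.25]                    # FIR (boxcar) coefficients
white = [random.gauss(0.0, 1.0) for _ in range(100000)]
y = [sum(bk * white[i - k] for k, bk in enumerate(b))
     for i in range(len(b), len(white))]

n = len(y)
naive_sdm = statistics.stdev(y) / n ** 0.5      # ignores correlations

# Theoretical MA-process autocorrelation: rho_k = sum_i b_i b_{i+k} / sum_i b_i^2
den = sum(bi * bi for bi in b)
rho = [sum(b[i] * b[i + k] for i in range(len(b) - k)) / den
       for k in range(len(b))]
factor = 1 + 2 * sum(rho[1:])                   # large-n variance inflation
corrected_sdm = naive_sdm * factor ** 0.5

print(corrected_sdm > naive_sdm)                # correlations inflate the SDM
```

For this boxcar the inflation factor is exactly 4, so the corrected SDM is twice the naive one, illustrating how badly correlations can be underestimated.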

  7. Experimental Evidence for Wigner’s Tunneling Time

    NASA Astrophysics Data System (ADS)

    Camus, N.; Yakaboylu, E.; Fechner, L.; Klaiber, M.; Laux, M.; Mi, Y.; Hatsagortsyan, K. Z.; Pfeifer, T.; Keitel, C. H.; Moshammer, R.

    2018-04-01

    Tunneling of a particle through a barrier is one of the counter-intuitive properties of quantum mechanical motion. Thanks to advances in the generation of strong laser fields, new opportunities to dynamically investigate this process have been developed. In the so-called attoclock measurements the electron's properties after tunneling are mapped onto its emission direction. We investigate the tunneling dynamics and achieve a high sensitivity thanks to two refinements of the attoclock principle. Using near-IR wavelengths, we firmly place the ionization process in the tunneling regime. Furthermore, we compare the electron momentum distributions of two atomic species of slightly different atomic potentials (argon and krypton) being ionized under absolutely identical conditions. Experimentally, using a reaction microscope, we succeed in measuring the 3D electron momentum distributions for both targets simultaneously. Theoretically, the time resolved description of tunneling in strong-field ionization is studied using the leading quantum-mechanical Wigner treatment. A detailed analysis of the most probable photoelectron emission for Ar and Kr allows testing the theoretical models and a sensitive check of the electron initial conditions at the tunnel exit. The agreement between experiment and theory provides clear evidence for a non-zero tunneling time delay and a non-vanishing longitudinal momentum at this point.

  8. Spot-checks to measure general hygiene practice.

    PubMed

    Sonego, Ina L; Mosler, Hans-Joachim

    2016-01-01

    A variety of hygiene behaviors are fundamental to the prevention of diarrhea. We used spot-checks in a survey of 761 households in Burundi to examine whether something we could call general hygiene practice is responsible for more specific hygiene behaviors, ranging from handwashing to sweeping the floor. Using structural equation modeling, we showed that clusters of hygiene behavior, such as primary caregivers' cleanliness and household cleanliness, explained the spot-check findings well. Within our model, general hygiene practice, as an overall concept, explained the more specific clusters of hygiene behavior well. Furthermore, the higher the general hygiene practice, the more likely children were to be categorized healthy (r = 0.46). General hygiene practice was correlated with commitment to hygiene (r = 0.52), indicating a strong association to psychosocial determinants. The results show that different hygiene behaviors co-occur regularly. Using spot-checks, the general hygiene practice of a household can be rated quickly and easily.

  9. [The maximum heart rate in the exercise test: the 220-age formula or Sheffield's table?].

    PubMed

    Mesquita, A; Trabulo, M; Mendes, M; Viana, J F; Seabra-Gomes, R

    1996-02-01

    To determine whether the maximum heart rate in exercise testing of apparently healthy individuals is more properly estimated by the 220-age formula (Åstrand) or by the Sheffield table. Retrospective analysis of the clinical histories and exercise tests of apparently healthy individuals submitted to cardiac check-up. Sequential sample of 170 healthy individuals submitted to cardiac check-up between April 1988 and September 1992. Comparison of the maximum heart rates of individuals studied under the Bruce and modified Bruce protocols, in exercise tests interrupted by fatigue, with the values estimated by the 220-age formula versus the Sheffield table. The maximum heart rate is similar with both protocols. In normal individuals, this parameter is better predicted by the 220-age formula. The theoretical maximum heart rate determined by the 220-age formula should be recommended for healthy individuals, and for this reason the Sheffield table has been excluded from our clinical practice.
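The 220-age rule favoured by the study is a one-line calculation; the 85% submaximal target shown alongside is a common exercise-testing convention, not a value taken from this paper:

```python
def hr_max(age):
    """Predicted maximum heart rate (beats/min) by the 220-age formula."""
    return 220 - age

# 85% of predicted maximum is a widely used submaximal test target
# (a general convention, assumed here for illustration).
age = 40
print(hr_max(age))                 # 180
print(round(0.85 * hr_max(age)))   # 153
```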

  10. Secret information reconciliation based on punctured low-density parity-check codes for continuous-variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Jiang, Xue-Qin; Huang, Peng; Huang, Duan; Lin, Dakai; Zeng, Guihua

    2017-02-01

    Achieving information theoretic security with practical complexity is of great interest to continuous-variable quantum key distribution in the postprocessing procedure. In this paper, we propose a reconciliation scheme based on punctured low-density parity-check (LDPC) codes. Compared to the well-known multidimensional reconciliation scheme, the present scheme has lower time complexity. Especially when the chosen punctured LDPC code achieves the Shannon capacity, the proposed reconciliation scheme can remove the information that has been leaked to an eavesdropper in the quantum transmission phase. Therefore, there is no information leaked to the eavesdropper after the reconciliation stage. This indicates that the privacy amplification algorithm of the postprocessing procedure is no longer needed after the reconciliation process. These features lead to a higher secret key rate, optimal performance, and availability for the involved quantum key distribution scheme.

  11. Low-density parity-check codes for volume holographic memory systems.

    PubMed

    Pishro-Nik, Hossein; Rahnavard, Nazanin; Ha, Jeongseok; Fekri, Faramarz; Adibi, Ali

    2003-02-10

    We investigate the application of low-density parity-check (LDPC) codes in volume holographic memory (VHM) systems. We show that a carefully designed irregular LDPC code has a very good performance in VHM systems. We optimize high-rate LDPC codes for the nonuniform error pattern in holographic memories to reduce the bit error rate extensively. The prior knowledge of noise distribution is used for designing as well as decoding the LDPC codes. We show that these codes have a superior performance to that of Reed-Solomon (RS) codes and regular LDPC counterparts. Our simulation shows that we can increase the maximum storage capacity of holographic memories by more than 50 percent if we use irregular LDPC codes with soft-decision decoding instead of conventionally employed RS codes with hard-decision decoding. The performance of these LDPC codes is close to the information theoretic capacity.
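The parity-check principle behind LDPC codes is easy to sketch: a received word is a valid codeword exactly when its syndrome H·x^T (mod 2) is zero. The tiny matrix below is illustrative only; real LDPC codes are large and sparse:

```python
# Syndrome check for a toy binary parity-check matrix. Each row of H is one
# parity constraint; a nonzero syndrome flags a transmission error.
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(H, x):
    return [sum(h * xi for h, xi in zip(row, x)) % 2 for row in H]

codeword = [1, 0, 1, 1, 1, 0]   # satisfies all three parity checks
print(syndrome(H, codeword))     # [0, 0, 0] -> accepted

corrupted = codeword[:]
corrupted[0] ^= 1                # flip one bit
print(syndrome(H, corrupted))    # [1, 0, 1] -> error detected
```

Soft-decision LDPC decoders go further, using channel reliability information (here, the known nonuniform noise of the holographic memory) to iteratively correct the flipped bits rather than merely detect them.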

  12. [Model for unplanned self extubation of ICU patients using system dynamics approach].

    PubMed

    Song, Yu Gil; Yun, Eun Kyoung

    2015-04-01

    In this study a system dynamics methodology was used to identify correlation and nonlinear feedback structure among factors affecting unplanned extubation (UE) of ICU patients and to construct and verify a simulation model. Factors affecting UE were identified through a theoretical background established by reviewing literature and preceding studies and referencing various statistical data. Related variables were decided through verification of content validity by an expert group. A causal loop diagram (CLD) was made based on the variables. Stock & Flow modeling using Vensim PLE Plus Version 6.0b was performed to establish a model for UE. Based on the literature review and expert verification, 18 variables associated with UE were identified and a CLD was prepared. From the prepared CLD, a model was developed by converting to the Stock & Flow Diagram. Results of the simulation showed that patient stress, patient agitation, restraint application, patient movability, and individual intensive nursing were the variables with the greatest effect on UE probability. To verify agreement of the UE model with real situations, simulation with 5 cases was performed. Equation check and sensitivity analysis on TIME STEP were executed to validate model integrity. Results show that identification of a proper model enables prediction of UE probability. This prediction allows for adjustment of related factors, and provides basic data to develop nursing interventions to decrease UE.
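The Stock & Flow structure can be illustrated with a single stock integrated by Euler steps, the numerical scheme that system-dynamics tools such as Vensim use under the hood. The variable names and rates below are invented for the sketch, not taken from the UE model:

```python
# One-stock system dynamics sketch: a hypothetical "patient agitation"
# stock with a constant inflow and a proportional outflow, integrated
# with fixed TIME STEP Euler updates.
TIME_STEP = 0.25
inflow_rate = 2.0        # agitation build-up per hour (hypothetical)
outflow_frac = 0.5       # fraction relieved per hour, e.g. by nursing care

stock, history = 0.0, []
for _ in range(200):
    inflow = inflow_rate
    outflow = outflow_frac * stock
    stock += TIME_STEP * (inflow - outflow)    # Euler integration of the stock
    history.append(stock)

print(round(history[-1], 2))   # 4.0: equilibrium at inflow_rate / outflow_frac
```

Sensitivity analysis on TIME STEP, as performed in the paper, amounts to rerunning such an integration with smaller steps and checking that the trajectory is unchanged.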

  13. XMI2USE: A Tool for Transforming XMI to USE Specifications

    NASA Astrophysics Data System (ADS)

    Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.

    The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.

  14. Evaluation of properties over phylogenetic trees using stochastic logics.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2016-06-14

    Model checking has recently been introduced as an integrated framework for extracting information from phylogenetic trees, using temporal logics as a querying language; these extensions of modal logic impose restrictions of a boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as the different ways DNA can change), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we wish to inspect over the phylogeny, the verifier returns true if the specification is satisfied, or a counterexample that falsifies it. However, this approach has so far considered only qualitative aspects of the phylogeny. In this paper, we address the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that could not be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software to optimize the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties over phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, sparing biologists the extension, upgrade, debugging, and maintenance of a software tool. A set of benchmarks justifies the feasibility of our approach.
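The quantitative core of probabilistic model checking, computing the probability of eventually reaching a target state, can be sketched for a small discrete-time Markov chain. The chain below is a hypothetical mutation model, not one of the paper's phylogenies; PRISM evaluates such reachability properties at scale:

```python
# Probability of eventually reaching `target` in a discrete-time Markov
# chain, computed by iterating the standard fixed-point equations.
P = {
    "A": {"B": 0.5, "D": 0.5},
    "B": {"C": 0.7, "D": 0.3},
    "C": {"C": 1.0},        # absorbing target (e.g. a mutation fixed)
    "D": {"D": 1.0},        # absorbing failure state
}

def prob_reach(P, target, iters=200):
    prob = {s: (1.0 if s == target else 0.0) for s in P}
    for _ in range(iters):
        for s in P:
            if s != target:
                prob[s] = sum(p * prob[t] for t, p in P[s].items())
    return prob

print(prob_reach(P, "C")["A"])   # 0.35 = 0.5 * 0.7
```

In PRISM's property language this corresponds to a query of the form P=? [ F "target" ]; continuous-time extensions replace the transition probabilities with rates.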

  15. Acoustic performance of inlet suppressors on an engine generating a single mode

    NASA Technical Reports Server (NTRS)

    Heidelberg, L. J.; Rice, E. J.; Homyak, L.

    1981-01-01

    Three single degree of freedom liners with different open area ratio face sheets were designed for a single spinning mode in order to evaluate an inlet suppressor design method based on mode cutoff ratio. This mode was generated by placing 41 rods in front of the 28 blade fan of a JT15D turbofan engine. At the liner design condition, this near-cutoff mode has a theoretical maximum attenuation of nearly 200 dB per L/D. The data show even higher attenuations at the design condition than predicted by the theory for dissipation of a single mode within the liner. This additional attenuation is large for high open area ratios and should be accounted for in the theory. The data show the additional attenuation to be inversely proportional to acoustic resistance. It was thought that the additional attenuation could be caused by reflection and modal scattering at the hard to soft wall interface. A reflection model was developed, and then modified to fit the data. This model was checked against independent (multiple pure tone) data with good agreement.

  16. The solution of private problems for optimization heat exchangers parameters

    NASA Astrophysics Data System (ADS)

    Melekhin, A.

    2017-11-01

    The relevance of this topic stems from the problem of conserving resources in the heating systems of buildings. To solve this problem we have developed an integrated research method that allows optimization of heat exchanger parameters. The method solves a multicriteria optimization problem with nonlinear optimization software, using an array of temperatures obtained through thermography. The author has developed a mathematical model of the heat exchange process on the heat exchange surfaces of the apparatus, solved the multicriteria optimization problem, and checked the model's adequacy against an experimental stand with visualization of the thermal fields. The results include an optimal range of controlled parameters influencing the heat exchange process with minimal metal consumption and maximum heat output of the finned heat exchanger, regularities of the heat exchange process with generalized dependencies for the temperature distribution on the heat-release surface of the heat exchanger, and demonstrated convergence between calculations based on the theoretical dependencies and the solution of the mathematical model.

  17. Design optimization of beta- and photovoltaic conversion devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wichner, R.; Blum, A.; Fischer-Colbrie, E.

    1976-01-08

    This report presents the theoretical and experimental results of an LLL Electronics Engineering research program aimed at optimizing the design and electronic-material parameters of beta- and photovoltaic p-n junction conversion devices. To meet this objective, a comprehensive computer code has been developed that can handle a broad range of practical conditions. The physical model upon which the code is based is described first. Then, an example is given of a set of optimization calculations along with the resulting optimized efficiencies for silicon (Si) and gallium-arsenide (GaAs) devices. The model we have developed, however, is not limited to these materials. It can handle any appropriate material--single or polycrystalline--provided energy absorption and electron-transport data are available. To check code validity, the performance of experimental silicon p-n junction devices (produced in-house) was measured under various light intensities and spectra as well as under tritium beta irradiation. The results of these tests were then compared with predicted results based on the known or best-estimated device parameters. The comparison showed very good agreement between the calculated and the measured results.

  18. The experimental verification on the shear bearing capacity of exposed steel column foot

    NASA Astrophysics Data System (ADS)

    Xijin, LIU

    2017-04-01

    In terms of the shear bearing capacity of the exposed steel column foot, there has been much research both at home and abroad. However, the majority of this research is limited to theoretical analysis, and few studies provide experimental analysis. Based on the prototype of an industrial plant in Beijing, this paper designs an experimental model composed of six steel structural members in two groups: three members without a shear key and three members with a shear key. The paper checks the shear bearing capacity of the two groups under different axial forces. The experiment shows that the anchor bolt of the exposed steel column foot contributes a relatively large shear bearing capacity that should not be neglected. The results derived from the calculation methods proposed in this paper for the two situations match the experimental results for the shear bearing capacity of the steel column foot. The paper also proposes suggestions for revising the Code for Design of Steel Structures with respect to setting the shear key in the steel column foot.

  19. Dynamic scaling in natural swarms

    NASA Astrophysics Data System (ADS)

    Cavagna, Andrea; Conti, Daniele; Creato, Chiara; Del Castello, Lorenzo; Giardina, Irene; Grigera, Tomas S.; Melillo, Stefania; Parisi, Leonardo; Viale, Massimiliano

    2017-09-01

    Collective behaviour in biological systems presents theoretical challenges beyond the borders of classical statistical physics. The lack of concepts such as scaling and renormalization is particularly problematic, as it forces us to negotiate details whose relevance is often hard to assess. In an attempt to improve this situation, we present here experimental evidence of the emergence of dynamic scaling laws in natural swarms of midges. We find that spatio-temporal correlation functions in different swarms can be rescaled by using a single characteristic time, which grows with the correlation length with a dynamical critical exponent z ~ 1, a value not found in any other standard statistical model. To check whether out-of-equilibrium effects may be responsible for this anomalous exponent, we run simulations of the simplest model of self-propelled particles and find z ~ 2, suggesting that natural swarms belong to a novel dynamic universality class. This conclusion is strengthened by experimental evidence of the presence of non-dissipative modes in the relaxation, indicating that previously overlooked inertial effects are needed to describe swarm dynamics. The absence of a purely dissipative regime suggests that natural swarms undergo a near-critical censorship of hydrodynamics.

  20. How parents process child health and nutrition information: A grounded theory model.

    PubMed

    Lovell, Jennifer L

    2016-02-01

    The aim of the present study was to investigate low-income parents' experiences receiving, making meaning of, and applying sociocultural messages about childhood health and nutrition. Semi-structured interviews were conducted with parents from 16 low-income Early Head Start families. Verbatim interview transcripts, observations, field notes, documentary evidence, and follow-up participant checks were used during grounded theory analysis of the data. Data yielded a potential theoretical model of parental movement toward action involving (a) the culture and context influencing parents, (b) parents' sources of social and cultural messages, (c) parental values and engagement, (d) parental motivation for action, (e) intervening conditions impacting motivation and application, and (f) parent action taken on the individual and social levels. Parent characteristics greatly impacted the ways in which parents understood and applied health and nutrition information. Among other implications, it is recommended that educators and providers focus on a parent's beliefs, values, and cultural preferences regarding food and health behaviors as well as his/her personal/family definition of "health" when framing recommendations and developing interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Capturing spiral radial growth of conifers using the superellipse to model tree-ring geometric shape

    PubMed Central

    Shi, Pei-Jian; Huang, Jian-Guo; Hui, Cang; Grissino-Mayer, Henri D.; Tardif, Jacques C.; Zhai, Li-Hong; Wang, Fu-Sheng; Li, Bai-Lian

    2015-01-01

    Tree-rings are often assumed to approximate a circular shape when estimating forest productivity and carbon dynamics. However, tree rings are rarely, if ever, circular, thereby possibly resulting in under- or over-estimation of forest productivity and carbon sequestration. Given the crucial role played by tree ring data in assessing forest productivity and carbon storage within a context of global change, it is particularly important that mathematical models adequately render cross-sectional area increment derived from tree rings. We modeled the geometric shape of tree rings using the superellipse equation and checked its validity against theoretical simulations and six actual cross sections collected from three conifers. We found that the superellipse better describes the geometric shape of tree rings than the circle commonly used. We showed that a spiral growth trend exists on the radial section over time, which might be closely related to spiral grain along the longitudinal axis. The superellipse generally had higher accuracy than the circle in predicting the basal area increment, resulting in an improved estimate for the basal area. The superellipse may allow better assessment of forest productivity and carbon storage in terrestrial forest ecosystems. PMID:26528316
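The superellipse |x/a|^n + |y/b|^n = 1 has the closed-form area 4ab·Γ(1+1/n)²/Γ(1+2/n), which makes the comparison with the circular assumption easy to sketch (the semi-axes and exponent below are hypothetical, not fitted tree-ring values):

```python
# Area of the superellipse |x/a|^n + |y/b|^n = 1 via the gamma-function
# closed form; n = 2 recovers the ellipse (and the circle when a = b).
from math import gamma, pi

def superellipse_area(a, b, n):
    return 4 * a * b * gamma(1 + 1 / n) ** 2 / gamma(1 + 2 / n)

print(abs(superellipse_area(1, 1, 2) - pi) < 1e-9)   # circle: area = pi

# n > 2 bulges the ring toward a rectangle, so the same semi-axes
# enclose more area than the elliptical assumption would predict.
print(superellipse_area(10, 8, 2.5) > superellipse_area(10, 8, 2))
```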

  2. Capturing spiral radial growth of conifers using the superellipse to model tree-ring geometric shape.

    PubMed

    Shi, Pei-Jian; Huang, Jian-Guo; Hui, Cang; Grissino-Mayer, Henri D; Tardif, Jacques C; Zhai, Li-Hong; Wang, Fu-Sheng; Li, Bai-Lian

    2015-01-01

    Tree-rings are often assumed to approximate a circular shape when estimating forest productivity and carbon dynamics. However, tree rings are rarely, if ever, circular, possibly resulting in under- or over-estimation of forest productivity and carbon sequestration. Given the crucial role played by tree-ring data in assessing forest productivity and carbon storage within a context of global change, it is particularly important that mathematical models adequately render the cross-sectional area increment derived from tree rings. We modeled the geometric shape of tree rings using the superellipse equation and checked its validity against theoretical simulations and six actual cross sections collected from three conifers. We found that the superellipse describes the geometric shape of tree rings better than the commonly used circle. We showed that a spiral growth trend exists on the radial section over time, which might be closely related to spiral grain along the longitudinal axis. The superellipse generally had higher accuracy than the circle in predicting the basal area increment, resulting in an improved estimate of basal area. The superellipse may therefore allow better assessment of forest productivity and carbon storage in terrestrial forest ecosystems.

  3. Probing the Cosmological Principle in the counts of radio galaxies at different frequencies

    NASA Astrophysics Data System (ADS)

    Bengaly, Carlos A. P.; Maartens, Roy; Santos, Mario G.

    2018-04-01

    According to the Cosmological Principle, the matter distribution on very large scales should have a kinematic dipole that is aligned with that of the CMB. We determine the dipole anisotropy in the number counts of two all-sky surveys of radio galaxies. For the first time, this analysis is presented for the TGSS survey, allowing us to check consistency of the radio dipole at low and high frequencies by comparing the results with the well-known NVSS survey. We match the flux thresholds of the catalogues, with flux limits chosen to minimise systematics, and adopt a strict masking scheme. We find dipole directions that are in good agreement with each other and with the CMB dipole. In order to compare the amplitude of the dipoles with theoretical predictions, we produce sets of lognormal realisations. Our realisations include the theoretical kinematic dipole, galaxy clustering, Poisson noise, simulated redshift distributions which fit the NVSS and TGSS source counts, and errors in flux calibration. The measured dipole for NVSS is ~2 times larger than predicted by the mock data. For TGSS, the dipole is ~5 times larger than predicted, even after checking for completeness and taking account of errors in source fluxes and in flux calibration. Further work is required to understand the nature of the systematics that are the likely cause of the anomalously large TGSS dipole amplitude.
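    The theoretical kinematic dipole against which such measurements are compared is usually taken in the Ellis & Baldwin form d = [2 + x(1 + α)]·(v/c), where x is the power-law index of the integral source counts and α the mean spectral index. A minimal sketch, with commonly quoted illustrative values (v = 370 km/s from the CMB dipole, x ≈ 0.75, α ≈ 0.75 — assumptions here, not values from the abstract):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def kinematic_dipole(v_km_s=370.0, x=0.75, alpha=0.75):
    """Expected number-count dipole amplitude d = [2 + x(1+alpha)] * v/c
    (Ellis & Baldwin form; parameter values are illustrative)."""
    beta = v_km_s / C_KM_S
    return (2.0 + x * (1.0 + alpha)) * beta

d = kinematic_dipole()
print(f"expected kinematic dipole amplitude ~ {d:.4f}")  # of order 4e-3
```

Against an expectation of a few times 10^-3, the factors of ~2 (NVSS) and ~5 (TGSS) quoted in the abstract correspond to measured amplitudes of roughly 10^-2 or more.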

  4. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.

  5. Philosophy and the practice of Bayesian statistics

    PubMed Central

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2015-01-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575
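    The model checking the authors advocate is typically carried out as a posterior predictive check: simulate replicated datasets from the fitted model and ask whether a chosen test statistic of the observed data is extreme relative to the replications. A deliberately simplified stand-in (not the authors' example; the crude "posterior" below approximates a normal-model fit rather than performing a full Bayesian computation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed data: heavier-tailed than the normal model will assume
y = np.concatenate([rng.normal(0, 1, 95), rng.normal(0, 5, 5)])
n = len(y)

# Crude approximate posterior draws for the normal model's mean
# (a simplified stand-in for a full Bayesian fit)
post_mu = rng.normal(y.mean(), y.std(ddof=1) / np.sqrt(n), size=1000)
post_sd = np.full(1000, y.std(ddof=1))

# Posterior predictive p-value for the test statistic T(y) = max|y|
T_obs = np.max(np.abs(y))
T_rep = np.array([np.max(np.abs(rng.normal(m, s, n)))
                  for m, s in zip(post_mu, post_sd)])
p_value = np.mean(T_rep >= T_obs)
print(f"posterior predictive p-value for max|y|: {p_value:.3f}")
```

A p-value near 0 or 1 signals that the model fails to reproduce that feature of the data, which is exactly the kind of hypothetico-deductive check the abstract argues falls outside Bayesian confirmation theory.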

  6. Philosophy and the practice of Bayesian statistics.

    PubMed

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.

  7. A Model-Driven Approach for Telecommunications Network Services Definition

    NASA Astrophysics Data System (ADS)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific for service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  8. Identifiability of Additive, Time-Varying Actuator and Sensor Faults by State Augmentation

    NASA Technical Reports Server (NTRS)

    Upchurch, Jason M.; Gonzalez, Oscar R.; Joshi, Suresh M.

    2014-01-01

    Recent work has provided a set of necessary and sufficient conditions for identifiability of additive step faults (e.g., lock-in-place actuator faults, constant bias in the sensors) using state augmentation. This paper extends these results to an important class of faults which may affect linear, time-invariant systems. In particular, the faults under consideration are those which vary with time and affect the system dynamics additively. Such faults may manifest themselves in aircraft as, for example, control surface oscillations, control surface runaway, and sensor drift. The set of necessary and sufficient conditions presented in this paper is general, and applies when a class of time-varying faults affects arbitrary combinations of actuators and sensors. The results in the main theorems are illustrated by two case studies, which provide some insight into how the conditions may be used to check the theoretical identifiability of fault configurations of interest for a given system. It is shown that while state augmentation can be used to identify certain fault configurations, other fault configurations are theoretically impossible to identify using state augmentation, giving practitioners valuable insight into such situations. That is, the limitations of state augmentation for a given system and configuration of faults are made explicit. Another limitation of model-based methods is that there can be large numbers of fault configurations, thus making identification of all possible configurations impractical. However, the theoretical identifiability of known, credible fault configurations can be tested using the theorems presented in this paper, which can then assist the efforts of fault identification practitioners.
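    The basic state-augmentation idea for the simplest (step-fault) case can be sketched numerically: append the constant fault to the state vector and test observability of the augmented pair. This toy system and the rank test are illustrative assumptions; the paper's conditions cover the much harder time-varying case:

```python
import numpy as np

def obsv_rank(A, C):
    """Rank of the observability matrix [C; CA; ...; CA^(n-1)]."""
    n = A.shape[0]
    blocks = [C @ np.linalg.matrix_power(A, k) for k in range(n)]
    return np.linalg.matrix_rank(np.vstack(blocks))

# Toy discrete-time LTI system with an additive constant actuator fault:
#   x+ = A x + B (u + f),  y = C x.  Augment z = [x; f] with f+ = f.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

A_aug = np.block([[A, B],
                  [np.zeros((1, 2)), np.eye(1)]])
C_aug = np.hstack([C, np.zeros((1, 1))])

# The step fault is identifiable iff the augmented pair is observable
identifiable = obsv_rank(A_aug, C_aug) == A_aug.shape[0]
print("augmented state (and hence the step fault) observable:", identifiable)
```

Fault configurations for which the augmented observability matrix loses rank are exactly the ones the abstract describes as theoretically impossible to identify by state augmentation.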

  9. Using State Merging and State Pruning to Address the Path Explosion Problem Faced by Symbolic Execution

    DTIC Science & Technology

    2014-06-19

    urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5–7]. These...bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based...assume there are n different paths through the function foo. The program could potentially call the function foo a bounded number of times, resulting in n

  10. Effect of temperature on the acid-base properties of the alumina surface: microcalorimetry and acid-base titration experiments.

    PubMed

    Morel, Jean-Pierre; Marmier, Nicolas; Hurel, Charlotte; Morel-Desrosiers, Nicole

    2006-06-15

    Sorption reactions on natural or synthetic materials that can attenuate the migration of pollutants in the geosphere could be affected by temperature variations. Nevertheless, most of the theoretical models describing sorption reactions are parameterized at 25 °C. To check these models at different temperatures, experimental data such as enthalpies of sorption are thus required. Highly sensitive microcalorimeters can now be used to determine the heat effects accompanying the sorption of radionuclides on oxide-water interfaces, but enthalpies of sorption cannot be extracted from microcalorimetric data without a clear knowledge of the thermodynamics of protonation and deprotonation of the oxide surface. However, the values reported in the literature show large discrepancies, and one must conclude that, amazingly, this fundamental problem of proton binding is not yet resolved. We have thus undertaken to measure by titration microcalorimetry the heat effects accompanying proton exchange at the alumina-water interface at 25 °C. Based on (i) the surface-site speciation provided by a surface complexation model (built from acid-base titrations at 25 °C) and (ii) the results of the microcalorimetric experiments, calculations were made to extract the enthalpy changes associated, respectively, with the first and second deprotonation of the alumina surface. The values obtained are ΔH1 = 80 ± 10 kJ mol⁻¹ and ΔH2 = 5 ± 3 kJ mol⁻¹. In a second step, these enthalpy values were used to calculate the alumina surface acidity constants at 50 °C via the van't Hoff equation. A theoretical titration curve at 50 °C was then calculated and compared to the experimental alumina surface titration curve. Good agreement between the predicted acid-base titration curve and the experimental one was observed.
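    The van't Hoff extrapolation used in the second step can be sketched directly: ln K(T2) = ln K(T1) − (ΔH/R)(1/T2 − 1/T1), assuming ΔH is constant over the interval. The enthalpies below are the measured ones (80 and 5 kJ/mol); the reference pK values at 25 °C are placeholders assumed for illustration, not values from the paper:

```python
from math import log

R = 8.314  # gas constant, J mol^-1 K^-1

def pK_at_T(pK_ref, dH_J_mol, T_ref=298.15, T=323.15):
    """van't Hoff extrapolation of an acidity constant, assuming the
    deprotonation enthalpy is temperature-independent over 25-50 C."""
    lnK = -pK_ref * log(10) - (dH_J_mol / R) * (1.0 / T - 1.0 / T_ref)
    return -lnK / log(10)

for name, pK25, dH in [("first deprotonation", 7.0, 80e3),
                       ("second deprotonation", 9.0, 5e3)]:
    print(f"{name}: pK(25 C) = {pK25:.2f} -> pK(50 C) = {pK_at_T(pK25, dH):.2f}")
```

With ΔH1 = 80 kJ/mol the first pK drops by about 1.08 units between 25 and 50 °C, while the near-zero ΔH2 leaves the second pK almost unchanged; this is what makes the predicted 50 °C titration curve testable against experiment.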

  11. Using fingerprint image quality to improve the identification performance of the U.S. Visitor and Immigrant Status Indicator Technology Program

    PubMed Central

    Wein, Lawrence M.; Baveja, Manas

    2005-01-01

    Motivated by the difficulty of biometric systems to correctly match fingerprints with poor image quality, we formulate and solve a game-theoretic formulation of the identification problem in two settings: U.S. visa applicants are checked against a list of visa holders to detect visa fraud, and visitors entering the U.S. are checked against a watchlist of criminals and suspected terrorists. For three types of biometric strategies, we solve the game in which the U.S. Government chooses the strategy's optimal parameter values to maximize the detection probability subject to a constraint on the mean biometric processing time per legal visitor, and then the terrorist chooses the image quality to minimize the detection probability. At current inspector staffing levels at ports of entry, our model predicts that a quality-dependent two-finger strategy achieves a detection probability of 0.733, compared to 0.526 under the quality-independent two-finger strategy that is currently implemented at the U.S. border. Increasing the staffing level of inspectors offers only minor increases in the detection probability for these two strategies. Using more than two fingers to match visitors with poor image quality allows a detection probability of 0.949 under current staffing levels, but may require major changes to the current U.S. biometric program. The detection probabilities during visa application are ≈11–22% smaller than at ports of entry for all three strategies, but the same qualitative conclusions hold. PMID:15894628

  12. Using fingerprint image quality to improve the identification performance of the U.S. Visitor and Immigrant Status Indicator Technology Program.

    PubMed

    Wein, Lawrence M; Baveja, Manas

    2005-05-24

    Motivated by the difficulty of biometric systems to correctly match fingerprints with poor image quality, we formulate and solve a game-theoretic formulation of the identification problem in two settings: U.S. visa applicants are checked against a list of visa holders to detect visa fraud, and visitors entering the U.S. are checked against a watchlist of criminals and suspected terrorists. For three types of biometric strategies, we solve the game in which the U.S. Government chooses the strategy's optimal parameter values to maximize the detection probability subject to a constraint on the mean biometric processing time per legal visitor, and then the terrorist chooses the image quality to minimize the detection probability. At current inspector staffing levels at ports of entry, our model predicts that a quality-dependent two-finger strategy achieves a detection probability of 0.733, compared to 0.526 under the quality-independent two-finger strategy that is currently implemented at the U.S. border. Increasing the staffing level of inspectors offers only minor increases in the detection probability for these two strategies. Using more than two fingers to match visitors with poor image quality allows a detection probability of 0.949 under current staffing levels, but may require major changes to the current U.S. biometric program. The detection probabilities during visa application are approximately 11-22% smaller than at ports of entry for all three strategies, but the same qualitative conclusions hold.

  13. The Infobiotics Workbench: an integrated in silico modelling platform for Systems and Synthetic Biology.

    PubMed

    Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio

    2011-12-01

    The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.

  14. Terrestrial laser scanning used to detect asymmetries in boat hulls

    NASA Astrophysics Data System (ADS)

    Roca-Pardiñas, Javier; López-Alvarez, Francisco; Ordóñez, Celestino; Menéndez, Agustín; Bernardo-Sánchez, Antonio

    2012-01-01

    We describe a methodology for identifying asymmetries in boat hull sections reconstructed from point clouds captured using a terrestrial laser scanner (TLS). A surface was first fit to the point cloud using a nonparametric regression method that permitted the construction of a continuous smooth surface. Asymmetries in cross-sections of the surface were identified using a bootstrap resampling technique that took into account uncertainty in the coordinates of the scanned points. Each reconstructed section was analyzed to check, at a given level of significance, whether it fell within the confidence interval for the theoretical symmetrical section. The method was applied to the study of asymmetries in a medium-sized yacht. Differences of up to 5 cm between the real and theoretical sections were identified in some parts of the hull.

  15. Non-classical Signature of Parametric Fluorescence and its Application in Metrology

    NASA Astrophysics Data System (ADS)

    Hamar, M.; Michálek, V.; Pathak, A.

    2014-08-01

    The article provides a short theoretical background on what non-classical light means. We applied the criterion for the existence of non-classical effects derived by C.T. Lee to parametric fluorescence. The criterion was originally derived for the study of two light beams with one mode per beam. We checked through numerical simulations whether the criterion still holds for two multimode beams produced by parametric down-conversion. The theoretical results were tested by measuring the photon number statistics of twin beams emitted by a nonlinear BBO crystal pumped by intense femtosecond UV pulses. We used an ICCD camera as the photon detector in both beams. It appears that the criterion can be used for measuring the quantum efficiencies of ICCD cameras.

  16. Optical CAD Utilization for the Design and Testing of a LED Streetlamp.

    PubMed

    Jafrancesco, David; Mercatelli, Luca; Fontani, Daniela; Sansoni, Paola

    2017-08-24

    The design and testing of LED lamps are vital steps toward broader use of LED lighting for outdoor illumination and traffic signalling. The characteristics of LED sources, in combination with the need to limit light pollution and power consumption, require a precise optical design. In particular, at every step of the process it is important to closely compare theoretical or simulated results with measured data obtained from a prototype. This work examines the various possibilities for using an optical CAD package (Lambda Research TracePro) to design and check a LED lamp for outdoor use. The analysis includes, as an example, simulations and testing on a prototype; data acquired by measurement are inserted into the same simulation software, making it easy to compare theoretical and actual results.

  17. Parallel Software Model Checking

    DTIC Science & Technology

    2015-01-08

    checker. This project will explore this strategy to parallelize the generalized PDR algorithm for software model checking. It belongs to TF1 due to its ... focus on formal verification. Generalized PDR. Generalized Property Directed Reachability (GPDR) is an algorithm for solving HORN-SMT reachability...

  18. Stochastic Game Analysis and Latency Awareness for Self-Adaptation

    DTIC Science & Technology

    2014-01-01

    this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the...Additional Key Words and Phrases: Proactive adaptation, Stochastic multiplayer games, Latency 1. INTRODUCTION When planning how to adapt, self-adaptive...contribution of this paper is twofold: (1) A novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to

  19. Exact solution for the quench dynamics of a nested integrable system

    NASA Astrophysics Data System (ADS)

    Mestyán, Márton; Bertini, Bruno; Piroli, Lorenzo; Calabrese, Pasquale

    2017-08-01

    Integrable models provide an exact description for a wide variety of physical phenomena. For example, nested integrable systems contain different species of interacting particles with a rich phenomenology in their collective behavior, which is the origin of the unconventional phenomenon of spin-charge separation. So far, however, most of the theoretical work in the study of non-equilibrium dynamics of integrable systems has focused on models with an elementary (i.e. not nested) Bethe ansatz. In this work we explicitly investigate quantum quenches in nested integrable systems, by generalizing the application of the quench action approach. Specifically, we consider the spin-1 Lai-Sutherland model, described, in the thermodynamic limit, by the theory of two different species of Bethe-ansatz particles, each one forming an infinite number of bound states. We focus on the situation where the quench dynamics starts from a simple matrix product state for which the overlaps with the eigenstates of the Hamiltonian are known. We fully characterize the post-quench steady state and perform several consistency checks for the validity of our results. Finally, we provide predictions for the propagation of entanglement and mutual information after the quench, which can be used as a signature of the quasi-particle content of the model.

  20. The importance of age-related differences in prospective memory: Evidence from diffusion model analyses.

    PubMed

    Ball, B Hunter; Aschenbrenner, Andrew J

    2017-06-09

    Event-based prospective memory (PM) refers to relying on environmental cues to trigger retrieval of a deferred action plan from long-term memory. Considerable research has demonstrated that PM declines with increasing age. Despite efforts to better characterize the attentional processes that underlie these decrements, the majority of research has relied on measures of central tendency to inform theoretical accounts of PM, which may not entirely capture the underlying dynamics involved in allocating attention to intention-relevant information. The purpose of the current study was to examine the utility of the diffusion model to better understand the cognitive processes underlying age-related differences in PM. Results showed that emphasizing the importance of the PM intention increased cue detection selectively for older adults. Standard cost analyses revealed that PM importance increased mean response times and accuracy, but not differentially for young and older adults. Consistent with this finding, diffusion model analyses demonstrated that PM importance increased response caution as evidenced by increased boundary separation. However, the selective benefit in cue detection for older adults may reflect peripheral target-checking processes as indicated by changes in nondecision time. These findings highlight the use of modeling techniques to better characterize the processes underlying the relations among aging, attention, and PM.
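    The link between boundary separation and the speed-accuracy trade-off can be illustrated with a crude simulation of the diffusion model (a generic Euler-Maruyama sketch with assumed parameter values, not the study's fitted model):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ddm(drift, boundary, n_trials=1000, dt=0.002, ndt=0.3):
    """Euler-Maruyama simulation of a symmetric drift-diffusion process
    (noise sd = 1, unbiased starting point at boundary / 2)."""
    rts, hits = [], []
    for _ in range(n_trials):
        x, t = boundary / 2.0, 0.0
        while 0.0 < x < boundary:
            x += drift * dt + np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t + ndt)            # add non-decision time
        hits.append(x >= boundary)     # upper boundary = correct response
    return float(np.mean(rts)), float(np.mean(hits))

rt_lo, acc_lo = simulate_ddm(drift=1.0, boundary=1.0)
rt_hi, acc_hi = simulate_ddm(drift=1.0, boundary=2.0)
print(f"narrow boundary: mean RT = {rt_lo:.3f} s, accuracy = {acc_lo:.2f}")
print(f"wide boundary:   mean RT = {rt_hi:.3f} s, accuracy = {acc_hi:.2f}")
```

Widening the boundary (more response caution) slows responses and raises accuracy, which is the pattern the importance manipulation produced in the cost analyses; shifts in the non-decision-time parameter, by contrast, move RTs without touching accuracy.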

  1. Probabilistic Priority Message Checking Modeling Based on Controller Area Networks

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    Although the probabilistic model checking tool PRISM has been applied to many communication systems, such as wireless local area networks, Bluetooth, and ZigBee, the technique has not been used on a controller area network (CAN). In this paper, we use PRISM to model the priority-message mechanism of CAN, because this mechanism has allowed CAN to become the leader in serial communication for automotive and industrial control. Modeling CAN makes it easy to analyze its characteristics and thereby further improve the security and efficiency of automobiles. The Markov chain model helps us capture the behaviour of priority messages.
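    The kind of Markov-chain analysis PRISM automates can be sketched in plain Python. The toy DTMC below is an illustrative analogue, not the paper's model: states and probabilities are assumed, with the one CAN-specific feature that a pending high-priority frame always wins arbitration for the idle bus:

```python
import numpy as np

# States: 0 = bus idle, 1 = high-priority frame on the bus,
#         2 = low-priority frame on the bus (assumed toy probabilities)
P = np.array([
    [0.1, 0.6, 0.3],   # idle -> stays idle / high wins arbitration / low
    [0.5, 0.5, 0.0],   # high-priority frame completes with prob. 0.5
    [0.4, 0.0, 0.6],   # low-priority frame completes with prob. 0.4
])
assert np.allclose(P.sum(axis=1), 1.0)  # rows are distributions

# Transient analysis: distribution after k steps from an idle bus
dist = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    dist = dist @ P
print("long-run occupancy (idle, high, low):", np.round(dist, 3))
```

In a PRISM model the same chain would be written as a `dtmc` module, and queries such as long-run occupancy or the probability a low-priority frame waits more than k slots become PCTL properties checked automatically.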

  2. FOEHN: The critical experiment for the Franco-German High Flux Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scharmer, K.; Eckert, H. G.

    1991-01-01

    A critical experiment for the Franco-German High Flux Reactor was carried out in the French reactor EOLE (CEN Cadarache). The purpose of the experiment was to check the calculation methods in a realistic geometry and to measure effects that can only be calculated imprecisely (e.g. beam hole effects). The structure of the experiment and the measurement and calculation methods are described. A detailed comparison between theoretical and experimental results was performed. 30 refs., 105 figs.

  3. Effect of Finite Particle Size on Convergence of Point Particle Models in Euler-Lagrange Multiphase Dispersed Flow

    NASA Astrophysics Data System (ADS)

    Nili, Samaun; Park, Chanyoung; Haftka, Raphael T.; Kim, Nam H.; Balachandar, S.

    2017-11-01

    Point particle methods are extensively used in simulating Euler-Lagrange multiphase dispersed flow. When particles are much smaller than the Eulerian grid, the point particle model is on firm theoretical ground. However, this standard approach of evaluating the gas-particle coupling at the particle center fails to converge as the Eulerian grid is reduced below the particle size. We present an approach to model the interaction between particles and fluid for finite-size particles that permits convergence. We use the generalized Faxen form to compute the force on a particle and compare the results against the traditional point particle method. We apportion the different force components on the particle to fluid cells based on the fraction of particle volume or surface in each cell. The application is a one-dimensional model of shock propagation through a particle-laden field at moderate volume fraction, where convergence is achieved for a well-formulated force model and back coupling for finite-size particles. Comparison with 3D direct fully resolved numerical simulations will be used to check whether the approach also improves accuracy compared to the point particle model. Work supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.

  4. Design Principles as a Guide for Constraint Based and Dynamic Modeling: Towards an Integrative Workflow.

    PubMed

    Sehr, Christiana; Kremling, Andreas; Marin-Sanguino, Alberto

    2015-10-16

    During the last 10 years, systems biology has matured from a fuzzy concept combining omics, mathematical modeling and computers into a scientific field in its own right. In spite of its incredible potential, the multilevel complexity of its objects of study makes it very difficult to establish a reliable connection between data and models. The great number of degrees of freedom often results in situations where many different models can explain/fit all available datasets. This has resulted in a shift of paradigm from the initially dominant, maybe naive, idea of inferring the system out of a number of datasets to the application of different techniques that reduce the degrees of freedom before any data set is analyzed. There is a wide variety of techniques available, each of them can contribute a piece of the puzzle and include different kinds of experimental information. But the challenge that remains is their meaningful integration. Here we show some theoretical results that enable some of the main modeling approaches to be applied sequentially in a complementary manner, and how this workflow can benefit from evolutionary reasoning to keep the complexity of the problem in check. As a proof of concept, we show how the synergies between these modeling techniques can provide insight into some well studied problems: ammonia assimilation in bacteria and an unbranched linear pathway with end-product inhibition.

  5. MOM: A meteorological data checking expert system in CLIPS

    NASA Technical Reports Server (NTRS)

    Odonnell, Richard

    1990-01-01

    Meteorologists have long faced the problem of verifying the data they use. Experience shows that there is a sizable number of errors in the data reported by meteorological observers. This is unacceptable for computer forecast models, which depend on accurate data for accurate results. Most errors that occur in meteorological data are obvious to a meteorologist, but time constraints prevent hand-checking. For this reason, it is necessary to have a 'front end' to the computer model that ensures the accuracy of input. Various approaches to automatic data quality control have been developed by several groups. MOM is a rule-based system implemented in CLIPS that utilizes 'consistency checks' and 'range checks'. The system is generic in the sense that it knows general meteorological principles, regardless of specific station characteristics. Specific constraints kept as CLIPS facts in a separate file provide system flexibility. Preliminary results show that the expert system has detected some inconsistencies not noticed by a local expert.
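    The two rule families MOM uses can be sketched outside CLIPS. This is a minimal Python analogue in the spirit of the system; the field names and numeric limits are assumed for illustration, not taken from MOM's actual fact base:

```python
# Range checks: each field must fall inside a physically plausible band
RANGES = {
    "temperature_c": (-90.0, 60.0),
    "dewpoint_c":    (-90.0, 60.0),
    "pressure_hpa":  (870.0, 1085.0),
    "wind_speed_ms": (0.0, 115.0),
}

def check_observation(obs):
    errors = []
    for field, (lo, hi) in RANGES.items():
        v = obs.get(field)
        if v is not None and not lo <= v <= hi:
            errors.append(f"{field}={v} outside [{lo}, {hi}]")
    # Consistency check: dewpoint cannot exceed air temperature
    if obs.get("dewpoint_c", -999.0) > obs.get("temperature_c", 999.0):
        errors.append("dewpoint exceeds temperature")
    return errors

bad = {"temperature_c": 12.0, "dewpoint_c": 15.0, "pressure_hpa": 2013.0}
for e in check_observation(bad):
    print("REJECT:", e)
```

In the CLIPS version these limits live as facts in a separate file, so the same generic rules can be reused with station-specific constraints, which is the flexibility the abstract describes.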

  6. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  7. Model-checking techniques based on cumulative residuals.

    PubMed

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
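    A cumulative-residual check of a covariate's functional form can be sketched as follows. The multiplier realizations below (residuals reweighted by standard normals) are a simplified stand-in for the zero-mean Gaussian process simulation the paper describes; the quadratic data-generating model is an assumed example:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200

# Data with a quadratic effect, deliberately fit with a linear model
x = rng.uniform(-2, 2, n)
y = 1.0 + 0.5 * x + 0.8 * x**2 + rng.normal(0, 1, n)
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Observed cumulative-residual process over the covariate
order = np.argsort(x)
W_obs = np.cumsum(resid[order]) / np.sqrt(n)
sup_obs = np.max(np.abs(W_obs))

# Multiplier realizations approximating the null process, and a p-value
# for the supremum statistic
sups = []
for _ in range(500):
    g = rng.standard_normal(n)
    sups.append(np.max(np.abs(np.cumsum((resid * g)[order]) / np.sqrt(n))))
p = np.mean(np.array(sups) >= sup_obs)
print(f"sup|W| = {sup_obs:.2f}, p-value = {p:.3f}  (small p => misfit)")
```

Under a correctly specified model the observed process wanders like the simulated ones; the systematic hump induced by the omitted quadratic term is what the supremum statistic is designed to flag.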

  8. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    PubMed

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
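
    The overall shape of such a search can be sketched in a few lines, with two simplifications: the sequential hypothesis test is replaced by a fixed-sample Monte-Carlo probability estimate, and the toy stochastic "model" is a single Gaussian draw. All names and the cooling schedule are illustrative assumptions, not the authors' algorithm:

```python
import math
import random

def estimate_probability(theta, simulate, holds, n=60, rng=random):
    """Fixed-sample Monte-Carlo stand-in for statistical model checking:
    estimate the probability that property `holds` is true of a run."""
    return sum(holds(simulate(theta, rng)) for _ in range(n)) / n

def anneal_parameters(simulate, holds, target_p, theta0, steps=300, seed=1):
    """Simulated-annealing search for a parameter whose estimated property
    probability is close to target_p; returns the best parameter and cost."""
    rng = random.Random(seed)
    theta = theta0
    cost = abs(estimate_probability(theta, simulate, holds, rng=rng) - target_p)
    best_theta, best_cost = theta, cost
    for k in range(steps):
        temp = 1.0 / (1 + k)                    # cooling schedule
        cand = theta + rng.gauss(0.0, 0.3)      # local proposal
        c = abs(estimate_probability(cand, simulate, holds, rng=rng) - target_p)
        if c < cost or rng.random() < math.exp(-(c - cost) / max(temp, 1e-9)):
            theta, cost = cand, c               # accept (Metropolis rule)
        if c < best_cost:
            best_theta, best_cost = cand, c
    return best_theta, best_cost
```

    For instance, with `simulate` drawing `theta + N(0, 1)` and the property "the sample is positive", the search drifts toward the parameter whose acceptance probability matches the target.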

  9. Novice to Expert Practice via Postprofessional Athletic Training Education: A Grounded Theory

    PubMed Central

    Neibert, Peter J

    2009-01-01

    Objective: To discover the theoretic constructs that confirm, disconfirm, or extend the principles and their applications appropriate for National Athletic Trainers' Association (NATA)–accredited postprofessional athletic training education programs. Design: Interviews at the 2003 NATA Annual Meeting & Clinical Symposia. Setting: Qualitative study using grounded theory procedures. Patients and Other Participants: Thirteen interviews were conducted with postprofessional graduates. Participants were purposefully selected based on theoretic sampling and availability. Data Collection and Analysis: The transcribed interviews were analyzed using open coding, axial coding, and selective coding procedures. Member checks, reflective journaling, and triangulation were used to ensure trustworthiness. Results: The participants' comments confirmed and extended the current principles of postprofessional athletic training education programs and offered additional suggestions for more effective practical applications. Conclusions: The emergence of this central category of novice to expert practice is a paramount finding. The tightly woven fabric of the 10 processes, when interlaced with one another, provides a strong tapestry supporting novice to expert practice via postprofessional athletic training education. The emergence of this theoretic position pushes postprofessional graduate athletic training education forward to the future for further investigation into the theoretic constructs of novice to expert practice. PMID:19593420

  10. Fast passage dynamic nuclear polarization on rotating solids

    NASA Astrophysics Data System (ADS)

    Mentink-Vigier, Frederic; Akbey, Ümit; Hovav, Yonatan; Vega, Shimon; Oschkinat, Hartmut; Feintuch, Akiva

    2012-11-01

    Magic Angle Spinning (MAS) Dynamic Nuclear Polarization (DNP) has proven to be a very powerful way to improve the signal-to-noise ratio of NMR experiments on solids. The experiments have in general been interpreted considering the Solid-Effect (SE) and Cross-Effect (CE) DNP mechanisms while ignoring the influence of sample spinning. In this paper, we show experimental data of MAS-DNP enhancements of 1H and 13C in proline and SH3 protein in glass-forming water/glycerol solvent containing TOTAPOL. We also introduce a theoretical model that aims at explaining how the nuclear polarization builds up in MAS-DNP experiments. By using Liouville-space-based simulations to include relaxation on two simple spin models, {electron-nucleus} and {electron-electron-nucleus}, we explain how the basic MAS-SE-DNP and MAS-CE-DNP processes work. The importance of fast energy passages and short level anti-crossings is emphasized, and the differences between static DNP and MAS-DNP are explained. During a single rotor cycle the enhancement in the {electron-electron-nucleus} system arises from MAS-CE-DNP involving at least three kinds of two-level fast passages: an electron-electron dipolar anti-crossing, a single quantum electron MW encounter and an anti-crossing at the CE condition inducing nuclear polarization increments or decrements. Numerical powder-averaged simulations were performed in order to check the influence of the experimental parameters on the enhancement efficiencies. In particular we show that the spinning frequency dependence of the theoretical MAS-CE-DNP enhancement compares favorably with the experimental 1H and 13C MAS-DNP enhancements of proline and SH3.

  11. Exploring Science Teachers' Affective States: Pedagogical Discontentment, Self-efficacy, Intentions to Reform, and Their Relationships

    NASA Astrophysics Data System (ADS)

    Kahveci, Ajda; Kahveci, Murat; Mansour, Nasser; Alarfaj, Maher Mohammed

    2017-06-01

    Teachers play a key role in moving reform-based science education practices into the classroom. Based on research that emphasizes the importance of teachers' affective states, this study aimed to explore the constructs pedagogical discontentment, science teaching self-efficacy, intentions to reform, and their correlations. Also, it aimed to provide empirical evidence in light of a previously proposed theoretical model while focusing on an entirely new context in the Middle East. Data were collected in Saudi Arabia with a total of 994 randomly selected science teachers, 656 of whom were female and 338 male. To collect the data, the Arabic versions of the Science Teachers' Pedagogical Discontentment scale, the Science Teaching Efficacy Beliefs Instrument and the Intentions to Reform Science Teaching scale were developed. To assure the validity of the instruments in a non-Western context, rigorous cross-cultural validation procedures were followed. Factor analyses were conducted for construct validation, and descriptive statistical analyses were performed, including frequency distributions and normality checks. Univariate analyses of variance were run to explore statistically significant differences between groups of teachers. Cross-tabulation and correlation analyses were conducted to explore relationships. The findings suggest an effect of teacher characteristics such as age and professional development program attendance on these affective states. The results demonstrate that teachers who attended a relatively higher number of programs had a lower level of intentions to reform, raising issues regarding the conduct and outcomes of professional development. Some of the findings concerning interrelationships among the three constructs challenge and serve to expand the previously proposed theoretical model.

  12. Personal food systems of male collegiate football players: a grounded theory investigation.

    PubMed

    Long, Doug; Perry, Christina; Unruh, Scott A; Lewis, Nancy; Stanek-Krogstrand, Kaye

    2011-01-01

    Factors that affect food choices include the physical and social environments, quality, quantity, perceived healthfulness, and convenience. The personal food choice process was defined as the procedures used by athletes for making food choices, including the weighing and balancing of activities of daily life, physical well-being, convenience, monetary resources, and social relationships. To develop a theoretical model explaining the personal food choice processes of collegiate football players. Qualitative study. National Collegiate Athletic Association Division II football program. Fifteen football players were purposefully sampled to represent various positions, years of athletic eligibility, and ethnic backgrounds. For text data collection, we used predetermined, open-ended questions. Data were analyzed using the constant comparison method. The athletes' words were used to label and describe their interactions and experiences with the food choice process. Member checks and an external audit were conducted by a qualitative methodologist and a nutrition specialist, and the findings were triangulated with the current literature to ensure trustworthiness of the text data. Time was the core category and yielded a cyclic graphic of a theoretical model for the food choice system. Planning hydration, macronutrient strategies, snacks, and healthful food choices emerged as themes. The athletes planned meals and snacks around their academic and athletic schedules while attempting to consume foods identified as healthful. Healthful foods were generally lower in fat but high in preferred macronutrients. High-protein foods were the players' primary goal; carbohydrate consumption was secondary. The athletes had established plans to maintain hydration. Professionals may use these findings to implement educational programs on food choices for football players.

  13. Generalized Symbolic Execution for Model Checking and Testing

    NASA Technical Reports Server (NTRS)

    Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)

    2003-01-01

    Modern software systems, which often are concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework, based on symbolic execution, for automated checking of such systems. We provide a two-fold generalization of traditional symbolic execution based approaches: one, we define a program instrumentation, which enables standard model checkers to perform symbolic execution; two, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings) and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.

  14. Model Checking Degrees of Belief in a System of Agents

    NASA Technical Reports Server (NTRS)

    Raimondi, Franco; Primero, Giuseppe; Rungta, Neha

    2014-01-01

    Reasoning about degrees of belief has been investigated in the past by a number of authors and has a number of practical applications in real life. In this paper we present a unified framework to model and verify degrees of belief in a system of agents. In particular, we describe an extension of the temporal-epistemic logic CTLK and we introduce a semantics based on interpreted systems for this extension. In this way, degrees of belief do not need to be provided externally, but can be derived automatically from the possible executions of the system, thereby providing a computationally grounded formalism. We leverage the semantics to (a) construct a model checking algorithm, (b) investigate its complexity, (c) provide a Java implementation of the model checking algorithm, and (d) evaluate our approach using the standard benchmark of the dining cryptographers. Finally, we provide a detailed case study: using our framework and our implementation, we assess and verify the situational awareness of the pilot of Air France 447 flying in off-nominal conditions.

  15. Test and Evaluation Report of the IVAC (Trademark) Vital Check Monitor Model 4000AEE

    DTIC Science & Technology

    1992-02-01

    AD-A248 834. USAARL Report No. 92-14, Test and Evaluation Report of the IVAC® Vital Check Monitor Model 4000AEE. ... does not constitute an official Department of the Army endorsement or approval of the use of such commercial items. Reviewed: Dennis F. Shanahan, LTC, MC. ... to 12.4 GHz) was scanned for emissions. The IVAC® Model 4000AEE was operated with both ac and battery power. 2.10.3.2 The radiated susceptibility

  16. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  17. Utilisation of preventative health check-ups in the UK: findings from individual-level repeated cross-sectional data from 1992 to 2008

    PubMed Central

    Labeit, Alexander; Peinemann, Frank; Baker, Richard

    2013-01-01

    Objectives To analyse and compare the determinants of screening uptake for different National Health Service (NHS) health check-ups in the UK. Design Individual-level analysis of repeated cross-sectional surveys with balanced panel data. Setting The UK. Participants Individuals taking part in the British Household Panel Survey (BHPS), 1992–2008. Outcome measure Uptake of NHS health check-ups for cervical cancer screening, breast cancer screening, blood pressure checks, cholesterol tests, dental screening and eyesight tests. Methods Dynamic panel data models (random effects panel probit with initial conditions). Results Having had a health check-up 1 year before, and previously in accordance with the recommended schedule, was associated with higher uptake of health check-ups. Individuals who visited a general practitioner (GP) had a significantly higher uptake in 5 of the 6 health check-ups. Uptake was highest in the recommended age group for breast and cervical cancer screening. For all health check-ups, age had a non-linear relationship. Lower self-rated health status was associated with increased uptake of blood pressure checks and cholesterol tests; smoking was associated with decreased uptake of 4 health check-ups. The effects of socioeconomic variables differed for the different health check-ups. Ethnicity did not have a significant influence on any health check-up. Permanent household income had an influence only on eyesight tests and dental screening. Conclusions Common determinants for having health check-ups are age, screening history and a GP visit. Policy interventions to increase uptake should consider the central role of the GP in promoting screening examinations and in preserving a high level of uptake. Possible economic barriers to access for prevention exist for dental screening and eyesight tests, and could be a target for policy intervention. Trial registration This observational study was not registered. PMID:24366576

  18. How can machine-learning methods assist in virtual screening for hyperuricemia? A healthcare machine-learning approach.

    PubMed

    Ichikawa, Daisuke; Saito, Toki; Ujita, Waka; Oyama, Hiroshi

    2016-12-01

    Our purpose was to develop a new machine-learning approach (a virtual health check-up) toward identification of those at high risk of hyperuricemia. Applying the system to general health check-ups is expected to reduce medical costs compared with administering an additional test. Data were collected during annual health check-ups performed in Japan between 2011 and 2013 (inclusive). We prepared training and test datasets from the health check-up data to build prediction models; these were composed of 43,524 and 17,789 persons, respectively. Gradient-boosting decision tree (GBDT), random forest (RF), and logistic regression (LR) approaches were trained using the training dataset and were then used to predict hyperuricemia in the test dataset. Undersampling was applied to build the prediction models to deal with the imbalanced class dataset. The results showed that the RF and GBDT approaches afforded the best performances in terms of sensitivity and specificity, respectively. The area under the curve (AUC) values of the models, which reflected the total discriminative ability of the classification, were 0.796 [95% confidence interval (CI): 0.766-0.825] for the GBDT, 0.784 [95% CI: 0.752-0.815] for the RF, and 0.785 [95% CI: 0.752-0.819] for the LR approaches. No significant differences were observed between pairs of each approach. Small changes occurred in the AUCs after applying undersampling to build the models. We developed a virtual health check-up that predicted the development of hyperuricemia using machine-learning methods. The GBDT, RF, and LR methods had similar predictive capability. Undersampling did not remarkably improve predictive power.
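
    Two of the concrete ingredients mentioned above, majority-class undersampling and the AUC, can be sketched in plain Python. The study itself used GBDT/RF/LR implementations; the function names below are hypothetical, and the AUC uses the rank (Mann-Whitney) formulation:

```python
import random

def undersample(X, y, seed=0):
    """Balance a binary-labelled dataset by randomly downsampling the
    majority class to the size of the minority class."""
    rng = random.Random(seed)
    pos = [i for i, t in enumerate(y) if t == 1]
    neg = [i for i, t in enumerate(y) if t == 0]
    major, minor = (neg, pos) if len(neg) > len(pos) else (pos, neg)
    keep = sorted(minor + rng.sample(major, len(minor)))
    return [X[i] for i in keep], [y[i] for i in keep]

def auc(scores, labels):
    """AUC as the probability that a random positive outscores a random
    negative (ties count half), equivalent to the ROC-curve area."""
    pairs = concordant = ties = 0
    for s1, l1 in zip(scores, labels):
        for s2, l2 in zip(scores, labels):
            if l1 == 1 and l2 == 0:
                pairs += 1
                concordant += s1 > s2
                ties += s1 == s2
    return (concordant + 0.5 * ties) / pairs
```

    A perfect ranking gives AUC 1.0 and a fully inverted one gives 0.0; the reported values near 0.79 sit between those extremes.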

  19. Reopen parameter regions in two-Higgs doublet models

    NASA Astrophysics Data System (ADS)

    Staub, Florian

    2018-01-01

    The stability of the electroweak potential is a very important constraint for models of new physics. At the moment, it is standard practice for two-Higgs doublet models (THDM) and singlet or triplet extensions of the standard model to perform these checks at tree level. However, these models are often studied in the presence of very large couplings. Therefore, it can be expected that radiative corrections to the potential are important. We study these effects using the example of the type-II THDM and find that loop corrections can revive more than 50% of the phenomenologically viable points which are ruled out by the tree-level vacuum stability checks. Similar effects are expected for other extensions of the standard model.

  20. Verification of Compartmental Epidemiological Models using Metamorphic Testing, Model Checking and Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L

    Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
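
    A toy illustration of metamorphic testing applied to a compartmental model: a forward-Euler SIR integrator is checked against two metamorphic relations, conservation of the total population and linear scaling of all compartments when the initial conditions are scaled. This sketches the idea only; the paper's workflow and property set are richer:

```python
def sir(s0, i0, r0, beta, gamma, steps=1000, dt=0.01):
    """Forward-Euler integration of the classic SIR compartmental model
    with density-dependent transmission beta * S * I / N."""
    s, i, r = float(s0), float(i0), float(r0)
    n = s + i + r
    for _ in range(steps):
        new_inf = beta * s * i / n * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

def metamorphic_checks(beta=0.3, gamma=0.1):
    """Relation 1: S + I + R stays equal to the initial population.
    Relation 2: scaling the initial compartments by 10 scales the output."""
    s, i, r = sir(990, 10, 0, beta, gamma)
    conserved = abs((s + i + r) - 1000) < 1e-6
    s2, i2, r2 = sir(9900, 100, 0, beta, gamma)
    scaled = all(abs(a * 10 - b) < 1e-4 for a, b in
                 [(s, s2), (i, i2), (r, r2)])
    return conserved, scaled
```

    An implementation bug that, say, forgets the division by N breaks the scaling relation even though each single run still looks plausible, which is exactly the kind of error metamorphic testing exposes.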

  1. A Two-Radius Circular Array Method: Extracting Independent Information on Phase Velocities of Love Waves From Microtremor Records From a Simple Seismic Array

    NASA Astrophysics Data System (ADS)

    Tada, T.; Cho, I.; Shinozaki, Y.

    2005-12-01

    We have invented a Two-Radius (TR) circular array method of microtremor exploration, an algorithm that enables the estimation of phase velocities of Love waves by analyzing horizontal-component records of microtremors that are obtained with an array of seismic sensors placed around circumferences of two different radii. The data recording may be done either simultaneously around the two circles or in two separate sessions with sensors distributed around each circle. Both Rayleigh and Love waves are present in the horizontal components of microtremors, but in the data processing of our TR method, all information on the Rayleigh waves ends up cancelled out, and information on the Love waves alone is left to be analyzed. Also, unlike the popularly used frequency-wavenumber spectral (F-K) method, our TR method does not resolve individual plane-wave components arriving from different directions and analyze their "vector" phase velocities, but instead directly evaluates their "scalar" phase velocities --- phase velocities that contain no information on the arrival direction of waves --- through a mathematical procedure which involves azimuthal averaging. The latter feature leads us to expect that, with our TR method, it is possible to conduct phase velocity analysis with smaller numbers of sensors, with higher stability, and up to longer-wavelength ranges than with the F-K method. With a view to investigating the capabilities and limitations of our TR method in practical application to real data, we have deployed circular seismic arrays of different sizes at a test site in Japan where the underground structure is well documented through geophysical exploration. 
Ten seismic sensors were placed equidistantly around two circumferences, five around each circle, with varying combinations of radii ranging from several meters to several tens of meters, and simultaneous records of microtremors around circles of two different radii were analyzed with our TR method to produce estimates for the phase velocities of Love waves. The estimates were then checked against "model" phase velocities that are derived from theoretical calculations. We have also conducted a check of the estimated spectral ratios against the "model" spectral ratios, where we mean by "spectral ratio" an intermediary quantity that is calculated from observed records prior to the estimation of the phase velocity in the data analysis procedure of our TR method. In most cases, the estimated phase velocities coincided well with the model phase velocities within a wavelength range extending roughly from 3r to 6r (r: array radius). It was found that, outside the upper and lower resolution limits of the TR method, the discrepancy between the estimated and model phase velocities, as well as the discrepancy between the estimated and model spectral ratios, were accounted for satisfactorily by theoretical consideration of three factors: the presence of higher surface-wave modes, directional aliasing effects related to the finite number of sensors in the seismic array, and the presence of incoherent noise.

  2. 10 CFR 35.2642 - Records of periodic spot-checks for teletherapy units.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....2642 Section 35.2642 Energy NUCLEAR REGULATORY COMMISSION MEDICAL USE OF BYPRODUCT MATERIAL Records... must include— (1) The date of the spot-check; (2) The manufacturer's name, model number, and serial... device; (6) The determined accuracy of each distance measuring and localization device; (7) The...

  3. 10 CFR 35.2642 - Records of periodic spot-checks for teletherapy units.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....2642 Section 35.2642 Energy NUCLEAR REGULATORY COMMISSION MEDICAL USE OF BYPRODUCT MATERIAL Records... must include— (1) The date of the spot-check; (2) The manufacturer's name, model number, and serial... device; (6) The determined accuracy of each distance measuring and localization device; (7) The...

  4. 12 CFR Appendix A to Part 205 - Model Disclosure Clauses and Forms

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... your checking account using information from your check to: (i) Pay for purchases. (ii) Pay bills. (3... disclose information to third parties about your account or the transfers you make: (i) Where it is...) Disclosure by government agencies of information about obtaining account balances and account histories...

  5. Using computer models to design gully erosion control structures for humid northern Ethiopia

    USDA-ARS?s Scientific Manuscript database

    Classic gully erosion control measures such as check dams have been unsuccessful in halting gully formation and growth in the humid northern Ethiopian highlands. Gullies are typically formed in vertisols and flow often bypasses the check dams as elevated groundwater tables make gully banks unstable....

  6. Posterior Predictive Checks for Conditional Independence between Response Time and Accuracy

    ERIC Educational Resources Information Center

    Bolsinova, Maria; Tijmstra, Jesper

    2016-01-01

    Conditional independence (CI) between response time and response accuracy is a fundamental assumption of many joint models for time and accuracy used in educational measurement. In this study, posterior predictive checks (PPCs) are proposed for testing this assumption. These PPCs are based on three discrepancy measures reflecting different…

  7. Building Program Verifiers from Compilers and Theorem Provers

    DTIC Science & Technology

    2015-05-14

    Checking with SMT. UFO: an LLVM-based front-end (partially reused in SeaHorn) that combines Abstract Interpretation with Interpolation-Based Model Checking. ... Counter-examples are long; it is hard to determine (from main) what is relevant. Gurfinkel, 2015.

  8. Exploration of Effective Persuasive Strategies Used in Resisting Product Advertising: A Case Study of Adult Health Check-Ups.

    PubMed

    Tien, Han-Kuang; Chung, Wen

    2018-05-10

    This research addressed adults' health check-ups through the lens of Role Transportation Theory. This theory is applied to narrative advertising that lures adults into seeking health check-ups by causing audiences to empathize with the advertisement's character. This study explored the persuasive mechanism behind narrative advertising and reinforced the Protection Motivation Theory model. We added two key perturbation variables: optimistic bias and truth avoidance. To test the hypotheses, we performed two experiments. In Experiment 1, we recruited 77 respondents online for testing. We used analyses of variance to verify the effectiveness of narrative and informative advertising. Then, in Experiment 2, we recruited 228 respondents to take part in offline physical experiments and conducted a path analysis through structural equation modelling. The findings showed that narrative advertising positively impacted participants' disease prevention intentions. The use of Role Transportation Theory in advertising enables the audience to be emotionally connected with the character, which enhances persuasiveness. In Experiment 2, we found that the degree of role transference can interfere with optimistic bias, improve perceived health risk, and promote behavioral intentions for health check-ups. Furthermore, truth avoidance can interfere with perceived health risks, which, in turn, reduces behavioral intentions for health check-ups.

  9. Dynamic correlation effects in fully differential cross sections for 75-keV proton-impact ionization of helium

    NASA Astrophysics Data System (ADS)

    Niu, Xiaojie; Sun, Shiyan; Wang, Fujun; Jia, Xiangfu

    2017-08-01

    The effect of final-state dynamic correlation is investigated for helium single ionization by 75-keV proton impact analyzing fully differential cross sections (FDCS). The final state is represented by a continuum correlated wave (CCW-PT) function which accounts for the interaction between the projectile and the residual target ion (PT interaction). This continuum correlated wave function partially includes the correlation of electron-projectile and electron-target relative motion as coupling terms of the wave equation. The transition matrix is evaluated using the CCW-PT function and the Born initial state. The analytical expression of the transition matrix has been obtained. We have shown that this series is strongly convergent and analyzed the contribution of their different terms to the FDCS within the perturbation method. Illustrative computations are performed in the scattering plane and in the perpendicular plane. Both the correlation effects and the PT interaction are checked by the present calculations. Our results are compared with absolute experimental data as well as other theoretical models. We have shown that the dynamic correlation plays an important role in the single ionization of atoms by proton impact at intermediate projectile energies, especially at large transverse momentum transfer. While overall agreement between theory and the experimental data is encouraging, detailed agreement is lacking. The need for more theoretical and experimental work is emphasized.

  10. Predictability in Cellular Automata

    PubMed Central

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one needs also a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on the asynchronous cellular automata have shown that such a feature really exists. It is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighborhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighbourhood three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case. PMID:25271778
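
    The two ingredients of the exponential formula are easy to make concrete: count zero-one borders in a cyclic binary configuration, and assign each configuration a stationary weight proportional to lambda raised to that count. The function names and the small-lattice enumeration are illustrative, not the paper's experiment:

```python
from itertools import product

def borders(config):
    """Number of zero-one borders in a cyclic binary configuration."""
    n = len(config)
    return sum(config[k] != config[(k + 1) % n] for k in range(n))

def stationary_weights(n, lam):
    """Candidate stationary distribution on all length-n configurations:
    probability proportional to lam ** borders(c), i.e. the exponential
    formula in the number of zero-one borders."""
    configs = list(product((0, 1), repeat=n))
    raw = [lam ** borders(c) for c in configs]
    z = sum(raw)
    return {c: w / z for c, w in zip(configs, raw)}
```

    For lam < 1 the homogeneous configurations (all zeros, all ones) carry the largest stationary probability, matching the intuition that borders are penalized exponentially.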

  11. Diffusion lengths in irradiated N/P InP-on-Si solar cells

    NASA Technical Reports Server (NTRS)

    Wojtczuk, Steven; Colerico, Claudia; Summers, Geoffrey P.; Walters, Robert J.; Burke, Edward A.

    1995-01-01

    Indium phosphide (InP) solar cells are being made on silicon (Si) wafers (InP/Si) to take advantage of both the radiation-hardness properties of the InP solar cell and the light weight and low cost of Si wafers compared to InP or germanium (Ge) wafers. The InP/Si cell application is for long duration and/or high radiation orbit space missions. InP/Si cells have higher absolute efficiency after a high radiation dose than gallium arsenide (GaAs) or silicon (Si) solar cells. In this work, base electron diffusion lengths in the N/P cell are extracted from measured AM0 short-circuit photocurrent at various irradiation levels out to an equivalent 1 MeV fluence of 10(exp 17) 1 MeV electrons/sq cm for a 1 sq cm 12% BOL InP/Si cell. These values are then checked for consistency by comparing measured Voc data with a theoretical Voc model that includes a dark current term that depends on the extracted diffusion lengths.

  12. Femtogram Mass Biosensor Using Self-Sensing Cantilever for Allergy Check

    NASA Astrophysics Data System (ADS)

    Sone, Hayato; Ikeuchi, Ayumi; Izumi, Takashi; Okano, Haruki; Hosaka, Sumio

    2006-03-01

    A self-sensing mass biosensor with femtogram mass sensitivity has been developed using a piezoresistive microcantilever. The mass change due to antigen and antibody adsorption on the cantilever in water was detected by the resonance frequency shift of the cantilever. We constructed a prototype harmonic vibration sensor using a commercial piezoresistive cantilever, Wheatstone bridge circuits, a positive feedback controller, an exciting piezoactuator and a phase-locked loop (PLL) demodulator. As experimental results, a mass sensitivity of about 190 fg/Hz and a mass resolution of about 500 fg were obtained in water. The mass sensitivity is 100 times higher than that of a quartz crystal oscillation method. We demonstrated that the sensor can detect the reaction between an antibody of immunoglobulin (IgG) and an antigen of egg albumen (OVA). We confirmed that the binding ratio between the antibody and the antigen was about 1 : 2. The detection method is applicable to allergy checking because the measured reaction ratio occurring on the cantilever concurs with the theoretical value.

  13. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, including both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation combines cooperation gain and channel coding gain well, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
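
    The girth-4 cancellation described above can be verified directly on the exponent matrix. A sketch of the well-known 2×2 submatrix condition (not the record's joint-design procedure): a length-4 cycle exists exactly when the alternating sum of exponents around some 2×2 submatrix vanishes modulo the circulant size P:

```python
from itertools import combinations

def has_girth4_cycle(E, P):
    """True if the QC-LDPC code with exponent matrix E (entries in 0..P-1,
    -1 marking an all-zero block) contains a cycle of length 4."""
    rows, cols = len(E), len(E[0])
    for r1, r2 in combinations(range(rows), 2):
        for c1, c2 in combinations(range(cols), 2):
            a, b, c, d = E[r1][c1], E[r1][c2], E[r2][c1], E[r2][c2]
            if -1 in (a, b, c, d):
                continue  # a zero block breaks any cycle through this submatrix
            if (a - b + d - c) % P == 0:
                return True
    return False
```

    For example, the exponent matrix [[0, 0], [0, 0]] has girth-4 cycles for any P, while [[0, 0], [0, 1]] with P = 7 does not.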

  14. Derivation and Applicability of Asymptotic Results for Multiple Subtests Person-Fit Statistics

    PubMed Central

    Albers, Casper J.; Meijer, Rob R.; Tendeiro, Jorge N.

    2016-01-01

    In high-stakes testing, it is important to check the validity of individual test scores. Although a test may, in general, result in valid test scores for most test takers, for some test takers, test scores may not provide a good description of a test taker's proficiency level. Person-fit statistics have been proposed to check the validity of individual test scores. In this study, the theoretical asymptotic sampling distribution of two person-fit statistics that can be used for tests that consist of multiple subtests is first discussed. Second, a simulation study was conducted to investigate the applicability of this asymptotic theory for tests of finite length, in which the correlation between subtests and the number of items in the subtests were varied. The authors showed that these distributions provide reasonable approximations, even for tests consisting of subtests of only 10 items each. These results have practical value because researchers do not have to rely on extensive simulation studies to simulate sampling distributions. PMID:29881053

  15. Nonlinear growth dynamics and the origin of fluctuating asymmetry

    USGS Publications Warehouse

    Emlen, J.M.; Freeman, D.C.; Graham, J.H.

    1993-01-01

    The nonlinear, complex nature of biosynthesis magnifies the impacts of small, random perturbations on organism growth, leading to distortions in adaptive allometries and, in particular, to fluctuating asymmetry. These distortions can be partly checked by cell-cell and inter-body-part feedback during growth and development, though the latter mechanism may also lead to complex patterns in right-left asymmetry. Stress can be expected to increase the degree to which random growth perturbations are magnified and may also disrupt the check mechanisms, thus exaggerating fluctuating asymmetry. The processes described not only provide one explanation for the existence of fluctuating asymmetry and its augmentation under stress, but also suggest additional effects of stress. Specifically, stress is predicted to lead to decreased fractal dimension of bone sutures and branching structures in animals, and to increased dimension of growth trace patterns such as those found in mollusc shells and fish otoliths and scales. A basic yet broad primer on fractals and chaos is provided as background for the theoretical development in this manuscript.

  16. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In this paper, a new approach to the formal verification of control process specifications expressed by means of UML state machines (version 2.x) is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.

  17. Introduction of Virtualization Technology to Multi-Process Model Checking

    NASA Technical Reports Server (NTRS)

    Leungwattanakit, Watcharin; Artho, Cyrille; Hagiya, Masami; Tanabe, Yoshinori; Yamamoto, Mitsuharu

    2009-01-01

    Model checkers find failures in software by exploring every possible execution schedule. Java PathFinder (JPF), a Java model checker, has been extended recently to cover networked applications by caching data transferred in a communication channel. A target process is executed by JPF, whereas its peer process runs on a regular virtual machine outside. However, non-deterministic target programs may produce different output data in each schedule, causing the cache to restart the peer process to handle the different set of data. Virtualization tools could help us restore previous states of peers, eliminating peer restart. This paper proposes the application of virtualization technology to networked model checking, concentrating on JPF.

  18. Body checking is associated with weight- and body-related shame and weight- and body-related guilt among men and women.

    PubMed

    Solomon-Krakus, Shauna; Sabiston, Catherine M

    2017-12-01

    This study examined whether body checking was a correlate of weight- and body-related shame and guilt for men and women. Participants were 537 adults (386 women) between the ages of 17 and 74 (M age = 28.29, SD = 14.63). Preliminary analyses showed women reported significantly more body checking (p < .001), weight- and body-related shame (p < .001), and weight- and body-related guilt (p < .001) than men. In sex-stratified hierarchical linear regression models, body checking was significantly and positively associated with weight- and body-related shame (R² = .29 and .43, p < .001) and weight- and body-related guilt (R² = .34 and .45, p < .001) for men and women, respectively. Based on these findings, body checking is associated with negative weight- and body-related self-conscious emotions. Intervention and prevention efforts aimed at reducing negative weight- and body-related self-conscious emotions should consider focusing on body checking for adult men and women. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Solution of magnetic field and eddy current problem induced by rotating magnetic poles (abstract)

    NASA Astrophysics Data System (ADS)

    Liu, Z. J.; Low, T. S.

    1996-04-01

    The magnetic field and eddy current problems induced by rotating permanent magnet poles occur in electromagnetic dampers, magnetic couplings, and many other devices. Whereas numerical techniques, for example finite element methods, can be exploited to study various features of these problems, such as heat generation and drag torque development, the analytical solution is always of interest to designers since it helps them gain insight into the interdependence of the parameters involved and provides an efficient design tool. Some previous work showed that the solution of the eddy current problem due to linearly moving magnet poles can give a satisfactory approximation for the eddy current problem due to rotating fields. However, in many practical cases, especially when the number of magnet poles is small, there is a significant effect of flux focusing due to the geometry. The above approximation can therefore lead to marked errors in the theoretical predictions of the device performance. Bernot et al. recently described an analytical solution in a polar coordinate system where the radial field is excited by a time-varying source. A discussion of an analytical solution of the magnetic field and eddy current problems induced by moving magnet poles in radial field machines is given in this article. The theoretical predictions obtained from this method are compared with the results obtained from finite element calculations. The validity of the method is also checked by comparing the theoretical predictions with measurements from a test machine. It is shown that the introduced solution leads to a significant improvement in the air gap field prediction compared with the analytical solution that models the eddy current problems induced by linearly moving magnet poles.

  20. Study on the Effect of Water Injection Momentum on the Cooling Effect of Rocket Engine Exhaust Plume

    NASA Astrophysics Data System (ADS)

    Yang, Kan; Qiang, Yanhui; Zhong, Chenghang; Yu, Shaozhen

    2017-10-01

    To study the impact of water injection momentum on the flow field of the rocket engine exhaust plume, a numerical model of gas-liquid two-phase flow, coupling the high-temperature, high-speed gas flow with low-temperature liquid water, is established. The accuracy and reliability of the numerical model are verified by experiments. Based on the numerical model, the relationship between the flow rate and the cooling effect is analyzed by changing the water injection momentum of the water spray pipes, and an effective mathematical expression is obtained. Furthermore, by changing the number of water spray pipes and using small-flow water injection, the cooling effect is analyzed to check the application range of the mathematical expressions. The results show that the impact and erosion of the gas flow field can be reduced greatly by water injection, and that the gas flow field divides into two parts: a slow cooling area and a fast cooling area. In the fast cooling area, the influence of the water flow momentum and nozzle quantity on the cooling effect can be expressed by mathematical functions without causing bifurcation flow for the mainstream gas. The conclusion provides a theoretical reference for engineering application.

  1. Practical Results from the Application of Model Checking and Test Generation from UML/SysML Models of On-Board Space Applications

    NASA Astrophysics Data System (ADS)

    Faria, J. M.; Mahomad, S.; Silva, N.

    2009-05-01

    The deployment of complex safety-critical applications requires rigorous techniques and powerful tools for both the development and V&V stages. Model-based technologies are increasingly being used to develop safety-critical software, and arguably, turning to them can bring significant benefits to such processes, though along with new challenges. This paper presents the results of a research project in which we extended current V&V methodologies to apply to UML/SysML models, aiming to answer the demands related to validation. Two quite different but complementary approaches were investigated: (i) model checking and (ii) the extraction of robustness test cases from the same models. These two approaches don't overlap, and when combined they provide a wider-reaching model/design validation ability than either one alone, thus offering improved safety assurance. Results are very encouraging, even though they either fell short of the desired outcome, as for model checking, or appear not yet fully matured, as for robustness test case extraction. In the case of model checking, we verified that the automatic model validation process can become fully operational, and even expanded in scope, once tool vendors (inevitably) help to improve the XMI standard interoperability situation. The robustness test case extraction methodology produced interesting early results but needs further systematisation and consolidation in order to produce results in a more predictable fashion and to reduce reliance on experts' heuristics. Finally, further improvement and innovation research directions were immediately apparent for both investigated approaches: circumventing current limitations in XMI interoperability on one hand, and on the other bringing test case specification onto the same graphical level as the models themselves and then automating the generation of executable test cases from standard UML notation.

  2. An experimental manipulation of responsibility in children: a test of the inflated responsibility model of obsessive-compulsive disorder.

    PubMed

    Reeves, J; Reynolds, S; Coker, S; Wilson, C

    2010-09-01

    The objective of this study was to investigate whether Salkovskis' (1985) inflated responsibility model of obsessive-compulsive disorder (OCD) applies to children. In an experimental design, 81 children aged 9-12 years were randomly allocated to three conditions: an inflated responsibility group, a moderate responsibility group, and a reduced responsibility group. In all groups children were asked to sort sweets according to whether or not they contained nuts. At baseline the groups did not differ on children's self-reported anxiety, depression, obsessive-compulsive symptoms or inflated responsibility beliefs. The experimental manipulation successfully changed children's perceptions of responsibility. During the sorting task, time taken to complete the task, checking behaviours, hesitations, and anxiety were recorded. There was a significant effect of responsibility level on the behavioural variables of time taken, hesitations and checks; as perceived responsibility increased, children took longer to complete the task and checked and hesitated more often. There was no between-group difference in children's self-reported state anxiety. The results offer preliminary support for the link between inflated responsibility and increased checking behaviours in children and add to the small but growing literature suggesting that cognitive models of OCD may apply to children. (c) 2010 Elsevier Ltd. All rights reserved.

  3. 75 FR 52482 - Airworthiness Directives; PILATUS Aircraft Ltd. Model PC-7 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-26

    ..., check the airplane maintenance records to determine if the left and/or right aileron outboard bearing... an entry is found during the airplane maintenance records check required in paragraph (f)(1) of this...-0849; Directorate Identifier 2010-CE-043-AD] RIN 2120-AA64 Airworthiness Directives; PILATUS Aircraft...

  4. 77 FR 50644 - Airworthiness Directives; Cessna Airplane Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... airplanes that have P/N 1134104-1 or 1134104-5 A/C compressor motor installed; an aircraft logbook check for... following: (1) Inspect the number of hours on the A/C compressor hour meter; and (2) Check the aircraft.... Do the replacement following Cessna Aircraft Company Model 525 Maintenance Manual, Revision 23, dated...

  5. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
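
    Tracking a regression coefficient in a control chart, as described above, is commonly done with an individuals (X) chart whose limits come from the average moving range. A minimal sketch (the 2.66 factor is the standard 3/d2 constant for moving ranges of size 2; the data are placeholders, not facility measurements):

```python
def control_limits(values):
    """Individuals-chart center line and control limits from the average moving range."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, len(values))]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

def out_of_control(values):
    """Indices of tracked coefficient values falling outside the control limits."""
    lcl, _, ucl = control_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]
```

    A coefficient series such as [10, 10, 10, 10, 10, 20] would flag the final point as an out-of-control signal.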

  6. CCM-C, Collins checks the middeck experiment

    NASA Image and Video Library

    1999-07-24

    S93-E-5016 (23 July 1999) --- Astronaut Eileen M. Collins, mission commander, checks on an experiment on Columbia's middeck during Flight Day 1 activity. The experiment is called the Cell Culture Model, Configuration C. Its objectives are to validate cell culture models for muscle, bone and endothelial cell biochemical and functional loss induced by microgravity stress; to evaluate cytoskeleton, metabolism, membrane integrity and protease activity in target cells; and to test tissue loss pharmaceuticals for efficacy. The photo was recorded with an electronic still camera (ESC).

  7. Class Model Development Using Business Rules

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Gudas, Saulius

    New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (the enterprise model) is insufficient (Gudas et al. 2004). Involving business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.

  8. Promoting prosocial pupil behaviour: 2-secondary school intervention and pupil effects.

    PubMed

    Mooij, T

    1999-12-01

    In an earlier article (Mooij, 1999c), a theoretical multilevel model to promote prosocial pupil behaviour by stimulating specific educational conditions was developed. The aim here was to carry out school interventions to check empirically whether pupil-level effects occur because of educational changes at the classroom and school level. Seven secondary schools with relatively high degrees of pupil aggression were selected. Four schools took part as intervention schools; three schools served as control schools. In 1995 (pretest) and 1997 (post-test), pupils and form teachers of the first and third school years participated by completing questionnaires. Within the pupil cohorts, a longitudinal group of 352 pupils was included. Pretest questionnaires in 1995 were followed by intervention in the intervention schools. Teachers collaborated with staff and researchers to increase pupils' participation and responsibility in specifying and controlling behavioural and didactic rules, related to didactic differentiation during lessons. The validity of the intervention implementation was checked using qualitative information and quantitative data from both pre- and post-test. Longitudinal intervention effects were tested by applying two-level multiple regression analyses. After controlling for pretest and covariables in school year 1, school intervention effects were found in school year 3 in the prediction of being a perpetrator of aggressive behaviour at school, aggressive behaviour outside school, and criminal behaviour. Some small effects were found with respect to victim behaviour. Social-pedagogical and didactic class and school variables, but also home variables and support by peers without problematic behaviour, could be integrated more systematically to promote the prosocial development of a pupil's behaviour from the beginning in school.

  9. Health-Related Quality of Life and Its Correlates among Chinese Migrants in Small- and Medium-Sized Enterprises in Two Cities of Guangdong

    PubMed Central

    Lu, Liming; Zou, Guanyang; Zeng, Zhi; Han, Lu; Guo, Yan; Ling, Li

    2014-01-01

    Objectives To explore the relationship between health-related quality of life (HRQOL) status and associated factors among rural-to-urban migrants in China. Methods A cross-sectional survey was conducted with 856 rural-to-urban migrants working at small- and medium-size enterprises (SMEs) in Shenzhen and Zhongshan City in 2012. Andersen's behavioral model was used as a theoretical framework to examine the relationships among factors affecting HRQOL. Analysis was performed using structural equation modeling (SEM). Results Workers with statutory working hours, higher wages and less migrant experience had higher HRQOL scores. Need (contracting a disease in the past two weeks and perception of needing health service) had the greatest total effect on HRQOL (β = −0.78), followed by enabling (labor contract, insurance purchase, income, physical examination during work and training) (β = 0.40), predisposing (age, family separation, education) (β = 0.22) and health practices and use of health service (physical exercise weekly, health check-up and use of protective equipment) (β = −0.20). Conclusions Priority should be given to satisfying the needs of migrant workers and improving the enabling resources. PMID:24392084

  10. [Proposal and preliminary validation of a check-list for the assessment of occupational exposure to repetitive movements of the upper limbs].

    PubMed

    Colombini, D; Occhipinti, E; Cairoli, S; Baracco, A

    2000-01-01

    Over the last few years the Authors developed and implemented a specific check-list for a "rapid" assessment of occupational exposure to repetitive movements and exertion of the upper limbs, after verifying the lack of such a tool that was also coherent with the latest data in the specialized literature. The check-list model and the relevant application procedures are presented and discussed. The check-list was applied by trained factory technicians in 46 different working tasks where the OCRA method previously proposed by the Authors was also applied by independent observers. Since 46 pairs of observation data were available (OCRA index and check-list score), it was possible to verify, via parametric and nonparametric statistical tests, the level of association between the two variables and to find the best simple regression function (exponential in this case) of the OCRA index from the check-list score. By means of this function, which was highly significant (R2 = 0.98, p < 0.0000), the values of the check-list score that best corresponded to the critical values (for exposure assessment) of the OCRA index were identified. The following correspondence values between OCRA index and check-list were then established with a view to classifying exposure levels. The check-list "critical" scores were established considering the need for obtaining, in borderline cases, a potential overestimation of the exposure level. On the basis of practical application experience and the preliminary validation results, recommendations are made and the caution needed in the use of the check-list is suggested.
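
    Fitting an exponential regression function of one score on another, as done above for the OCRA index against the check-list score, can be sketched with log-linear least squares. The data in the test below are synthetic placeholders, not the study's 46 observation pairs:

```python
import math

def fit_exponential(x, y):
    """Least-squares fit of y = a * exp(b * x), obtained by linearly
    regressing ln(y) on x (requires all y > 0)."""
    n = len(x)
    ly = [math.log(v) for v in y]
    mean_x, mean_ly = sum(x) / n, sum(ly) / n
    b = (sum((xi - mean_x) * (yi - mean_ly) for xi, yi in zip(x, ly))
         / sum((xi - mean_x) ** 2 for xi in x))
    a = math.exp(mean_ly - b * mean_x)
    return a, b
```

    Inverting such a fitted function is what allows fixed critical values of the predicted index to be mapped back to cut-off scores on the check-list.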

  11. Socioeconomic differences in health check-ups and medically certified sickness absence: a 10-year follow-up among middle-aged municipal employees in Finland.

    PubMed

    Piha, Kustaa; Sumanen, Hilla; Lahelma, Eero; Rahkonen, Ossi

    2017-04-01

    There is contradictory evidence on the association between health check-ups and future morbidity. Among the general population, those with a high socioeconomic position participate more often in health check-ups. The main aims of this study were to analyse whether attendance at health check-ups is socioeconomically patterned and affects sickness absence over a 10-year follow-up. This register-based follow-up study included municipal employees of the City of Helsinki. 13 037 employees were invited to an age-based health check-up during 2000-2002, with a 62% attendance rate. Education, occupational class and individual income were used to measure socioeconomic position. Medically certified sickness absence of 4 days or more was measured and controlled for at baseline and used as the outcome over follow-up. The mean follow-up time was 7.5 years. Poisson regression was used. Men and employees with a lower socioeconomic position participated more actively in health check-ups. Among women, non-attendance at the health check-up predicted higher sickness absence during follow-up (relative risk = 1.26, 95% CI 1.17 to 1.37) in the fully adjusted model. Health check-ups were not effective in reducing socioeconomic differences in sickness absence. Age-based health check-ups reduced subsequent sickness absence and should be promoted. Attendance at health check-ups should be as high as possible. Contextual factors need to be taken into account when applying the results in interventions in other settings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  12. Pseudospin symmetry for modified Rosen-Morse potential including a Pekeris-type approximation to the pseudo-centrifugal term

    NASA Astrophysics Data System (ADS)

    Wei, Gao-Feng; Dong, Shi-Hai

    2010-11-01

    By applying a Pekeris-type approximation to the pseudo-centrifugal term, we study the pseudospin symmetry of a Dirac nucleon subjected to scalar and vector modified Rosen-Morse (MRM) potentials. A complicated quartic energy equation and spinor wave functions with arbitrary spin-orbit coupling quantum number k are presented. The pseudospin degeneracy is checked numerically. Pseudospin symmetry is discussed theoretically and numerically in the limit case α → 0. It is found that the relativistic MRM potential cannot trap a Dirac nucleon in this limit.

  13. Polarizability of Helium, Neon, and Argon: New Perspectives for Gas Metrology

    NASA Astrophysics Data System (ADS)

    Gaiser, Christof; Fellmuth, Bernd

    2018-03-01

    With dielectric-constant gas thermometry, the molar polarizability of helium, neon, and argon has been determined with relative standard uncertainties of about 2 parts per million. A series of isotherms measured with the three noble gases and two different experimental setups led to this unprecedented level of uncertainty. These data are crucial for scientists in the field of gas metrology, working on pressure and temperature standards. Furthermore, with the new benchmark values for neon and argon, theoretical calculations, today about 3 orders of magnitude larger in uncertainty, can be checked and improved.

  14. Polarizability of Helium, Neon, and Argon: New Perspectives for Gas Metrology.

    PubMed

    Gaiser, Christof; Fellmuth, Bernd

    2018-03-23

    With dielectric-constant gas thermometry, the molar polarizability of helium, neon, and argon has been determined with relative standard uncertainties of about 2 parts per million. A series of isotherms measured with the three noble gases and two different experimental setups led to this unprecedented level of uncertainty. These data are crucial for scientists in the field of gas metrology, working on pressure and temperature standards. Furthermore, with the new benchmark values for neon and argon, theoretical calculations, today about 3 orders of magnitude larger in uncertainty, can be checked and improved.

  15. Checking the statistical theory of liquids by ultraacoustic measurements

    NASA Technical Reports Server (NTRS)

    Dima, V. N.

    1974-01-01

    The manner of theoretically obtaining radial distribution functions g(r) for n-hexane as a function of temperature is described. With the aid of the function g(r), the coefficient of dynamic viscosity and the coefficient of volumetric viscosity for temperatures ranging from 213 K to 273 K were calculated. With the aid of the two coefficients of viscosity, the coefficient of absorption of ultrasound in n-hexane, referred to the square of the frequency, was determined. The same values were measured experimentally. Comparison of theory with experiment resulted in satisfactory agreement.

  16. Experimental realization of quantum cheque using a five-qubit quantum computer

    NASA Astrophysics Data System (ADS)

    Behera, Bikash K.; Banerjee, Anindita; Panigrahi, Prasanta K.

    2017-12-01

    Quantum cheques could be a forgery-free way to make transactions in a quantum networked banking system with perfect security against any no-signalling adversary. Here, we demonstrate the implementation of a quantum cheque, proposed by Moulick and Panigrahi (Quantum Inf Process 15:2475-2486, 2016), using the five-qubit IBM quantum computer. Appropriate single-qubit, CNOT and Fredkin gates are used in an optimized configuration. The accuracy of the implementation is checked and verified through quantum state tomography by comparing results from the theoretical and experimental density matrices.
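
    Comparing theoretical and experimental density matrices, as in the tomography check above, ultimately reduces to a closeness measure such as the Uhlmann fidelity. A minimal sketch, using an eigendecomposition-based matrix square root and assuming valid (Hermitian, positive semidefinite) density matrices:

```python
import numpy as np

def psd_sqrt(m):
    """Matrix square root of a Hermitian positive semidefinite matrix."""
    vals, vecs = np.linalg.eigh(m)
    vals = np.clip(vals, 0.0, None)  # clip tiny negative eigenvalues from rounding
    return (vecs * np.sqrt(vals)) @ vecs.conj().T

def fidelity(rho, sigma):
    """Uhlmann fidelity F = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    s = psd_sqrt(rho)
    return float(np.real(np.trace(psd_sqrt(s @ sigma @ s))) ** 2)
```

    For instance, a pure state compared against itself gives fidelity 1, while compared against the maximally mixed single-qubit state it gives 0.5.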

  17. Quasi-cylindrical theory of wing-body interference at supersonic speeds and comparison with experiment

    NASA Technical Reports Server (NTRS)

    Nielsen, Jack N

    1955-01-01

    A theoretical method is presented for calculating the flow field about wing-body combinations employing bodies deviating only slightly in shape from a circular cylinder. The method is applied to the calculation of the pressure field acting between a circular cylindrical body and a rectangular wing. The case of zero body angle of attack and variable wing incidence is considered as well as the case of zero wing incidence and variable body angle of attack. An experiment was performed especially for the purpose of checking the calculative examples.

  18. Litho hotspots fixing using model based algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan

    2017-04-01

    As technology advances, IC designs are getting more sophisticated, thus it becomes more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots on layouts while using the same engine currently used to detect the hotspots. The fix is achieved by applying minimum movement of edges causing the hotspot, with consideration of DRC constraints. The fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options that violate DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to ensure the change successfully fixed the hotspot, which saves time and simplifies the designer's workflow. This methodology is demonstrated on industrial designs, where the fixing rate of single and dual layer hotspots is reported.

  19. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source code-level. In particular we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  20. Social-cognitive determinants of the tick check: a cross-sectional study on self-protective behavior in combatting Lyme disease.

    PubMed

    van der Heijden, Amy; Mulder, Bob C; Poortvliet, P Marijn; van Vliet, Arnold J H

    2017-11-25

    Performing a tick check after visiting nature is considered the most important preventive measure to avoid contracting Lyme disease. Checking the body for ticks after visiting nature is the only measure that can fully establish whether one has been bitten by a tick, and it provides the opportunity to remove the tick as soon as possible, thereby greatly reducing the chance of contracting Lyme disease. However, compliance with performing the tick check is low. In addition, most previous studies on determinants of preventive measures to avoid Lyme disease lack a clear definition and/or operationalization of the term "preventive measures". Those that do distinguish multiple behaviors, including the tick check, fail to describe the systematic steps that should be followed in order to perform the tick check effectively. Hence, the purpose of this study was to identify determinants of systematically performing the tick check, based on social cognitive theory. A cross-sectional self-administered survey questionnaire was filled out online by 508 respondents (M age = 51.7, SD = 16.0; 50.2% men; 86.4% daily or weekly nature visitors). Bivariate correlations and multivariate regression analyses were conducted to identify associations between socio-cognitive determinants (i.e. concepts related to humans' intrinsic and extrinsic motivation to perform certain behavior) and the tick check, and between socio-cognitive determinants and the proximal goal to do the tick check. The full regression model explained 28% of the variance in doing the tick check. Results showed that performing the tick check was associated with proximal goal (β = .23, p < 0.01), self-efficacy (β = .22, p < 0.01), self-evaluative outcome expectations (β = .21, p < 0.01), descriptive norm (β = .16, p < 0.01), and experience (β = .13, p < 0.01). Our study is among the first to examine the determinants of systematic performance of the tick check, using an extended version of social cognitive theory to identify determinants. Based on the results, a number of practical recommendations can be made to promote the performance of the tick check.

  1. Prediction Interval Development for Wind-Tunnel Balance Check-Loading

    NASA Technical Reports Server (NTRS)

    Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.

    2014-01-01

    Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides an expected upper and lower bound on balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The prediction interval method of calculation and a case study demonstrating its use are provided. Validation of the methods is demonstrated for the case study based on the probability of capture of confirmation points.
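The prediction-interval construction for a simple linear calibration can be sketched as follows. This is an illustrative Python sketch of the standard regression prediction interval for a new observation, not the paper's multivariate balance model; the t quantile is passed in by hand as an assumption (e.g. 3.182 for 95% two-sided coverage with n = 5).

```python
def prediction_interval(xs, ys, x_new, t_crit):
    """Two-sided prediction interval for a new observation from a simple
    linear calibration fit; t_crit is the t quantile for n - 2 degrees
    of freedom (an input assumption, e.g. 3.182 for 95%, n = 5)."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    intercept = ybar - slope * xbar
    # Residual variance with n - 2 degrees of freedom.
    s2 = sum((y - (intercept + slope * x)) ** 2
             for x, y in zip(xs, ys)) / (n - 2)
    yhat = intercept + slope * x_new
    half = t_crit * (s2 * (1 + 1 / n + (x_new - xbar) ** 2 / sxx)) ** 0.5
    return yhat - half, yhat + half
```

With noise-free check loads the interval collapses onto the fitted line; with scatter it widens so that new confirmation points are captured at the stated confidence level.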

  2. 75 FR 43801 - Airworthiness Directives; Eurocopter France (ECF) Model EC225LP Helicopters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... time. Also, we use inspect rather than check when referring to an action required by a mechanic as... the various levels of government. Therefore, I certify this AD: 1. Is not a ``significant regulatory...

  3. Mandatory Identification Bar Checks: How Bouncers Are Doing Their Job

    ERIC Educational Resources Information Center

    Monk-Turner, Elizabeth; Allen, John; Casten, John; Cowling, Catherine; Gray, Charles; Guhr, David; Hoofnagle, Kara; Huffman, Jessica; Mina, Moises; Moore, Brian

    2011-01-01

    The behavior of bouncers at on-site establishments that served alcohol was observed. Our aim was to better understand how bouncers went about their job when the bar had a mandatory policy to check the identification of all customers. Utilizing an ethnographic decision model, we found that bouncers were significantly more likely to card customers that…

  4. Enhancing Classroom Management Using the Classroom Check-up Consultation Model with In-Vivo Coaching and Goal Setting Components

    ERIC Educational Resources Information Center

    Kleinert, Whitney L.; Silva, Meghan R.; Codding, Robin S.; Feinberg, Adam B.; St. James, Paula S.

    2017-01-01

    Classroom management is essential to promote learning in schools, and as such it is imperative that teachers receive adequate support to maximize their competence implementing effective classroom management strategies. One way to improve teachers' classroom managerial competence is through consultation. The Classroom Check-Up (CCU) is a structured…

  5. 76 FR 18964 - Airworthiness Directives; Costruzioni Aeronautiche Tecnam srl Model P2006T Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-06

    ... condition for the specified products. The MCAI states: During Landing Gear retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on the nose landing...

  6. 78 FR 69987 - Airworthiness Directives; Erickson Air-Crane Incorporated Helicopters (Type Certificate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... to require recurring checks of the Blade Inspection Method (BIM) indicator on each blade to determine whether the BIM indicator is signifying that the blade pressure may have been compromised by a blade crack... check procedures for BIM blades installed on the Model S-64E and S-64F helicopters. Several blade spars...

  7. Motivational Interviewing for Effective Classroom Management: The Classroom Check-Up. Practical Intervention in the Schools Series

    ERIC Educational Resources Information Center

    Reinke, Wendy M.; Herman, Keith C.; Sprick, Randy

    2011-01-01

    Highly accessible and user-friendly, this book focuses on helping K-12 teachers increase their use of classroom management strategies that work. It addresses motivational aspects of teacher consultation that are essential, yet often overlooked. The Classroom Check-Up is a step-by-step model for assessing teachers' organizational, instructional,…

  8. 75 FR 63045 - Airworthiness Directives; BAE SYSTEMS (OPERATIONS) LIMITED Model BAe 146 and Avro 146-RJ Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... the fitting and wing structure. Checking the nuts with a suitable torque spanner to the specifications in the torque figures shown in Table 2. of the Accomplishment Instructions of BAE SYSTEMS (OPERATIONS... installed, and Doing either an ultrasonic inspection for damaged bolts or torque check of the tension bolts...

  9. 76 FR 13069 - Airworthiness Directives; BAE Systems (Operations) Limited Model ATP Airplanes; BAE Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-10

    ... Recently, during a walk round check, an operator found an aileron trim tab hinge pin that had migrated sufficiently to cause a rubbing...

  10. Stochastic Local Search for Core Membership Checking in Hedonic Games

    NASA Astrophysics Data System (ADS)

    Keinänen, Helena

    Hedonic games have emerged as an important tool in economics and show promise as a useful formalism to model multi-agent coalition formation in AI as well as group formation in social networks. We consider the coNP-complete problem of core membership checking in hedonic coalition formation games. No previous algorithms to tackle the problem have been presented. In this work, we fill this gap by developing two stochastic local search algorithms for core membership checking in hedonic games. We demonstrate the usefulness of the algorithms by showing experimentally that they find solutions efficiently, particularly for large agent societies.
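A minimal sketch of the setting, assuming an additively separable hedonic game: a partition is in the core iff no coalition exists in which every member strictly improves. The value matrix `v`, the flip-based neighborhood, and the acceptance rule below are illustrative choices, not the authors' algorithms.

```python
import random

def utility(i, coalition, v):
    """Additively separable utility: agent i's summed value for co-members."""
    return sum(v[i][j] for j in coalition if j != i)

def blocks(coalition, partition, v):
    """A coalition blocks the partition iff every member strictly prefers it."""
    current = {i: S for S in partition for i in S}
    return all(utility(i, coalition, v) > utility(i, current[i], v)
               for i in coalition)

def find_blocking_coalition(partition, v, n, steps=2000, seed=0):
    """Stochastic local search for a blocking coalition: flip one agent's
    membership at a time, accepting moves that do not decrease the number
    of strictly improving members. Returns None if no witness is found,
    i.e. no evidence against core membership within the step budget."""
    rng = random.Random(seed)
    current = {i: S for S in partition for i in S}

    def improving(S):
        return sum(utility(j, S, v) > utility(j, current[j], v) for j in S)

    cand = {rng.randrange(n)}
    for _ in range(steps):
        if blocks(cand, partition, v):
            return cand                      # witness: partition not in core
        trial = cand ^ {rng.randrange(n)}    # flip a random agent in or out
        if trial and improving(trial) >= improving(cand):
            cand = trial
    return None
```

For three agents where 0 and 1 value only each other, the all-singletons partition is blocked by {0, 1}, and the search finds that witness quickly.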

  11. Robust check loss-based variable selection of high-dimensional single-index varying-coefficient model

    NASA Astrophysics Data System (ADS)

    Song, Yunquan; Lin, Lu; Jian, Ling

    2016-07-01

    The single-index varying-coefficient model is an important mathematical method for modeling nonlinear phenomena in science and engineering. In this paper, we develop a variable selection method for high-dimensional single-index varying-coefficient models using a shrinkage idea. The proposed procedure can simultaneously select significant nonparametric components and parametric components. Under defined regularity conditions, with appropriate selection of tuning parameters, the consistency of the variable selection procedure and the oracle property of the estimators are established. Moreover, due to the robustness of the check loss function to outliers in finite samples, our proposed variable selection method is more robust than those based on the least-squares criterion. Finally, the method is illustrated with numerical simulations.
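The robustness claim is easy to illustrate in the simplest case: the check (pinball) loss at τ = 0.5 yields a median-type fit that ignores a gross outlier, while the least-squares criterion is dragged toward it. This is an illustrative Python sketch of the loss itself, not the authors' estimator.

```python
def check_loss(u, tau=0.5):
    """Pinball (check) loss: rho_tau(u) = u * (tau - 1[u < 0])."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def fit_location(data, loss):
    """Fit a location parameter by minimizing total loss over a coarse grid."""
    grid = [i / 10 for i in range(0, 1001)]
    return min(grid, key=lambda t: sum(loss(x - t) for x in data))

data = [1.0, 2.0, 3.0, 4.0, 100.0]             # one gross outlier
median_like = fit_location(data, check_loss)    # robust fit
mean_like = fit_location(data, lambda u: u * u) # pulled toward the outlier
```

On this data the check-loss fit is the median 3.0, while the squared-loss fit is the mean 22.0.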

  12. Model Checking Satellite Operational Procedures

    NASA Astrophysics Data System (ADS)

    Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri

    2011-08-01

    We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a complex system such as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs observing the simulator telemetries and sending telecommands to the simulator. In order to assess the feasibility of our approach we present experimental results on a simple meaningful scenario. Our results show that we can save up to 90% of verification time.

  13. A general method for calculating three-dimensional compressible laminar and turbulent boundary layers on arbitrary wings

    NASA Technical Reports Server (NTRS)

    Cebeci, T.; Kaups, K.; Ramsey, J. A.

    1977-01-01

    The method described utilizes a nonorthogonal coordinate system for boundary-layer calculations. It includes a geometry program that represents the wing analytically, and a velocity program that computes the external velocity components from a given experimental pressure distribution when the external velocity distribution is not computed theoretically. The boundary-layer method is general, however, and can also be used for an external velocity distribution computed theoretically. Several test cases were computed by this method and the results were checked against other numerical calculations and against experiments when available. A typical computation time (CPU) on an IBM 370/165 computer for one surface of a wing, which roughly consists of 30 spanwise stations and 25 streamwise stations with 30 points across the boundary layer, is less than 30 seconds for an incompressible flow and slightly more for a compressible flow.

  14. Topological properties of the limited penetrable horizontal visibility graph family

    NASA Astrophysics Data System (ADS)

    Wang, Minggang; Vilela, André L. M.; Du, Ruijin; Zhao, Longfeng; Dong, Gaogao; Tian, Lixin; Stanley, H. Eugene

    2018-05-01

    The limited penetrable horizontal visibility graph algorithm was recently introduced to map time series in complex networks. In this work, we extend this algorithm to create a directed-limited penetrable horizontal visibility graph and an image-limited penetrable horizontal visibility graph. We define two algorithms and provide theoretical results on the topological properties of these graphs associated with different types of real-value series. We perform several numerical simulations to check the accuracy of our theoretical results. Finally, we present an application of the directed-limited penetrable horizontal visibility graph to measure real-value time series irreversibility and an application of the image-limited penetrable horizontal visibility graph that discriminates noise from chaos. We also propose a method to measure the systematic risk using the image-limited penetrable horizontal visibility graph, and the empirical results show the effectiveness of our proposed algorithms.
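The construction can be sketched directly from its definition: nodes i < j are linked when at most ρ intermediate points rise to the height of the lower endpoint, with ρ = 0 recovering the ordinary horizontal visibility graph. This is an O(n²) illustrative sketch; published implementations are typically faster.

```python
def lphvg_edges(series, rho=0):
    """Limited penetrable horizontal visibility graph: connect i < j when
    at most `rho` intermediate points are at least as high as the lower
    of the two endpoints (rho = 0 gives the ordinary HVG)."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            # Count intermediate points that block the horizontal line of sight.
            blockers = sum(1 for k in range(i + 1, j)
                           if series[k] >= min(series[i], series[j]))
            if blockers <= rho:
                edges.add((i, j))
    return edges
```

Increasing ρ only adds edges: for the series [3, 1, 2, 1, 3], the pair (1, 3) is blocked at ρ = 0 by the single higher point between them, but becomes visible at ρ = 1.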

  15. Theoretical and experimental studies of the deposition of Na2SO4 from seeded combustion gases

    NASA Technical Reports Server (NTRS)

    Kohl, F. J.; Santoro, G. J.; Stearns, C. A.; Fryburg, G. C.; Rosner, D. E.

    1977-01-01

    Flames in a Mach 0.3 atmospheric pressure laboratory burner rig were doped with sea salt, Na2SO4, and NaCl, respectively, in an effort to validate theoretical dew point predictions made by a local thermochemical equilibrium (LTCE) method of predicting condensation temperatures of sodium sulfate in flame environments. Deposits were collected on cylindrical platinum targets placed in the combustion products, and the deposition was studied as a function of collector temperature. Experimental deposition onset temperatures agreed within experimental error with LTCE-predicted temperatures. A multicomponent mass transfer equation was developed to predict the rate of deposition of Na2SO4(c) via vapor transport at temperatures below the deposition onset temperature. Agreement between maximum deposition rates predicted by this chemically frozen boundary layer (CFBL) theory and those obtained in the seeded laboratory burner experiments is good.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakaguchi, Yuji, E-mail: nkgc2003@yahoo.co.jp; Ono, Takeshi; Onitsuka, Ryota

    COMPASS system (IBA Dosimetry, Schwarzenbruck, Germany) and ArcCHECK with 3DVH software (Sun Nuclear Corp., Melbourne, FL) are commercial quasi-3-dimensional (3D) dosimetry arrays. Cross-validation to compare them under the same conditions, such as a treatment plan, allows for clear evaluation of such measurement devices. In this study, we evaluated the accuracy of reconstructed dose distributions from the COMPASS system and ArcCHECK with 3DVH software using Monte Carlo simulation (MC) for multi-leaf collimator (MLC) test patterns and clinical VMAT plans. In a phantom study, ArcCHECK 3DVH showed clear differences from COMPASS, measurement and MC due to the detector resolution and the dose reconstruction method. Notably, ArcCHECK 3DVH showed a 7% difference from MC for the heterogeneous phantom. ArcCHECK 3DVH only corrects the 3D dose distribution of the treatment planning system (TPS) using the ArcCHECK measurement, and therefore the accuracy of ArcCHECK 3DVH depends on the TPS. In contrast, COMPASS showed good agreement with MC for all cases. However, the COMPASS system requires many complicated installation procedures, such as beam modeling, and appropriate commissioning is needed. In terms of clinical cases, there were no large differences for each QA device. The accuracy of the COMPASS and ArcCHECK 3DVH systems for phantoms and clinical cases was compared. Both systems have advantages and disadvantages for clinical use, and consideration of the operating environment is important. The QA system selection depends on the purpose and workflow in each hospital.

  17. Fast passage dynamic nuclear polarization on rotating solids.

    PubMed

    Mentink-Vigier, Frederic; Akbey, Umit; Hovav, Yonatan; Vega, Shimon; Oschkinat, Hartmut; Feintuch, Akiva

    2012-11-01

    Magic Angle Spinning (MAS) Dynamic Nuclear Polarization (DNP) has proven to be a very powerful way to improve the signal-to-noise ratio of NMR experiments on solids. The experiments have in general been interpreted considering the Solid-Effect (SE) and Cross-Effect (CE) DNP mechanisms while ignoring the influence of sample spinning. In this paper, we show experimental data of MAS-DNP enhancements of (1)H and (13)C in proline and SH3 protein in glass-forming water/glycerol solvent containing TOTAPOL. We also introduce a theoretical model that aims at explaining how the nuclear polarization is built up in MAS-DNP experiments. By using Liouville-space based simulations to include relaxation on two simple spin models, {electron-nucleus} and {electron-electron-nucleus}, we explain how the basic MAS-SE-DNP and MAS-CE-DNP processes work. The importance of fast energy passages and short level anti-crossings is emphasized and the differences between static DNP and MAS-DNP are explained. During a single rotor cycle the enhancement in the {electron-electron-nucleus} system arises from MAS-CE-DNP involving at least three kinds of two-level fast passages: an electron-electron dipolar anti-crossing, a single-quantum electron MW encounter and an anti-crossing at the CE condition inducing nuclear polarization increments or decrements. Numerical, powder-averaged simulations were performed in order to check the influence of the experimental parameters on the enhancement efficiencies. In particular we show that the spinning frequency dependence of the theoretical MAS-CE-DNP enhancement compares favorably with the experimental (1)H and (13)C MAS-DNP enhancements of proline and SH3. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Personal Food Systems of Male Collegiate Football Players: A Grounded Theory Investigation

    PubMed Central

    Long, Doug; Perry, Christina; Unruh, Scott A.; Lewis, Nancy; Stanek-Krogstrand, Kaye

    2011-01-01

    Context: Factors that affect food choices include the physical and social environments, quality, quantity, perceived healthfulness, and convenience. The personal food choice process was defined as the procedures used by athletes for making food choices, including the weighing and balancing of activities of daily life, physical well-being, convenience, monetary resources, and social relationships. Objective: To develop a theoretical model explaining the personal food choice processes of collegiate football players. Design: Qualitative study. Setting: National Collegiate Athletic Association Division II football program. Patients or Other Participants: Fifteen football players were purposefully sampled to represent various positions, years of athletic eligibility, and ethnic backgrounds. Data Collection and Analysis: For text data collection, we used predetermined, open-ended questions. Data were analyzed using the constant comparison method. The athletes' words were used to label and describe their interactions and experiences with the food choice process. Member checks and an external audit were conducted by a qualitative methodologist and a nutrition specialist, and the findings were triangulated with the current literature to ensure trustworthiness of the text data. Results: Time was the core category and yielded a cyclic graphic of a theoretical model for the food choice system. Planning hydration, macronutrient strategies, snacks, and healthful food choices emerged as themes. Conclusions: The athletes planned meals and snacks around their academic and athletic schedules while attempting to consume foods identified as healthful. Healthful foods were generally lower in fat but high in preferred macronutrients. High-protein foods were the players' primary goal; carbohydrate consumption was secondary. The athletes had established plans to maintain hydration. 
Professionals may use these findings to implement educational programs on food choices for football players. PMID:22488196

  19. Model Checking Abstract PLEXIL Programs with SMART

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.

    2007-01-01

    We describe a method to automatically generate discrete-state models of abstract Plan Execution Interchange Language (PLEXIL) programs that can be analyzed using model checking tools. Starting from a high-level description of a PLEXIL program or a family of programs with common characteristics, the generator lays the framework that models the principles of program execution. The concrete parts of the program are not automatically generated, but require the modeler to introduce them by hand. As a case study, we generate models to verify properties of the PLEXIL macro constructs that are introduced as shorthand notation. After an exhaustive analysis, we conclude that the macro definitions obey the intended semantics and behave as expected, but contingent on a few specific requirements on the timing semantics of micro-steps in the concrete executive implementation.

  20. A conceptual-practice model for occupational therapy to facilitate return to work in breast cancer patients.

    PubMed

    Désiron, Huguette A M; Donceel, Peter; de Rijk, Angelique; Van Hoof, Elke

    2013-12-01

    Improved therapies and early detection have significantly increased the number of breast cancer survivors, leading to increasing needs regarding return to work (RTW). Occupational therapy (OT) interventions provide successful RTW assistance for other conditions, but are not validated in breast cancer. This paper aims to identify a theoretical framework for OT intervention by asking how OT models can be used in OT interventions in RTW of breast cancer patients, which criteria should be used to select these models, and which adaptations would be necessary to match the OT model(s) to breast cancer patients' needs. Using research-specific criteria derived from OT literature (conceptual OT model, multidisciplinary, referring to the International Classification of Functioning (ICF), RTW in breast cancer), a search in 9 electronic databases was conducted to select articles that describe conceptual OT models. A content analysis of those models complying with at least two of the selection criteria was realised. Checking for breast cancer specific issues, results were matched with the literature on care models regarding RTW in breast cancer. From the nine models initially identified, three [Canadian Model of Occupational Performance, Model of Human Occupation (MOHO), Person-Environment-Occupation-Performance model] were selected based on the selection criteria. The MOHO had the highest compliance rate with the criteria. To enhance usability in breast cancer, some adaptations are needed. No OT model to facilitate RTW in breast cancer could be identified, indicating a need to fill this gap. Individual and societal needs of breast cancer patients can be answered by using a MOHO-based OT model, extended with indications for better treatment, work outcomes and longitudinal process factors.

  1. An approach to checking case-crossover analyses based on equivalence with time-series methods.

    PubMed

    Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L

    2008-03-01

    The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.

  2. Bounded Parametric Model Checking for Elementary Net Systems

    NASA Astrophysics Data System (ADS)

    Knapik, Michał; Szreter, Maciej; Penczek, Wojciech

    Bounded Model Checking (BMC) is an efficient verification method for reactive systems. BMC has been applied so far to verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time, that BMC can be extended to PRTECTL, a parametric extension of the existential version of CTL. To this end we define a bounded semantics and a translation from PRTECTL to SAT. The implementation of the algorithm for Elementary Net Systems is presented, together with some experimental results.
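The essence of BMC, checking only executions of bounded length k, can be sketched with explicit enumeration standing in for the SAT translation described in the abstract. Illustrative Python; the toy counter system is an assumption for the example.

```python
def bmc_reach(init, step, bad, k):
    """Bounded reachability: does some execution of length <= k hit a bad
    state? A SAT encoding would assert the transition relation k times;
    here we unroll it explicitly, breadth-first."""
    frontier, seen = set(init), set(init)
    for depth in range(k + 1):
        if any(bad(s) for s in frontier):
            return depth                 # a counterexample of this length exists
        frontier = {t for s in frontier for t in step(s)} - seen
        seen |= frontier
    return None                          # no counterexample within bound k

# Toy system: a counter modulo 8; the "bad" states are those equal to 5.
step = lambda s: {(s + 1) % 8}
```

With bound k = 10 the bad state is reached at depth 5; with k = 3 the bounded check reports no counterexample, illustrating why the bound must be chosen (or iterated) carefully.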

  3. The instant sequencing task: Toward constraint-checking a complex spacecraft command sequence interactively

    NASA Technical Reports Server (NTRS)

    Horvath, Joan C.; Alkalaj, Leon J.; Schneider, Karl M.; Amador, Arthur V.; Spitale, Joseph N.

    1993-01-01

    Robotic spacecraft are controlled by sets of commands called 'sequences.' These sequences must be checked against mission constraints. Making our existing constraint checking program faster would enable new capabilities in our uplink process. Therefore, we are rewriting this program to run on a parallel computer. To do so, we had to determine how to run constraint-checking algorithms in parallel and create a new method of specifying spacecraft models and constraints. This new specification gives us a means of representing flight systems and their predicted response to commands which could be used in a variety of applications throughout the command process, particularly during anomaly or high-activity operations. This commonality could reduce operations cost and risk for future complex missions. Lessons learned in applying some parts of this system to the TOPEX/Poseidon mission will be described.
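A toy version of sequence constraint checking might look like the sketch below. The command names, timing rule, and state constraint are invented for illustration; the actual flight-system models and constraints are far richer.

```python
def check_sequence(commands, min_gap=2.0):
    """Check a timed command sequence against two toy constraints:
    (1) TAKE_IMAGE must be preceded by CAMERA_ON (a state constraint);
    (2) consecutive commands must be at least `min_gap` seconds apart.
    Each command is a (time, name) pair; violations are returned as
    (time, description) pairs."""
    violations = []
    camera_on = False
    prev_t = None
    for t, cmd in commands:
        if prev_t is not None and t - prev_t < min_gap:
            violations.append((t, "commands too close together"))
        if cmd == "CAMERA_ON":
            camera_on = True
        elif cmd == "CAMERA_OFF":
            camera_on = False
        elif cmd == "TAKE_IMAGE" and not camera_on:
            violations.append((t, "TAKE_IMAGE before CAMERA_ON"))
        prev_t = t
    return violations
```

Because each constraint only reads the evolving model state, independent constraints can be evaluated over the sequence in parallel, which is the opportunity the abstract describes.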

  4. Evaluating and Improving a Learning Trajectory for Linear Measurement in Elementary Grades 2 and 3: A Longitudinal Study

    ERIC Educational Resources Information Center

    Barrett, Jeffrey E.; Sarama, Julie; Clements, Douglas H.; Cullen, Craig; McCool, Jenni; Witkowski-Rumsey, Chepina; Klanderman, David

    2012-01-01

    We examined children's development of strategic and conceptual knowledge for linear measurement. We conducted teaching experiments with eight students in grades 2 and 3, based on our hypothetical learning trajectory for length to check its coherence and to strengthen the domain-specific model for learning and teaching. We checked the hierarchical…

  5. 75 FR 39185 - Airworthiness Directives; The Boeing Company Model 747-100, 747-100B, 747-100B SUD, 747-200B, 747...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-08

    ... requires repetitive inspections and torque checks of the hanger fittings and strut forward bulkhead of the forward engine mount and... corrective actions are replacing the fasteners; removing loose fasteners; tightening all Group A...

  6. Visual Predictive Check in Models with Time-Varying Input Function.

    PubMed

    Largajolli, Anna; Bertoldo, Alessandra; Campioni, Marco; Cobelli, Claudio

    2015-11-01

    Nonlinear mixed effects models are commonly used modeling techniques in pharmaceutical research, as they enable the characterization of individual profiles together with the population to which the individuals belong. To ensure their correct use, it is fundamental to provide powerful diagnostic tools that are able to evaluate the predictive performance of the models. The visual predictive check (VPC) is a commonly used tool that helps the user to check by visual inspection whether the model is able to reproduce the variability and the main trend of the observed data. However, simulation from the model is not always trivial, for example, when using models with a time-varying input function (IF). In this class of models, there is a potential mismatch between each set of simulated parameters and the associated individual IF, which can cause an incorrect profile simulation. We introduce a refinement of the VPC by taking into consideration a correlation term (the Mahalanobis or normalized Euclidean distance) that helps associate the correct IF with the individual set of simulated parameters. We investigate and compare its performance with the standard VPC in models of the glucose and insulin system applied to real and simulated data and in a simulated pharmacokinetic/pharmacodynamic (PK/PD) example. The newly proposed VPC performs better than the standard VPC, especially for models with large variability in the IF, where the probability of simulating incorrect profiles is higher.
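The matching idea can be sketched as a nearest-neighbor lookup under a normalized distance. Illustrative Python: the parameter vectors and scale terms are placeholders for the estimated individual parameters and their population spreads, not the authors' actual model quantities.

```python
def normalized_distance(a, b, scale):
    """Normalized Euclidean distance between parameter vectors, each
    component scaled by an (assumed) population spread term."""
    return sum(((x - y) / s) ** 2 for x, y, s in zip(a, b, scale)) ** 0.5

def match_input_function(sim_params, individual_params, scale):
    """Pick the individual whose estimated parameters lie closest to the
    simulated set; that individual's input function is then used for the
    simulated profile instead of a randomly paired one."""
    return min(range(len(individual_params)),
               key=lambda i: normalized_distance(sim_params,
                                                 individual_params[i], scale))
```

A simulated parameter set near one individual's estimates is paired with that individual's IF, which is exactly the mismatch the refined VPC is designed to avoid.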

  7. M-estimator for the 3D symmetric Helmert coordinate transformation

    NASA Astrophysics Data System (ADS)

    Chang, Guobin; Xu, Tianhe; Wang, Qianxin

    2018-01-01

    The M-estimator for the 3D symmetric Helmert coordinate transformation problem is developed. The small-angle rotation assumption is abandoned. The direction cosine matrix or the quaternion is used to represent the rotation. The 3 × 1 multiplicative error vector is defined to represent the rotation estimation error. An analytical solution can be employed to provide the initial approximation for the iteration, if the outliers are not large. The iteration is carried out using the iterative reweighted least-squares scheme. In each iteration after the first one, the measurement equation is linearized using the available parameter estimates, the reweighting matrix is constructed using the residuals obtained in the previous iteration, and then the parameter estimates with their variance-covariance matrix are calculated. The influence functions of a single pseudo-measurement on the least-squares estimator and on the M-estimator are derived to theoretically show the robustness. In the solution process, the parameter is rescaled in order to improve the numerical stability. Monte Carlo experiments are conducted to check the developed method. Different cases are considered to investigate whether the assumed stochastic model is correct. The results with the simulated data slightly deviating from the true model are used to show the developed method's statistical efficacy at the assumed stochastic model, its robustness against deviations from the assumed stochastic model, and the validity of the estimated variance-covariance matrix no matter whether the assumed stochastic model is correct or not.
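The iteratively reweighted least-squares scheme is easiest to see in one dimension: a Huber-weighted location estimate that starts from the least-squares answer and re-solves a weighted problem with weights from the previous residuals. This is an illustrative stand-in for the full 7-parameter Helmert problem, not the paper's estimator.

```python
def huber_weight(r, c=1.345):
    """Huber weight: full weight inside the threshold, downweighted outside."""
    return 1.0 if abs(r) <= c else c / abs(r)

def irls_location(data, iters=50, c=1.345):
    """Iteratively reweighted least squares for a robust location estimate:
    start from the least-squares (mean) answer, then repeatedly solve a
    weighted LS problem with weights taken from the previous residuals."""
    mu = sum(data) / len(data)           # LS estimate as the initial value
    for _ in range(iters):
        w = [huber_weight(x - mu, c) for x in data]
        mu = sum(wi * xi for wi, xi in zip(w, data)) / sum(w)
    return mu
```

With one gross outlier the plain mean is pulled far off, while the IRLS estimate settles near the bulk of the data, mirroring the robustness the influence-function analysis establishes.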

  8. Model Checker for Java Programs

    NASA Technical Reports Server (NTRS)

    Visser, Willem

    2007-01-01

    Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface that allows the complete behavior of a Java program to be analyzed, including the interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker: it enumerates all visited states and therefore suffers from the state-explosion problem inherent in analyzing large programs. It is best suited to analyzing programs of less than 10 kLOC, but has been successfully applied to finding errors in concurrent programs of up to 100 kLOC. When an error is found, a trace from the initial state to the error is produced to guide debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (it supports Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state explosion, as well as dynamic partial-order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible: it allows the creation of listeners that can subscribe to events during searches, and it supports the creation of dedicated code to be executed in place of regular classes, which lets users easily handle native calls and improve the efficiency of the analysis.
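    JPF itself is a Java tool; as a language-agnostic illustration of what explicit-state model checking with counterexample traces means, here is a minimal sketch (not JPF code, and not its API) that enumerates all interleavings of two threads performing a non-atomic increment and returns a trace to an assertion violation, the classic lost-update bug.

```python
from collections import deque

# State: (pc1, reg1, pc2, reg2, shared_x). Each thread runs two steps:
#   pc 0: reg = x      pc 1: x = reg + 1      pc 2: done
def successors(state):
    pc1, r1, pc2, r2, x = state
    succ = []
    if pc1 == 0: succ.append(("t1:load",  (1, x, pc2, r2, x)))
    if pc1 == 1: succ.append(("t1:store", (2, r1, pc2, r2, r1 + 1)))
    if pc2 == 0: succ.append(("t2:load",  (pc1, r1, 1, x, x)))
    if pc2 == 1: succ.append(("t2:store", (pc1, r1, 2, r2, r2 + 1)))
    return succ

def check(init, prop):
    """BFS over all reachable states; return a counterexample trace
    (list of actions) to a terminal state violating prop, else None."""
    seen = {init}
    queue = deque([(init, [])])
    while queue:
        state, trace = queue.popleft()
        succ = successors(state)
        if not succ and not prop(state):  # terminal state violates the property
            return trace
        for action, nxt in succ:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [action]))
    return None

init = (0, 0, 0, 0, 0)
trace = check(init, lambda s: s[4] == 2)  # assertion: final x == 2
```

    The state enumeration with deduplication (`seen`) and the reconstructed action trace are the two ingredients the abstract highlights: exhaustive coverage of interleavings, and an error trace to guide debugging.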

  9. Combining Static Analysis and Model Checking for Software Analysis

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial-order information, which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial-order reduction. At each step of this iterative process, the static analysis computes optimistic information, which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which point the partial-order information is safe and the whole state space is explored.
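    The alternation between the two analyses can be viewed as computing the fixed point of a monotone function on a finite lattice, which guarantees termination. The sketch below is illustrative only: the `discover_aliases` function and its rule table are made up, standing in for the aliasing facts the model checker feeds back to the static analyzer on each round.

```python
def fixpoint(f, x0):
    """Iterate x := f(x) until it stabilizes. Termination is guaranteed
    when f is monotone on a finite lattice, as with alias facts over a
    finite program."""
    x = x0
    while True:
        nx = f(x)
        if nx == x:
            return x
        x = nx

# Hypothetical feedback rule: exploring the state space with the current
# alias facts reveals further aliases (monotone: output includes input).
RULES = {("p", "q"): ("q", "r"), ("q", "r"): ("p", "r")}

def discover_aliases(aliases):
    found = set(aliases)
    for pair in aliases:
        if pair in RULES:
            found.add(RULES[pair])
    return frozenset(found)

safe_aliases = fixpoint(discover_aliases, frozenset({("p", "q")}))
```

    At the fixed point, re-running the analysis adds nothing new, which is exactly the condition under which the abstract's partial-order information becomes safe.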

  10. Neighborhood social capital is associated with participation in health checks of a general population: a multilevel analysis of a population-based lifestyle intervention- the Inter99 study.

    PubMed

    Bender, Anne Mette; Kawachi, Ichiro; Jørgensen, Torben; Pisinger, Charlotta

    2015-07-22

    Participation in population-based preventive health checks has declined over the past decades. More research is needed to determine factors enhancing participation. The objective of this study was to examine the association of two measures of neighborhood-level social capital with participation in the health check phase of a population-based lifestyle intervention. The study population comprised 12,568 residents of 73 Danish neighborhoods in the intervention group of a large population-based lifestyle intervention study, the Inter99. Two measures of social capital were applied: informal socializing and voting turnout. In a multilevel analysis adjusting only for age and sex, a higher level of neighborhood social capital was associated with a higher probability of participating in the health check. Inclusion of both individual socioeconomic position and neighborhood deprivation in the model attenuated the coefficients for informal socializing, while voting turnout became non-significant. A higher level of neighborhood social capital was associated with a higher probability of participating in the health check phase of a population-based lifestyle intervention. Most of the association between neighborhood social capital and participation in preventive health checks can be explained by differences in individual socioeconomic position and level of neighborhood deprivation. Nonetheless, there seems to be some residual association between social capital and health check participation, suggesting that activating social relations in the community may be an avenue for boosting participation rates in population-based health checks. ClinicalTrials.gov (registration no. NCT00289237).

  11. Microscopic analysis and simulation of check-mark stain on the galvanized steel strip

    NASA Astrophysics Data System (ADS)

    So, Hongyun; Yoon, Hyun Gi; Chung, Myung Kyoon

    2010-11-01

    When galvanized steel strip is produced through a continuous hot-dip galvanizing process, the thickness of the adhered zinc film is controlled by a plane impinging air jet referred to as an "air-knife system". In such a gas-jet wiping process, a stain in the shape of a check mark or sag line frequently appears. The check-mark defect consists of non-uniform zinc coating with oblique patterns such as "W", "V" or "X" on the coated surface. The present paper presents a cause and analysis of check-mark formation and a numerical simulation of sag lines using the numerical data produced by large eddy simulation (LES) of the three-dimensional compressible turbulent flow field around the air-knife system. It was found that there are alternating plane-wise vortices near the impinging stagnation region, and that these alternating vortices move almost periodically to the right and to the left along the stagnation line due to the jet flow instability. Meanwhile, in order to simulate the check-mark formation, a novel perturbation model has been developed to predict the variation of coating thickness along the transverse direction. Finally, the three-dimensional zinc coating surface was obtained with the present perturbation model. It was found that the sag-line formation is determined by the combination of the instantaneous coating-thickness distribution along the transverse direction near the stagnation line and the feed speed of the steel strip.

  12. Predictors of Health Service Utilization Among Older Men in Jamaica.

    PubMed

    Willie-Tyndale, Douladel; McKoy Davis, Julian; Holder-Nevins, Desmalee; Mitchell-Fearon, Kathryn; James, Kenneth; Waldron, Norman K; Eldemire-Shearer, Denise

    2018-01-03

    To determine the relative influence of sociodemographic, socioeconomic, psychosocial, and health variables on health service utilization in the last 12 months. Data were analyzed for 1,412 men ≥60 years old from a 2012 nationally representative community-based survey in Jamaica. Associations between six health service utilization variables and several explanatory variables were explored. Logistic regression models were used to identify independent predictors of each utilization measure and determine the strengths of associations. More than 75% reported having health visits and blood pressure checks. Blood sugar (69.6%) and cholesterol (63.1%) checks were less common, and having a prostate check (35.1%) was the least utilized service. Adjusted models confirmed that the presence of chronic diseases and health insurance most strongly predicted utilization. A daughter or son as the main source of financial support (vs self) doubled or tripled, respectively, the odds of routine doctors' visits. Compared with primary or lower education, tertiary education doubled [2.37 (1.12, 4.95)] the odds of a blood pressure check. Regular attendance at club/society/religious organizations' meetings increased the odds of having a prostate check by 45%. Although need and financial resources most strongly influenced health service utilization, psychosocial variables may be particularly influential for underutilized services. © The Author(s) 2018. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. Neutrino-induced reactions on nuclei

    NASA Astrophysics Data System (ADS)

    Gallmeister, K.; Mosel, U.; Weil, J.

    2016-09-01

    Background: Long-baseline experiments such as the planned deep underground neutrino experiment (DUNE) require theoretical descriptions of the complete event in a neutrino-nucleus reaction. Since nuclear targets are used, this requires a good understanding of neutrino-nucleus interactions. Purpose: Develop a consistent theory and code framework for the description of lepton-nucleus interactions that can be used to describe not only inclusive cross sections, but also the complete final state of the reaction. Methods: The Giessen-Boltzmann-Uehling-Uhlenbeck (GiBUU) implementation of quantum-kinetic transport theory is used, with improvements in its treatment of the nuclear ground state and of 2p2h interactions. For the latter, an empirical structure function from electron scattering data is used as a basis. Results: Results for electron-induced inclusive cross sections are given as a necessary check of the overall quality of this approach. The calculated neutrino-induced inclusive double-differential cross sections show good agreement with data from neutrino and antineutrino reactions for different neutrino flavors at MiniBooNE and T2K. Inclusive double-differential cross sections for MicroBooNE, NOvA, MINERvA, and LBNF/DUNE are given. Conclusions: Based on the GiBUU model of lepton-nucleus interactions, a good theoretical description of inclusive electron-, neutrino-, and antineutrino-nucleus data over a wide range of energies, different neutrino flavors, and different experiments is now possible. Since no tuning is involved, this theory and code should be reliable also in new energy regimes and for new target masses.

  14. Common Characteristics of Improvisational Approaches in Music Therapy for Children with Autism Spectrum Disorder: Developing Treatment Guidelines.

    PubMed

    Geretsegger, Monika; Holck, Ulla; Carpente, John A; Elefant, Cochavit; Kim, Jinah; Gold, Christian

    2015-01-01

    Improvisational methods of music therapy have been increasingly applied in the treatment of individuals with autism spectrum disorder (ASD) over the past decades in many countries worldwide. This study aimed at developing treatment guidelines based on the most important common characteristics of improvisational music therapy (IMT) with children affected by ASD as applied across various countries and theoretical backgrounds. After initial development of treatment principle items, a survey among music therapy professionals in 10 countries and focus group workshops with experienced clinicians in three countries were conducted to evaluate the items and formulate revised treatment guidelines. To check usability, a treatment fidelity assessment tool was subsequently used to rate therapy excerpts. Survey findings and feedback from the focus groups corroborated most of the initial principles for IMT in the context of children with ASD. Unique and essential principles include facilitating musical and emotional attunement, musically scaffolding the flow of interaction, and tapping into the shared history of musical interaction between child and therapist. Raters successfully used the tool to evaluate treatment adherence and competence. Summarizing an international consensus about core principles of improvisational approaches in music therapy for children with ASD, these treatment guidelines may be applied in diverse theoretical models of music therapy. They can be used to assess treatment fidelity, and may be applied to facilitate future research, clinical practice, and training. © the American Music Therapy Association 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimura, Aki

    2017-07-01

    For IC design starts below the 20 nm technology node, the assist features on photomasks shrink well below 60 nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based mask process correction (MPC), model-based MPC, and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for the assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based MDP verification acceptable.

  16. Vehicular traffic noise prediction using soft computing approach.

    PubMed

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach for the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely, Generalized Linear Model, Decision Trees, Random Forests and Neural Networks, have been used to develop models to predict the hourly equivalent continuous sound pressure level, Leq, at different locations in Patiala city, India. The input variables include the traffic volume per hour, the percentage of heavy vehicles and the average speed of vehicles. The performance of the four models is compared on the basis of the performance criteria of coefficient of determination, mean square error and accuracy. A 10-fold cross-validation is done to check the stability of the Random Forest model, which gave the best results. A t-test is performed to check the fit of the model with the field data. Copyright © 2016 Elsevier Ltd. All rights reserved.
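    The 10-fold cross-validation and mean-square-error criterion mentioned above can be sketched as follows. This is a generic illustration, not the paper's setup: for brevity it uses a simple least-squares line instead of a random forest, and synthetic noise-free data, but the fold construction and per-fold MSE computation are the same pattern.

```python
import random

def k_fold_splits(n, k=10, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds if f is not test for j in f]
        yield train, test

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (stand-in for any model)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def cv_mse(xs, ys, k=10):
    """Average held-out mean square error over the k folds."""
    errs = []
    for train, test in k_fold_splits(len(xs), k):
        a, b = fit_line([xs[i] for i in train], [ys[i] for i in train])
        errs.append(sum((ys[i] - (a + b * xs[i])) ** 2 for i in test) / len(test))
    return sum(errs) / k
```

    A stable model shows similar errors across folds; comparing the per-fold errors of two models is also the usual input to the t-test the abstract mentions.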

  17. A generalized statistical model for the size distribution of wealth

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated distribution function for modeling individual incomes, having its roots in the framework of κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and find an excellent agreement with the data, superior to that of any other model known in the literature.
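    The building block of κ-generalized statistics is the Kaniadakis κ-exponential, which deforms the ordinary exponential into a function with power-law tails. The sketch below shows only this deformed exponential and two of its standard properties (the κ → 0 limit and the identity exp_κ(x)·exp_κ(−x) = 1); the full wealth-distribution model of the paper is not reproduced here.

```python
import math

def kappa_exp(x, kappa):
    """Kaniadakis kappa-exponential:
        exp_k(x) = (sqrt(1 + k^2 x^2) + k x)^(1/k),
    which reduces to the ordinary exponential as kappa -> 0 and decays
    as a power law (rather than exponentially) for large |x|."""
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1 + (kappa * x) ** 2) + kappa * x) ** (1 / kappa)
```

    The heavier-than-exponential tail produced by this deformation is what lets κ-generalized distributions fit the upper tails of income and wealth data.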

  18. Location contexts of user check-ins to model urban geo life-style patterns.

    PubMed

    Hasan, Samiul; Ukkusuri, Satish V

    2015-01-01

    Geo-location data from social media offers new ways to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from the activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) characterizing the patterns of user interest in different types of places, and ii) characterizing the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of users from New York City. The co-existence of several location contexts, and their corresponding probabilities, in a given pattern provides useful information about user interests and choices. It is found that geo life-style patterns contain similar items, either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location-choice behavior.
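    A probabilistic topic model of check-ins can be illustrated with a toy collapsed Gibbs sampler for latent Dirichlet allocation, treating each user's check-in categories as a document. This is a generic LDA sketch, not the authors' model (which infers patterns from two perspectives and on real Foursquare data); the toy documents and hyperparameters are assumptions.

```python
import random

def lda_gibbs(docs, n_topics, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Toy collapsed Gibbs sampler for LDA. docs: lists of word tokens
    (here, check-in location categories). Returns per-document topic
    distributions and the vocabulary."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)
    wid = {w: i for i, w in enumerate(vocab)}
    ndk = [[0] * n_topics for _ in docs]        # doc-topic counts
    nkw = [[0] * V for _ in range(n_topics)]    # topic-word counts
    nk = [0] * n_topics                         # topic totals
    z = []
    for d, doc in enumerate(docs):              # random initial assignment
        zs = []
        for w in doc:
            t = rng.randrange(n_topics)
            zs.append(t)
            ndk[d][t] += 1; nkw[t][wid[w]] += 1; nk[t] += 1
        z.append(zs)
    for _ in range(n_iter):                     # resample each token's topic
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                ndk[d][t] -= 1; nkw[t][wid[w]] -= 1; nk[t] -= 1
                weights = [(ndk[d][k] + alpha) * (nkw[k][wid[w]] + beta)
                           / (nk[k] + V * beta) for k in range(n_topics)]
                t = rng.choices(range(n_topics), weights)[0]
                z[d][i] = t
                ndk[d][t] += 1; nkw[t][wid[w]] += 1; nk[t] += 1
    theta = [[(ndk[d][k] + alpha) / (len(docs[d]) + n_topics * alpha)
              for k in range(n_topics)] for d in range(len(docs))]
    return theta, vocab

# Hypothetical check-in "documents": each user's visited categories.
docs = [["cafe", "bar", "cafe", "bar"], ["park", "gym", "park", "gym"],
        ["cafe", "bar", "cafe"], ["gym", "park", "gym"]]
theta, vocab = lda_gibbs(docs, n_topics=2, n_iter=50)
```

    Each row of `theta` is one user's mixture over latent "geo life-style patterns"; co-occurring categories (here cafe/bar vs park/gym) tend to concentrate in the same topic.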

  19. Constraint-Based Abstract Semantics for Temporal Logic: A Direct Approach to Design and Implementation

    NASA Astrophysics Data System (ADS)

    Banda, Gourinath; Gallagher, John P.

    Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal μ-calculus, which is the basis for abstract model checking. The abstract semantic function is constructed directly from the standard concrete semantics together with a Galois connection between the concrete state-space and an abstract domain. There is no need for mixed or modal transition systems to abstract arbitrary temporal properties, as in previous work in the area of abstract model checking. Using the modal μ-calculus to implement CTL, the abstract semantics gives an over-approximation of the set of states in which an arbitrary CTL formula holds. We then show that this leads directly to an effective implementation of an abstract model checking algorithm for CTL using abstract domains based on linear constraints. The implementation of the abstract semantic function makes use of an SMT solver. We describe an implemented system for proving properties of linear hybrid automata and give some experimental results.
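    The over-approximation idea can be shown concretely on the simplest abstract domain, intervals over the integers. The sketch below is illustrative only (the paper works with linear-constraint domains, the μ-calculus, and an SMT solver): it computes an interval fixed point over-approximating the reachable values of a tiny loop, which soundly verifies any safety property that holds of the whole interval.

```python
def abstract_reach(init, guard_hi, step):
    """Over-approximate the reachable values of the integer loop
        x := init; while x < guard_hi: x += step
    in the interval abstract domain. Join = interval hull; iterate the
    abstract transformer to a fixed point (integers assumed, so the
    guarded part of [lo, hi] is [lo, min(hi, guard_hi - 1)])."""
    lo, hi = init, init
    while True:
        if lo < guard_hi:  # transition enabled somewhere in the interval
            nlo, nhi = lo + step, min(hi, guard_hi - 1) + step
            jlo, jhi = min(lo, nlo), max(hi, nhi)  # join with current interval
        else:
            jlo, jhi = lo, hi
        if (jlo, jhi) == (lo, hi):
            return lo, hi
        lo, hi = jlo, jhi
```

    The resulting interval contains every concretely reachable value but may be strictly larger: soundness with possible imprecision, which is exactly the over-approximation of state sets the abstract semantics provides for CTL formulas.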

  20. Lattice density functional theory for confined Ising fluids: comparison between different functional approximations in slit pore

    NASA Astrophysics Data System (ADS)

    Chen, Xueqian; Feng, Wei; Liu, Honglai; Hu, Ying

    2016-09-01

    In this paper, Lafuente and Cuesta's cluster density functional theory (CDFT) and the lattice mean-field approximation (LMFA) are formulated and compared within the framework of lattice density functional theory (LDFT). As a comparison, an LDFT based on our previous work on a nonrandom correction to the LMFA is also developed, where a local density approximation is adopted for the correction. Numerical results from Monte Carlo simulation for the density distributions of an Ising fluid confined in a slit pore are used to check these functional approximations. Owing to the rational treatment of the coupling between the site-excluding entropic effect and the contact-attracting enthalpic effect by CDFT with the Bethe-Peierls approximation (referred to as BPA-CDFT), the improvement of BPA-CDFT over the LMFA is confirmed as expected. Interestingly, our LDFT has accuracy comparable to that of BPA-CDFT. Apparent differences between the profiles of solvation force, excess adsorption quantity and interfacial tension obtained from the LMFA and the non-LMFAs are found in our calculations. We also discuss some possible theoretical extensions of BPA-CDFT.

  1. 76 FR 477 - Airworthiness Directives; Bombardier, Inc. Model CL-600-2A12 (CL-601) and CL-600-2B16 (CL-601-3A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-05

    ... to these aircraft if Bombardier Service Bulletin (SB) 601-0590 [Scheduled Maintenance Instructions... information: Challenger 601 Time Limits/Maintenance Checks, PSP 601-5, Revision 38, dated June 19, 2009. Challenger 601 Time Limits/Maintenance Checks, PSP 601A-5, Revision 34, dated June 19, 2009. Challenger 604...

  2. New Quantum Key Distribution Scheme Based on Random Hybrid Quantum Channel with EPR Pairs and GHZ States

    NASA Astrophysics Data System (ADS)

    Yan, Xing-Yu; Gong, Li-Hua; Chen, Hua-Ying; Zhou, Nan-Run

    2018-05-01

    A theoretical quantum key distribution scheme based on a random hybrid quantum channel with EPR pairs and GHZ states is devised. In this scheme, EPR pairs and tripartite GHZ states are exploited to set up the random hybrid quantum channel. Only one photon in each entangled state needs to travel back and forth in the channel. The security of the quantum key distribution scheme is guaranteed by more than one round of eavesdropping-check procedures. The scheme is of high capacity, since one particle can carry more than two bits of information via quantum dense coding.

  3. A new LDPC decoding scheme for PDM-8QAM BICM coherent optical communication system

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Zhang, Wen-bo; Xi, Li-xia; Tang, Xian-feng; Zhang, Xiao-guang

    2015-11-01

    A new log-likelihood ratio (LLR) message estimation method is proposed for a polarization-division multiplexed eight-state quadrature amplitude modulation (PDM-8QAM) bit-interleaved coded modulation (BICM) optical communication system. The formulation of the posterior probability is theoretically analyzed, and a way to reduce the pre-decoding bit error rate (BER) of the low-density parity-check (LDPC) decoder for PDM-8QAM constellations is presented. Simulation results show that it outperforms the traditional scheme: the new post-decoding BER is reduced to 50% of that of the traditional post-decoding algorithm.
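    The LLR that feeds an LDPC decoder is, for each bit position, the log-ratio of the posterior probabilities of that bit being 0 or 1 given the received sample. The sketch below computes the exact LLR for a real-valued constellation over an AWGN channel with uniform priors; it is a textbook illustration, not the paper's PDM-8QAM method (which works on a complex constellation with a refined posterior formulation).

```python
import math

def bit_llrs(y, constellation, sigma):
    """Exact per-bit LLRs for a received value y over AWGN with std sigma.
    constellation: list of (symbol, bit_tuple) pairs (real-valued, for
    clarity). LLR_i = log P(b_i=0 | y) - log P(b_i=1 | y), uniform priors."""
    nbits = len(constellation[0][1])
    llrs = []
    for i in range(nbits):
        num = sum(math.exp(-(y - s) ** 2 / (2 * sigma ** 2))
                  for s, bits in constellation if bits[i] == 0)
        den = sum(math.exp(-(y - s) ** 2 / (2 * sigma ** 2))
                  for s, bits in constellation if bits[i] == 1)
        llrs.append(math.log(num / den))
    return llrs

# BPSK special case: symbols +1 (bit 0) and -1 (bit 1).
bpsk = [(+1.0, (0,)), (-1.0, (1,))]
```

    For BPSK the sums collapse to single terms and the expression reduces to the well-known closed form LLR = 2y/σ², a useful sanity check on any LLR implementation.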

  4. Two new Controlled not Gate Based Quantum Secret Sharing Protocols without Entanglement Attenuation

    NASA Astrophysics Data System (ADS)

    Zhu, Zhen-Chao; Hu, Ai-Qun; Fu, An-Min

    2016-05-01

    In this paper, we propose two new controlled-NOT gate based quantum secret sharing protocols. In these two protocols each photon travels only once, which guarantees that agents located far apart can derive the dealer's secret without suffering from the entanglement attenuation problem. The protocols are secure against the Trojan horse attack, the intercept-resend attack, the entangle-measure attack and the entanglement-swapping attack. The theoretical qubit efficiency of these two protocols can approach 100%, since all entangled states, except those used for eavesdropping checks, can be used for the final secret sharing.

  5. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  7. Risk factors of learning disabilities in Chinese children in Wuhan.

    PubMed

    Yao, Bin; Wu, Han-Rong

    2003-12-01

    To investigate the prevalence rate of learning disabilities (LD) in Chinese children, to explore related risk factors, and to provide a theoretical basis for preventing such disabilities. One thousand one hundred and fifty-one children were randomly selected from primary schools. According to the criteria set by ICD-10, 118 children diagnosed with LD were classified into the study group. Four hundred and ninety-one children were classified into the normal control group, and five hundred and forty-two children into the excellent control group. The study instruments included the PRS (the Pupil Rating Scale Revised for screening learning disabilities), Conners' child behavior checklist completed by parents, and the YG-WR character checklist. The prevalence rate of LD in Chinese children was 10.3%. Significant differences in the scores of Conners' behavior checklist were observed between LD and normally learning children, and between the LD group and the excellent group (P < 0.05). The study further showed that individual differences in character between the LD group and the control groups persisted even after controlling for individual differences in age, IQ and gender. Some possible causal explanations contributing to LD were improper teaching by parents, low educational level of the parents, and children's characteristics and social relationships. These data underscore the fact that LD is a serious national public health problem in China. LD results from a number of factors. Good studying and living environments should be created for children with LD.

  8. Modelling vertical error in LiDAR-derived digital elevation models

    NASA Astrophysics Data System (ADS)

    Aguilar, Fernando J.; Mills, Jon P.; Delgado, Jorge; Aguilar, Manuel A.; Negreiros, J. G.; Pérez, José L.

    2010-01-01

    A hybrid theoretical-empirical model has been developed for modelling the error in LiDAR-derived digital elevation models (DEMs) of non-open terrain. The theoretical component seeks to model the propagation of the sample data error (SDE), i.e. the error from light detection and ranging (LiDAR) data capture of ground sampled points in open terrain, towards interpolated points. The interpolation methods used for infilling gaps may produce a non-negligible error that is referred to as gridding error. In this case, interpolation is performed using an inverse distance weighting (IDW) method with the local support of the five closest neighbours, although it would be possible to utilize other interpolation methods. The empirical component refers to what is known as "information loss". This is the error purely due to modelling the continuous terrain surface from only a discrete number of points, plus the error arising from the interpolation process. The SDE must previously be calculated from a suitable number of check points located in open terrain, and it is assumed that the LiDAR point density was sufficiently high to neglect the gridding error. For model calibration, data for 29 study sites, 200×200 m in size, belonging to different areas around Almería province, south-east Spain, were acquired by means of stereo photogrammetric methods. The developed methodology was validated against two different LiDAR datasets. The first dataset was an Ordnance Survey (OS) LiDAR survey carried out over a region of Bristol in the UK. The second dataset was an area located in the Gador mountain range, south of Almería province, Spain. Both terrain slope and sampling density were incorporated in the empirical component through the calibration phase, resulting in a very good agreement between predicted and observed data (R² = 0.9856; p < 0.001). In validation, the observed vertical errors for Bristol, corresponding to different LiDAR point densities, offered a reasonably good fit to the predicted errors. Even better results were achieved in the more rugged morphology of the Gador mountain range dataset. The findings presented in this article could be used as a guide for the selection of appropriate operational parameters (essentially point density, in order to optimize survey cost) in projects related to LiDAR survey of non-open terrain, for instance projects dealing with forestry applications.
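    The IDW interpolation with the local support of the five closest neighbours, as used for gap infilling above, can be sketched as follows (a minimal 2D illustration with toy sample points, not the authors' production code):

```python
import math

def idw_interpolate(samples, qx, qy, k=5, power=2):
    """Inverse distance weighting with the local support of the k closest
    neighbours. samples: list of (x, y, z) ground points. Returns the
    estimated elevation z at query point (qx, qy)."""
    nearest = sorted(samples, key=lambda s: (s[0] - qx) ** 2 + (s[1] - qy) ** 2)[:k]
    weights = []
    for x, y, z in nearest:
        d = math.hypot(x - qx, y - qy)
        if d == 0:
            return z  # query coincides with a sample point: exact value
        weights.append((1 / d ** power, z))
    wsum = sum(w for w, _ in weights)
    return sum(w * z for w, z in weights) / wsum
```

    Because the weights are positive and sum to one after normalization, the interpolated elevation always lies within the range of its supporting neighbours; the gridding error discussed in the abstract comes from how well this weighted average tracks the true surface between samples.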

  9. Combining Static Model Checking with Dynamic Enforcement Using the Statecall Policy Language

    NASA Astrophysics Data System (ADS)

    Madhavapeddy, Anil

    Internet protocols encapsulate a significant amount of state, making the host software complex to implement. In this paper, we define the Statecall Policy Language (SPL), which provides a usable middle ground between ad hoc coding and formal reasoning. It enables programmers to embed automata in their code which can be statically model-checked using SPIN and dynamically enforced at run time. The performance overheads are minimal, and the automata also provide higher-level debugging capabilities. We also describe some practical uses of SPL by describing the automata used in an SSH server written entirely in OCaml/SPL.
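    The dynamic-enforcement half of this idea can be sketched with a minimal runtime-checked automaton. This is illustrative only, in the spirit of enforced statecalls rather than SPL syntax; the SSH-like state and event names below are hypothetical.

```python
class ProtocolAutomaton:
    """Minimal runtime enforcement of an embedded protocol automaton.
    transitions: {(state, event): next_state}; any event not allowed in
    the current state raises immediately, localizing protocol bugs."""
    def __init__(self, start, transitions):
        self.state = start
        self.transitions = transitions

    def fire(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            raise RuntimeError(
                f"illegal event {event!r} in state {self.state!r}")
        self.state = self.transitions[key]
        return self.state

# Hypothetical fragment of an SSH-like handshake ordering.
ssh = ProtocolAutomaton("closed", {
    ("closed", "version_exchange"): "keyed",
    ("keyed", "auth_request"): "authed",
    ("authed", "open_channel"): "session",
})
```

    The same transition table, expressed as an automaton, is what a static model checker such as SPIN can analyze exhaustively; the runtime check then guarantees the implementation never strays from the verified model.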

  10. String tensions in deformed Yang-Mills theory

    NASA Astrophysics Data System (ADS)

    Poppitz, Erich; Shalchian T., M. Erfan

    2018-01-01

    We study k-strings in deformed Yang-Mills (dYM) with SU(N) gauge group in the semiclassically calculable regime on R^3× S^1 . Their tensions Tk are computed in two ways: numerically, for 2 ≤ N ≤ 10, and via an analytic approach using a re-summed perturbative expansion. The latter serves both as a consistency check on the numerical results and as a tool to analytically study the large-N limit. We find that dYM k-string ratios Tk/T1 do not obey the well-known sine- or Casimir-scaling laws. Instead, we show that the ratios Tk/T1 are bound above by a square root of Casimir scaling, previously found to hold for stringlike solutions of the MIT Bag Model. The reason behind this similarity is that dYM dynamically realizes, in a theoretically controlled setting, the main model assumptions of the Bag Model. We also compare confining strings in dYM and in other four-dimensional theories with abelian confinement, notably Seiberg-Witten theory, and show that the unbroken Z_N center symmetry in dYM leads to different properties of k-strings in the two theories; for example, a "baryon vertex" exists in dYM but not in softly-broken Seiberg-Witten theory. Our results also indicate that, at large values of N, k-strings in dYM do not become free.
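    The three scaling laws compared above have simple closed forms; assuming the standard sine and Casimir scaling formulas for SU(N) k-string tension ratios, they can be computed and compared directly (an illustrative sketch, not the paper's semiclassical calculation):

```python
import math

def sine_scaling(k, n):
    """Sine law: T_k/T_1 = sin(pi k / N) / sin(pi / N)."""
    return math.sin(math.pi * k / n) / math.sin(math.pi / n)

def casimir_scaling(k, n):
    """Casimir law: T_k/T_1 = k (N - k) / (N - 1)."""
    return k * (n - k) / (n - 1)

def sqrt_casimir_scaling(k, n):
    """Square root of Casimir scaling, the upper bound found for dYM
    k-string ratios (and for Bag Model string-like solutions)."""
    return math.sqrt(casimir_scaling(k, n))
```

    Since the Casimir ratio is at least 1 for 1 ≤ k ≤ N−1, its square root always lies below it, making the square-root bound markedly tighter than either conventional law, which is the point of the comparison in the abstract.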

  11. Quasi-static modeling of human limb for intra-body communications with experiments.

    PubMed

    Pun, Sio Hang; Gao, Yue Ming; Mak, PengUn; Vai, Mang I; Du, Min

    2011-11-01

    In recent years, the number of wearable devices on the human body has grown steadily. These devices serve many purposes: personal entertainment, communication, emergency missions, health-care supervision, delivery, etc. Sharing information among devices scattered across the human body requires a body area network (BAN) or body sensor network (BSN). However, implementations of the BAN/BSN based on conventional wireless technologies give suboptimal results, mainly because the stringent requirements of light weight, miniaturization, energy efficiency, security, and low electromagnetic interference greatly limit the resources available for the communication modules. The newly developed intra-body communication (IBC) can alleviate most of these problems. This technique, which employs the human body itself as the communication channel, could be an innovative networking method for sensors and devices on the human body. To encourage research and development of IBC, the authors aim to lay a better and more formal theoretical foundation for it. They propose a multilayer mathematical model, based on volume conductor theory, for galvanic-coupling IBC on a human limb that accounts for the inhomogeneous properties of human tissue. By introducing and checking quasi-static approximation criteria, Maxwell's equations are decoupled, and a capacitive effect is included in the governing equation for further improvement. Finally, the accuracy and potential of the model are examined against both in vitro and in vivo experimental results.
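
    The quasi-static criterion mentioned above amounts to checking that displacement currents are negligible relative to conduction currents, i.e. that the ratio ωε/σ is much smaller than 1 at the operating frequency. A sketch with illustrative tissue values (not the paper's):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def quasi_static_ratio(freq_hz, eps_r, sigma):
    """Ratio of displacement to conduction current, omega*eps/sigma.
    The quasi-static (decoupled-Maxwell) treatment is justified when
    this ratio is much smaller than 1."""
    omega = 2 * math.pi * freq_hz
    return omega * eps_r * EPS0 / sigma

# Illustrative values for muscle-like tissue near 100 kHz (assumed here,
# not taken from the paper): relative permittivity ~8000, conductivity
# ~0.36 S/m.
ratio = quasi_static_ratio(1e5, eps_r=8000, sigma=0.36)
print(f"omega*eps/sigma = {ratio:.3f}")
```

    With these numbers the ratio is around 0.1, well below unity, which is the kind of check that justifies dropping the fully coupled Maxwell treatment at IBC frequencies.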

  12. Symbolic Analysis of Concurrent Programs with Polymorphism

    NASA Technical Reports Server (NTRS)

    Rungta, Neha Shyam

    2010-01-01

    The current trend of multi-core and multi-processor computing is causing a paradigm shift from inherently sequential to highly concurrent and parallel applications. Certain thread interleavings, data input values, or combinations of both often cause errors in the system. Systematic verification techniques such as explicit state model checking and symbolic execution are extensively used to detect errors in such systems [7, 9]. Explicit state model checking enumerates possible thread schedules and input data values of a program in order to check for errors [3, 9]. To partially mitigate the state space explosion from data input values, symbolic execution techniques substitute data input values with symbolic values [5, 7, 6]. Explicit state model checking and symbolic execution techniques used in conjunction with exhaustive search techniques such as depth-first search are unable to detect errors in medium to large-sized concurrent programs because the number of behaviors caused by data and thread non-determinism is extremely large. We present an overview of abstraction-guided symbolic execution for concurrent programs that detects errors manifested by a combination of thread schedules and data values [8]. The technique generates a set of key program locations relevant in testing the reachability of the target locations. The symbolic execution is then guided along these locations in an attempt to generate a feasible execution path to the error state. This allows the execution to focus in parts of the behavior space more likely to contain an error.

  13. Diagnosis checking of statistical analysis in RCTs indexed in PubMed.

    PubMed

    Lee, Paul H; Tse, Andy C Y

    2017-11-01

    Statistical analysis is essential for reporting the results of randomized controlled trials (RCTs), as well as for evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether its assumptions hold. Our aim was to review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle the assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used, and whether the assumptions of the analysis were tested, were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6% and 7% checked for generalized linear models, Cox proportional hazards models and multilevel models, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnosis checking. Diagnosis checking of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of diagnosis checking of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
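
    A minimal example of the kind of diagnosis checking the survey looks for: a crude moment-based normality diagnostic on an outcome variable. This is a simplified sketch with synthetic data; a real analysis would use a formal test such as Shapiro-Wilk alongside graphical checks.

```python
import math
import random

def skewness(xs):
    """Sample skewness; near 0 for normally distributed data."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / n)
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

def excess_kurtosis(xs):
    """Sample excess kurtosis; near 0 for normally distributed data."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 4 for x in xs) / (n * v ** 2) - 3.0

random.seed(1)
normal_like = [random.gauss(0, 1) for _ in range(5000)]     # synthetic
skewed = [random.expovariate(1.0) for _ in range(5000)]     # synthetic
print(skewness(normal_like), excess_kurtosis(normal_like))  # both near 0
print(skewness(skewed), excess_kurtosis(skewed))            # clearly non-zero
```

    An outcome showing large skewness or excess kurtosis would call for a transformation or a different analysis, which is exactly the reporting step the review found to be missing.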

  14. A review and evaluation of the internal structure and consistency of the Approaches to Teaching Inventory

    NASA Astrophysics Data System (ADS)

    Harshman, Jordan; Stains, Marilyne

    2017-05-01

    This study presents a review of 39 studies that provide evidence for the structural validity and internal consistency of the Approaches to Teaching Inventory (ATI). In addition to this review, we evaluate many alternative factor structures on a sample of 267 first- and second-year chemistry faculty members participating in a professional development program, a sample of instructors for whom the ATI was originally designed. A total of 26 unique factor structures were evaluated. Through robust checking of assumptions, compilation of existing evidence, and new exploratory and confirmatory analyses, we found that there is greater evidence for the structural validity and internal consistency of the 22-item ATI than of the 16-item ATI. Additionally, the evidence supporting the original two-factor and four-factor structures proposed by the ATI authors (focusing on information transmission and conceptual change) was not reproducible, and while alternative models were empirically viable, more theoretical justification is warranted. Recommendations for ATI use and general comments regarding best practices for reporting psychometrics in educational research contexts are discussed.

  15. A critique of the official report on the evacuation of the World Trade Center: continued doubts.

    PubMed

    Hotchkiss, H Lawrence; Aguirre, B E; Best, Eric

    2013-10-01

    This paper criticises the conclusions and the unanswered questions in the National Institute of Standards and Technology (NIST)'s official report on the evacuation of the World Trade Center in New York City, United States, on 11 September 2001. It reviews the extent to which the report disregards several conventional statistical methods and comments on the NIST's refusal to share the machine-readable data file with the scientific community for replication and further analysis. Problems lie in the sampling methods employed, the treatment of missing data, the use of ordinary least squares (OLS) with binary dependent variables, the failure to document the scalability of the scales used, the lack of tests to check for constant error variance, and the absence of overall fit tests of the model. There are also conceptual and theoretical issues, such as the absence in the report of considerations of the influence of group-level processes and their impact on the collective behaviour of evacuating collectivities. © 2013 The Author(s). Disasters © Overseas Development Institute, 2013.

  16. Radiation noise of the bearing applied to the ceramic motorized spindle based on the sub-source decomposition method

    NASA Astrophysics Data System (ADS)

    Bai, X. T.; Wu, Y. H.; Zhang, K.; Chen, C. Z.; Yan, H. P.

    2017-12-01

    This paper mainly focuses on the calculation and analysis on the radiation noise of the angular contact ball bearing applied to the ceramic motorized spindle. The dynamic model containing the main working conditions and structural parameters is established based on dynamic theory of rolling bearing. The sub-source decomposition method is introduced in for the calculation of the radiation noise of the bearing, and a comparative experiment is adopted to check the precision of the method. Then the comparison between the contribution of different components is carried out in frequency domain based on the sub-source decomposition method. The spectrum of radiation noise of different components under various rotation speeds are used as the basis of assessing the contribution of different eigenfrequencies on the radiation noise of the components, and the proportion of friction noise and impact noise is evaluated as well. The results of the research provide the theoretical basis for the calculation of bearing noise, and offers reference to the impact of different components on the radiation noise of the bearing under different rotation speed.

  17. Hexagonal pattern instabilities in rotating Rayleigh-Bénard convection of a non-Boussinesq fluid: experimental results.

    PubMed

    Guarino, Alessio; Vidal, Valerie

    2004-06-01

    Motivated by the Küppers-Lortz instability of roll patterns in the presence of rotation, we have investigated the effects of rotation on a hexagonal pattern in Rayleigh-Bénard convection. While several theoretical models have been developed, experimental data cannot be found in the literature. In order to check the validity of the predictions and to study the effects of rotation on the behavior of the system, we present experimental results for non-Boussinesq Rayleigh-Bénard convection with rotation about the vertical axis. Rotation introduces an additional control parameter, namely the dimensionless rotation rate Ω = 2πfd²/ν, where f is the rotation rate (in Hz), d is the thickness of the cell, and ν is the kinematic viscosity. We observe that the cell rotation induces a slow rotation of the pattern in the opposite direction (approximately Ω × 10⁻⁴) in the rotating frame. Moreover, it tends to destroy the convective pattern. No oscillation of the hexagonal pattern over the range of its existence (Ω ≤ 6) has been observed.
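
    The dimensionless rotation rate defined above is straightforward to evaluate. The cell depth, rotation frequency and viscosity below are illustrative values, not those of the actual apparatus.

```python
import math

def dimensionless_rotation_rate(f_hz, d_m, nu_m2s):
    """Omega = 2*pi*f*d^2 / nu for rotating Rayleigh-Benard convection."""
    return 2 * math.pi * f_hz * d_m ** 2 / nu_m2s

# Illustrative (assumed) values: a 2 mm deep cell filled with a fluid of
# kinematic viscosity 1e-6 m^2/s (water-like), rotated at 0.2 Hz.
Omega = dimensionless_rotation_rate(0.2, 2e-3, 1e-6)
print(f"Omega = {Omega:.2f}")
```

    With these numbers Ω is about 5, i.e. inside the Ω ≤ 6 range over which the hexagonal pattern was observed to exist.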

  18. A Pilot Study of Ion - Molecule Reactions at Temperatures Relevant to the Atmosphere of Titan.

    PubMed

    Zymak, Illia; Žabka, Ján; Polášek, Miroslav; Španěl, Patrik; Smith, David

    2016-11-01

    Reliable theoretical models of the chemical kinetics of the ionosphere of Saturn's moon, Titan, are highly dependent on the precision of the rates of the reactions of ambient ions with hydrocarbon molecules at relevant temperatures. A Variable Temperature Selected Ion Flow Tube technique, which has been developed primarily to study these reactions at temperatures within the range of 200-330 K, is briefly described. The flow tube temperature regulation system and the thermalisation of ions are also discussed. Preliminary studies of two reactions have been carried out to check the reliability and efficacy of the kinetics measurements: (i) rate constants of the reaction of CH3+ ions with molecular oxygen were measured at different temperatures, yielding values in agreement with previous ion cyclotron resonance measurements ostensibly made at 300 K; (ii) formation of CH3+ ions in the reaction of N2+ ions with CH4 molecules was studied at temperatures within the range 240-310 K, which showed a small but statistically significant decrease of the ratio of product CH3+ ions to reactant N2+ ions with reaction temperature.

  19. Our experience in the evaluation of the thermal comfort during the space flight and in the simulated space environment

    NASA Astrophysics Data System (ADS)

    Novák, Ludvik

    The paper presents the results of mathematical modelling of the effects of hypogravity on heat output by spontaneous convection. The theoretical considerations were complemented by the "HEAT EXCHANGE 1" experiments performed on the biosatellite "KOSMOS 936". In the second experiment, "HEAT EXCHANGE 2", carried out on board the space laboratory "SALYUT 6", the effect of microgravity on the thermal state of a man during space flight was studied. Direct measurement in weightlessness proved the capacity of the developed electric dynamic katathermometer to directly check the effect of microgravity on heat output by spontaneous convection. The role of impairments of heat partition in man, caused both by microgravity and by inadequate forced convection, is clearly expressed in changes of skin temperature and in the cosmonauts' subjective feeling of thermal comfort. The experimental extension of the elaborated methods for flexible adjustment of the thermal environment to the actual physiological needs of man, and suggestions for further investigation, are outlined.

  20. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, J.; Tolson, B.

    2017-12-01

    The increasing complexity and runtime of environmental models lead to the current situation that calibrating all model parameters, or estimating all of their uncertainties, is often computationally infeasible. Hence, techniques that determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state of the art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate that the convergence test is independent of the SA method, we applied it to two widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that no further model evaluation is necessary, which enables checking of already processed sensitivity results. This is one step towards reliable and transferable, published sensitivity results.
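
    For contrast with MVA, the conventional bootstrapping check that the abstract describes as potentially expensive can be sketched as follows: resampling already-computed elementary effects to put a confidence interval on a Morris-style index. The elementary effects here are synthetic; in practice each one costs extra model runs, which is what makes this approach expensive.

```python
import random
import statistics

def bootstrap_ci(effects, n_boot=2000, alpha=0.05, seed=0):
    """Bootstrap confidence interval for a Morris-style sensitivity index
    (the mean of absolute elementary effects, often called mu*)."""
    rng = random.Random(seed)
    n = len(effects)
    stats = []
    for _ in range(n_boot):
        sample = [effects[rng.randrange(n)] for _ in range(n)]
        stats.append(statistics.fmean(abs(e) for e in sample))
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

random.seed(42)
effects = [random.gauss(1.0, 0.3) for _ in range(50)]  # toy elementary effects
lo, hi = bootstrap_ci(effects)
print(f"mu* 95% CI: [{lo:.3f}, {hi:.3f}]")
```

    A narrow interval suggests the index has converged for the given budget; a wide one signals that more model runs are needed, which is the question MVA answers without any resampling of model output.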

  1. Neighborhood deprivation is strongly associated with participation in a population-based health check.

    PubMed

    Bender, Anne Mette; Kawachi, Ichiro; Jørgensen, Torben; Pisinger, Charlotta

    2015-01-01

    We sought to examine whether neighborhood deprivation is associated with participation in a large population-based health check. Such analyses will help answer the question whether health checks, which are designed to meet the needs of residents in deprived neighborhoods, may increase participation and prove to be more effective in preventing disease. In Europe, no study has previously looked at the association between neighborhood deprivation and participation in a population-based health check. The study population comprised 12,768 persons invited for a health check including screening for ischemic heart disease and lifestyle counseling. The study population was randomly drawn from a population of 179,097 persons living in 73 neighborhoods in Denmark. Data on neighborhood deprivation (percentage with basic education, with low income and not in work) and individual socioeconomic position were retrieved from national administrative registers. Multilevel regression analyses with log links and binary distributions were conducted to obtain relative risks, intraclass correlation coefficients and proportional change in variance. Large differences between neighborhoods existed in both deprivation levels and neighborhood health check participation rate (mean 53%; range 35-84%). In multilevel analyses adjusted for age and sex, higher levels of all three indicators of neighborhood deprivation and a deprivation score were associated with lower participation in a dose-response fashion. Persons living in the most deprived neighborhoods had up to 37% decreased probability of participating compared to those living in the least deprived neighborhoods. Inclusion of individual socioeconomic position in the model attenuated the neighborhood deprivation coefficients, but all except for income deprivation remained statistically significant. 
Neighborhood deprivation was associated with participation in a population-based health check in a dose-response manner, in which increasing neighborhood deprivation was associated with decreasing participation. This suggests the need to develop preventive health checks tailored to deprived neighborhoods.

  2. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed; consequently, the 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black boxes. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motivation, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  3. CheckMyMetal: a macromolecular metal-binding validation tool

    PubMed Central

    Porebski, Przemyslaw J.

    2017-01-01

    Metals are essential in many biological processes, and metal ions are modeled in roughly 40% of the macromolecular structures in the Protein Data Bank (PDB). However, a significant fraction of these structures contain poorly modeled metal-binding sites. CheckMyMetal (CMM) is an easy-to-use metal-binding site validation server for macromolecules that is freely available at http://csgid.org/csgid/metal_sites. The CMM server can detect incorrect metal assignments as well as geometrical and other irregularities in the metal-binding sites. Guidelines for metal-site modeling and validation in macromolecules are illustrated by several practical examples grouped by the type of metal. These examples show CMM users (and crystallographers in general) problems they may encounter during the modeling of a specific metal ion. PMID:28291757

  4. Method and system to perform energy-extraction based active noise control

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul (Inventor); Joshi, Suresh M. (Inventor)

    2009-01-01

    A method to provide active noise control to reduce noise and vibration in reverberant acoustic enclosures such as aircraft, vehicles, appliances, instruments, industrial equipment and the like is presented. A continuous-time multi-input multi-output (MIMO) state space mathematical model of the plant is obtained via analytical modeling and system identification. Compensation is designed to render the mathematical model passive in the sense of mathematical system theory. The compensated system is checked to ensure robustness of the passive property of the plant. The check ensures that the passivity is preserved if the mathematical model parameters are perturbed from nominal values. A passivity-based controller is designed and verified using numerical simulations and then tested. The controller is designed so that the resulting closed-loop response shows the desired noise reduction.
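
    The frequency-domain intuition behind the robustness check described above can be sketched by sampling Re G(jω) of a rational transfer function: a passive (positive-real) system has non-negative real part at all frequencies. The transfer functions below are toy scalar examples, not the MIMO plant model of the patent, and sampling a grid is only a necessary check, not a proof.

```python
def eval_poly(coeffs, s):
    """Evaluate a polynomial (coefficients in descending powers) at s."""
    result = 0j
    for c in coeffs:
        result = result * s + c
    return result

def freq_response(num, den, w):
    """G(jw) for a rational transfer function num(s)/den(s)."""
    s = 1j * w
    return eval_poly(num, s) / eval_poly(den, s)

def looks_passive(num, den, freqs):
    """Necessary (sampled) condition for passivity: Re G(jw) >= 0 on a
    frequency grid. A proof needs more than sampling; this is the kind
    of quick robustness check described in the abstract."""
    return all(freq_response(num, den, w).real >= -1e-12 for w in freqs)

freqs = [10 ** (k / 10) for k in range(-30, 31)]  # 0.001 .. 1000 rad/s
print(looks_passive([1], [1, 1], freqs))      # G = 1/(s+1): passive
print(looks_passive([1, -1], [1, 1], freqs))  # G = (s-1)/(s+1): not passive
```

    Re[1/(1+jω)] = 1/(1+ω²) is positive everywhere, while (s−1)/(s+1) has negative real part below ω = 1, so the grid check separates the two cases.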

  5. Model Selection Methods for Mixture Dichotomous IRT Models

    ERIC Educational Resources Information Center

    Li, Feiming; Cohen, Allan S.; Kim, Seock-Ho; Cho, Sun-Joo

    2009-01-01

    This study examines model selection indices for use with dichotomous mixture item response theory (IRT) models. Five indices are considered: Akaike's information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), the pseudo-Bayes factor (PsBF), and posterior predictive model checks (PPMC). The five…
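
    Two of the indices named above reduce to simple formulas given a model's maximized log-likelihood. The fitted values below are hypothetical, chosen to show that AIC and BIC can disagree about whether an extra latent class is worth its parameters.

```python
import math

def aic(log_lik, k):
    """Akaike information criterion: 2k - 2 log L (smaller is better)."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Bayesian information criterion: k log n - 2 log L; penalizes
    parameters more heavily as the sample size n grows."""
    return k * math.log(n) - 2 * log_lik

# Hypothetical fits: a 1-class IRT model vs a 2-class mixture IRT model.
ll_1class, k1 = -5210.4, 20
ll_2class, k2 = -5150.9, 41
n = 1000
print("AIC:", aic(ll_1class, k1), aic(ll_2class, k2))
print("BIC:", bic(ll_1class, k1, n), bic(ll_2class, k2, n))
```

    With these (made-up) log-likelihoods AIC prefers the 2-class mixture while BIC prefers the 1-class model, which is exactly the kind of disagreement that motivates comparing several indices.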

  6. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    PubMed

    Ernst, Anja F; Albers, Casper J

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
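
    The paper's central point, that it is the errors rather than the raw variables that should be normal, can be demonstrated directly on synthetic data: regress an outcome on a strongly skewed predictor with normal errors and compare the skewness of the predictor against that of the residuals.

```python
import math
import random

def ols_fit(xs, ys):
    """Simple least-squares line fit, returning (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

def skewness(v):
    """Sample skewness; near 0 for normally distributed data."""
    n = len(v)
    m = sum(v) / n
    s = math.sqrt(sum((x - m) ** 2 for x in v) / n)
    return sum((x - m) ** 3 for x in v) / (n * s ** 3)

random.seed(7)
xs = [random.expovariate(1.0) for _ in range(4000)]    # skewed predictor
ys = [2.0 + 3.0 * x + random.gauss(0, 1) for x in xs]  # normal errors
a, b = ols_fit(xs, ys)
resid = [y - (a + b * x) for x, y in zip(xs, ys)]
# It is the residuals, not the raw variables, that should look normal:
print(skewness(xs), skewness(resid))
```

    The predictor is heavily skewed yet the residuals are symmetric, so this regression is perfectly valid; checking normality of the variables themselves would wrongly reject it.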

  7. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    PubMed Central

    Ernst, Anja F.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking. PMID:28533971

  8. An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.

    PubMed

    Saccomani, Maria Pia

    2011-08-01

    Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.
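
    DAISY decides global structural identifiability symbolically via differential algebra. A much cruder, purely local numerical check (not DAISY's algorithm) is to test whether the output's sensitivity directions with respect to different parameters are collinear: if they are, the data cannot distinguish the parameters. The toy models below are assumptions for illustration, not HIV models from the literature.

```python
import math

def sensitivities(model, theta, times, h=1e-6):
    """Central finite-difference sensitivities dy/dtheta_i at sample times."""
    cols = []
    for i in range(len(theta)):
        tp = list(theta); tp[i] += h
        tm = list(theta); tm[i] -= h
        cols.append([(model(tp, t) - model(tm, t)) / (2 * h) for t in times])
    return cols

def collinearity(c1, c2):
    """|cosine| between two sensitivity columns; a value near 1 means the
    two parameters are locally indistinguishable from the data."""
    dot = sum(a * b for a, b in zip(c1, c2))
    n1 = math.sqrt(sum(a * a for a in c1))
    n2 = math.sqrt(sum(b * b for b in c2))
    return abs(dot) / (n1 * n2)

times = [0.1 * k for k in range(1, 20)]
# Identifiable toy model: y = p2 * exp(-p1 * t).
good = lambda p, t: p[1] * math.exp(-p[0] * t)
# Non-identifiable toy model: p1, p2 enter only through their product.
bad = lambda p, t: p[0] * p[1] * math.exp(-t)
s_good = sensitivities(good, [1.0, 2.0], times)
s_bad = sensitivities(bad, [1.0, 2.0], times)
print(collinearity(*s_good), collinearity(*s_bad))
```

    In the non-identifiable model the two sensitivity columns are exactly proportional (cosine 1), while the identifiable model separates them; DAISY establishes the same kind of conclusion globally and a priori rather than at one parameter point.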

  9. 75 FR 36298 - Airworthiness Directives; McDonnell Douglas Corporation Model DC-8-31, DC-8-32, DC-8-33, DC-8-41...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-25

    ... Airworthiness Limitations inspections (ALIs). This proposed AD results from a design review of the fuel tank...,'' and also adds ALI 30-1 for a pneumatic system decay check to minimize the risk of hot air impingement... 5, 2010, adds ALI 28-1, ``DC-8 Alternate and Center Auxiliary Tank Fuel Pump Control Systems Check...

  10. Development of a Model of Soldier Effectiveness: Retranslation Materials and Results

    DTIC Science & Technology

    1987-05-01

    covering financial responsibility, particularly the family checking account. Consequently, the bad check rate for the unit dropped from 70 a month...Alcohol, and Aggressive Acts" Showing prudence in financial management and responsibility in personal/family matters; avoiding alcohol and other drugs or...threatening others, etc. versus "Acting irresponsibly in financial or personal/family affairs such that command time is required to counsel or otherwise

  11. Increasing dimension of structures by 4D printing shape memory polymers via fused deposition modeling

    NASA Astrophysics Data System (ADS)

    Hu, G. F.; Damanpack, A. R.; Bodaghi, M.; Liao, W. H.

    2017-12-01

    The main objective of this paper is to introduce a 4D printing method to program shape memory polymers (SMPs) during the fabrication process. Fused deposition modeling (FDM), a filament-based printing method, is employed to program the SMPs while depositing the material. This method is implemented to fabricate complicated polymeric structures with self-bending features, without the need for any post-fabrication programming. Experiments are conducted to demonstrate the feasibility of one-dimensional (1D)-to-2D and 2D-to-3D self-bending. It is shown that 3D-printed plate structures can transform into masonry-inspired 3D curved shell structures simply by heating. Good reliability of SMP programming during the printing process is also demonstrated. A 3D macroscopic constitutive model is established to simulate the thermo-mechanical features of the printed SMPs. Governing equations are also derived to simulate the programming mechanism during the printing process and the shape change of self-bending structures. In this respect, a finite element formulation is developed considering von Kármán geometric nonlinearity and solved by implementing an iterative Newton-Raphson scheme. The accuracy of the computational approach is checked against experimental results. It is demonstrated that the theoretical model is able to replicate the main characteristics observed in the experiments. This research is likely to advance the state of the art of FDM 4D printing, and provides pertinent results and a computational tool that are instrumental in the design of smart materials and structures with self-bending features.
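
    The Newton-Raphson scheme used to solve the nonlinear finite element equations can be illustrated on a scalar toy "stiffness" residual; the actual formulation is a multi-degree-of-freedom system with von Kármán nonlinearity, so this is only the iteration pattern, not the paper's solver.

```python
def newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Iteratively solve residual(x) = 0, the update used to advance each
    load step of a nonlinear finite-element problem (scalar sketch)."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            return x
        x -= r / jacobian(x)
    raise RuntimeError("Newton-Raphson did not converge")

# Toy nonlinear "stiffness" equation: k(x)*x - f = 0 with k(x) = 1 + x^2,
# i.e. x + x^3 = 2, whose exact solution is x = 1.
f = 2.0
root = newton_raphson(lambda x: (1 + x * x) * x - f,
                      lambda x: 1 + 3 * x * x,
                      x0=0.5)
print(root)
```

    Each iteration linearizes the residual with the tangent (Jacobian) and solves for a correction, converging quadratically near the solution, which is why the scheme is the standard choice for geometrically nonlinear FE analyses.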

  12. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Tolson, Bryan

    2017-04-01

    The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters or the estimation of all of their uncertainty is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures focus then only on these subsets of parameters and are hence less computational demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If any, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might as well become computationally expensive in case of large model outputs and a high number of bootstraps. We, therefore, present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method independency of the convergence testing method, we applied it to three widely used, global SA methods: the screening method known as Morris method or Elementary Effects (Morris 1991, Campolongo et al., 2000), the variance-based Sobol' method (Solbol' 1993, Saltelli et al. 2010) and a derivative-based method known as Parameter Importance index (Goehler et al. 2013). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of aforementioned three methods are known. 
This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. Subsequently, we focus on the model-independency by testing the frugal method using the hydrologic model mHM (www.ufz.de/mhm) with about 50 model parameters. The results show that the new frugal method is able to test the convergence and therefore the reliability of SA results in an efficient way. The appealing feature of this new technique is the necessity of no further model evaluation and therefore enables checking of already processed (and published) sensitivity results. This is one step towards reliable and transferable, published sensitivity results.

  13. Sub-pixel analysis to support graphic security after scanning at low resolution

    NASA Astrophysics Data System (ADS)

    Haas, Bertrand; Cordery, Robert; Gou, Hongmei; Decker, Steve

    2006-02-01

    Whether in the domain of audio, video or finance, our world tends to become increasingly digital. However, for diverse reasons, the transition from analog to digital is often much extended in time, and proceeds by long steps (and sometimes never completes). One such step is the conversion of information on analog media to digital information. We focus in this paper on the conversion (scanning) of printed documents to digital images. Analog media have the advantage over digital channels that they can harbor much imperceptible information that can be used for fraud detection and forensic purposes. But this secondary information usually fails to be retrieved during the conversion step. This is particularly relevant since the Check-21 Act (Check Clearing for the 21st Century Act) became effective in 2004 and allows images of checks to be handled by banks as usual paper checks. We use here this situation of check scanning as our primary benchmark for graphic security features after scanning. We will first present a quick review of the most common graphic security features currently found on checks, with their specific purpose, qualities and disadvantages, and we demonstrate their poor survivability after scanning in the average scanning conditions expected from the Check-21 Act. We will then present a novel method of measurement of distances between and rotations of line elements in a scanned image: Based on an appropriate print model, we refine direct measurements to an accuracy beyond the size of a scanning pixel, so we can then determine expected distances, periodicity, sharpness and print quality of known characters, symbols and other graphic elements in a document image. Finally we will apply our method to fraud detection of documents after gray-scale scanning at 300 dpi resolution. 
We show in particular that alterations on legitimate checks or copies of checks can be successfully detected by measuring with sub-pixel accuracy the irregularities inherently introduced by the illegitimate process.
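The core measurement idea, refining a location estimate below the pixel grid, can be illustrated with a standard parabolic fit through three samples around a sampled maximum (a generic sketch, not the paper's print-model-based method):

```python
def subpixel_peak(y_left, y_center, y_right):
    """Refine the location of a sampled maximum to sub-pixel accuracy.

    Fits a parabola through three neighbouring samples and returns the
    offset (in pixels, within [-0.5, 0.5]) of the extremum from the
    central sample.
    """
    denom = y_left - 2.0 * y_center + y_right
    if denom == 0.0:  # flat triple: no refinement possible
        return 0.0
    return 0.5 * (y_left - y_right) / denom

# A line profile sampled on a pixel grid; the true peak lies at x = 10.3.
true_peak = 10.3
profile = [-((x - true_peak) ** 2) for x in range(21)]  # concave profile
x0 = max(range(1, 20), key=lambda x: profile[x])        # integer-pixel maximum
offset = subpixel_peak(profile[x0 - 1], profile[x0], profile[x0 + 1])
estimate = x0 + offset   # recovers 10.3 despite the 1-pixel sampling
```

For an exactly parabolic profile the refinement is exact; for real scans the print model supplies the expected local shape.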

  14. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified, and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model-based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal analysis technique, model checking, which was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
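Verifying a safety property on a discrete model amounts to checking that no reachable state violates it. A minimal explicit-state sketch (the mode logic and property below are invented for illustration, not the actual FGS requirements model):

```python
from collections import deque

# Hypothetical miniature mode logic: the safety property is "the
# autopilot is never engaged while no lateral mode is active".
transitions = {
    ("AP_OFF", "NO_MODE"): [("AP_OFF", "ROLL")],
    ("AP_OFF", "ROLL"): [("AP_ON", "ROLL"), ("AP_OFF", "NO_MODE")],
    ("AP_ON", "ROLL"): [("AP_OFF", "ROLL")],
}

def violates_safety(state):
    autopilot, mode = state
    return autopilot == "AP_ON" and mode == "NO_MODE"

def check_safety(initial):
    """Breadth-first reachability: return a counterexample path ending in
    a violating state, or None if the safety property holds."""
    frontier = deque([[initial]])
    seen = {initial}
    while frontier:
        path = frontier.popleft()
        if violates_safety(path[-1]):
            return path
        for nxt in transitions.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

result = check_safety(("AP_OFF", "NO_MODE"))  # None: the property holds
```

Industrial model checkers apply the same reachability idea symbolically, which is what keeps "reasonable state spaces" tractable.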

  15. Space shuttle prototype check valve development

    NASA Technical Reports Server (NTRS)

    Tellier, G. F.

    1976-01-01

    Contaminant-resistant seal designs and a dynamically stable prototype check valve for the orbital maneuvering and reaction control helium pressurization systems of the space shuttle were developed. Polymer and carbide seal models were designed and tested. Perfluoroelastomers compatible with N2O4 and N2H4-type propellants were evaluated and compared with Teflon in flat and captive seal models. Low-load sealing and contamination resistance tests demonstrated cutter seal superiority over polymer seals. Ceramic and carbide materials were evaluated for N2O4 service using exposure to RFNA as a worst-case screen; chemically vapor deposited tungsten carbide was shown to be impervious to the acid after 6 months of immersion. A unique carbide shell poppet/cutter seat check valve was designed and tested to demonstrate low cracking pressure (< 2.0 psid), dynamic stability under all test bench flow conditions, contamination resistance (0.001 inch CRES wires cut with a 1.5 pound seat load) and long life of 100,000 cycles (leakage < 1.0 scc/hr helium from 0.1 to 400 psig).

  16. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
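A discrete-time birth/death chain of this kind, together with the input error checking the brief emphasizes, can be sketched as follows (parameter values are illustrative):

```python
import random

def simulate_birth_death(n0, birth_rate, death_rate, steps, seed=0):
    """One trajectory of a discrete-time birth/death chain: at each step
    every individual independently gives birth or dies with the given
    per-step probabilities.  Input checking mirrors the report's emphasis
    on thorough validation of user-supplied parameters."""
    if n0 < 0:
        raise ValueError("initial population must be non-negative")
    for p, name in ((birth_rate, "birth_rate"), (death_rate, "death_rate")):
        if not 0.0 <= p <= 1.0:
            raise ValueError(f"{name} must be a probability in [0, 1]")
    rng = random.Random(seed)
    n, history = n0, [n0]
    for _ in range(steps):
        births = sum(rng.random() < birth_rate for _ in range(n))
        deaths = sum(rng.random() < death_rate for _ in range(n))
        n = max(n + births - deaths, 0)
        history.append(n)
    return history

traj = simulate_birth_death(100, birth_rate=0.02, death_rate=0.02, steps=50)
```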

  17. Symbolic discrete event system specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Chi, Sungdo

    1992-01-01

    Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
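The consistency check reduces to linear-programming feasibility. As a simplified sketch, constraints that are linear in a single symbolic time parameter t can be checked by interval intersection (the paper's algorithm handles the general multivariate case with a full LP feasibility test):

```python
def feasible(constraints, lower=0.0):
    """Consistency check for constraints a*t + b >= 0 over a single
    symbolic time parameter t >= lower.  Each constraint is reduced to a
    half-line on t, and the intersection is tested for non-emptiness."""
    lo, hi = lower, float("inf")
    for a, b in constraints:
        if a > 0:            # t >= -b/a
            lo = max(lo, -b / a)
        elif a < 0:          # t <= -b/a
            hi = min(hi, -b / a)
        elif b < 0:          # 0*t + b >= 0 with b < 0: contradiction
            return False
    return lo <= hi

# (t - 2 >= 0) and (5 - t >= 0): consistent, t may lie in [2, 5]
ok = feasible([(1.0, -2.0), (-1.0, 5.0)])
# (t - 6 >= 0) and (5 - t >= 0): inconsistent branch, can be pruned
bad = feasible([(1.0, -6.0), (-1.0, 5.0)])
```

Pruning infeasible constraint sets is what keeps the branching trajectory tree manageable.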

  18. Measurement of the cross section of the residues from the 11B-induced reaction on 89Y and 93Nb: Production of 97Ru and 101mRh

    NASA Astrophysics Data System (ADS)

    Kumar, Deepak; Maiti, Moumita

    2017-06-01

    Background: Heavy-ion induced reactions on intermediate-mass targets are complex in nature, even at low energies. To understand these nuclear reaction phenomena in detail, more experimental studies are required over a wide range of energies. Purpose: To investigate heavy-ion reactions by measuring the production cross sections of the residues produced in the 11B-induced reactions on 89Y and 93Nb at low energies, near and above the barrier, and to check the effectiveness of different nuclear models in explaining them. A further aim is to optimize the production parameters of the neutron-deficient, medically relevant radioisotopes 97Ru and 101mRh produced in those reactions. Method: The 11B beam was allowed to impinge on 89Y and 93Nb foils supported by aluminum (Al) catcher foils, arranged in a stack, in the 27.5-58.7 and 30.6-62.3 MeV energy ranges, respectively. Off-line γ-ray spectrometry was carried out after the end of bombardment to measure the activity of the radionuclides produced in each foil, and cross sections were calculated. The measured cross-section data were analyzed in terms of compound and precompound model calculations. Results: The measured cross sections of the 97,95Ru, 96,95,94Tc, 93mMo and 90mY radionuclides produced in the 11B+89Y reaction, and of 101,100,99Pd, 101m,100,99mRh and 97Ru produced in the 11B+93Nb reaction, showed good agreement with model calculations based on the Hauser-Feshbach formulation and the exciton model. Unlike the theoretical estimation, consistent production of 90mY was observed in the 11B+89Y reaction. A substantial pre-equilibrium contribution was noticed in the 3n reaction channel in both reactions. Conclusions: Theoretical estimations confirmed that the major production yields are mostly contributed by the compound reaction process. Pre-equilibrium emissions contributed at the high-energy tail of the 3n channel for both reactions. Moreover, an indirect signature of a direct-reaction influence was also observed in the production of 90mY.

  19. SU-F-T-300: Impact of Electron Density Modeling of the ArcCHECK Cylindrical Diode Array on 3DVH Patient-Specific QA Software Tool Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patwe, P; Mhatre, V; Dandekar, P

    Purpose: 3DVH software is a patient-specific quality assurance tool which estimates the 3D dose to the patient-specific geometry with the help of the Planned Dose Perturbation algorithm. The purpose of this study is to evaluate the impact of the HU value of the ArcCHECK phantom entered in the Eclipse TPS on 3D dose and DVH QA analysis. Methods: The manufacturer of the ArcCHECK phantom provides a CT data set of the phantom and recommends considering it as a homogeneous phantom with an electron density (1.19 g/cc, or 282 HU) close to PMMA. We performed this study with the Eclipse TPS (V13, VMS), a TrueBeam STx (VMS) linac and the ArcCHECK phantom (SNC). Plans were generated for a 6 MV photon beam, 20 cm x 20 cm field size at isocentre and an SPD (source to phantom distance) of 86.7 cm, to deliver 100 cGy at isocentre. The 3DVH software requires the patient's DICOM data generated by the TPS and the plan delivered on the ArcCHECK phantom. Plans were generated in the TPS by assigning different HU values to the phantom. We analyzed the gamma index and the dose profile for all plans along the vertically downward direction of the beam's central axis for entry, exit and isocentre dose. Results: The global gamma passing rate (2% and 2 mm) for the manufacturer-recommended HU value of 282 was 96.3%. Detector entry, isocentre and detector exit doses were 1.9048 (1.9270), 1.00 (1.0199) and 0.5078 (0.527) Gy for TPS (measured), respectively. The global gamma passing rate for an electron density of 1.1302 g/cc was 98.6%. Detector entry, isocentre and detector exit doses were 1.8714 (1.8873), 1.00 (0.9988) and 0.5211 (0.516) Gy for TPS (measured), respectively. Conclusion: The electron density value assigned by the manufacturer does not hold true for every user. Proper modeling of the electron density of the ArcCHECK in the TPS is essential to avoid systematic error in the dose calculation for patient-specific QA.
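The reported 2%/2 mm global gamma passing rates come from a gamma comparison between TPS and measured dose. A minimal 1D version of that analysis (profiles and grid chosen purely for illustration):

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm, dd=0.02, dta_mm=2.0):
    """Global 1D gamma analysis (2%/2 mm by default) between a reference
    (TPS) profile and a measured profile sampled on the same grid.  The
    dose difference is normalised to the reference maximum (global gamma);
    a point passes when its minimum gamma over all reference points <= 1."""
    norm = max(ref)
    passed = 0
    for i, m in enumerate(meas):
        best = float("inf")
        for j, r in enumerate(ref):
            dose_term = ((m - r) / (dd * norm)) ** 2
            dist_term = ((i - j) * spacing_mm / dta_mm) ** 2
            best = min(best, math.sqrt(dose_term + dist_term))
        passed += best <= 1.0
    return 100.0 * passed / len(meas)

# Toy profiles on a 1 mm grid: measurement within 1% of reference everywhere.
reference = [50, 80, 100, 80, 50]
measured = [50.5, 79.5, 100.8, 80.4, 49.6]
rate = gamma_pass_rate(reference, measured, spacing_mm=1.0)  # 100.0
```

Clinical tools interpolate the reference distribution rather than comparing grid points, but the pass/fail criterion is the same.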

  20. Experimental and Theoretical Studies of Volatile Metal Hydroxides

    NASA Technical Reports Server (NTRS)

    Myers, Dwight L.; Jacobson, Nathan S.

    2015-01-01

    Modern superalloys used in the construction of turbomachinery contain a wide range of metals in trace quantities. In addition, metal oxides and silicon dioxide are used to form Thermal Barrier Coatings (TBC) to protect the underlying metal in turbine blades. Formation of volatile hydroxides at elevated temperatures is an important mechanism for corrosion of metal alloys or oxides in combustion environments (N. Jacobson, D. Myers, E. Opila, and E. Copland, J. Phys. Chem. Solids 66, 471-478, 2005). Thermodynamic data is essential to proper design of components of modern gas turbines. It is necessary to first establish the identity of volatile hydroxides formed from the reaction of a given system with high temperature water vapor, and then to determine the equilibrium pressures of the species under operating conditions. Theoretical calculations of reaction energies are an important check of experimental results. This presentation reports results for several important systems: Si-O-H, Cr-O-H, Al-O-H, Ti-O-H, and ongoing studies of Ta-O-H.
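Once a volatile hydroxide species is identified, its equilibrium partial pressure follows from the reaction's Gibbs energy. A sketch for the Si-O-H system, using an invented ΔG° value purely for illustration:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def equilibrium_constant(delta_g_joules, temperature_k):
    """K = exp(-ΔG° / RT) for a volatilisation reaction such as
    SiO2(s) + 2 H2O(g) -> Si(OH)4(g)."""
    return math.exp(-delta_g_joules / (R * temperature_k))

def hydroxide_pressure(k_eq, p_h2o_atm):
    """For the Si(OH)4 reaction above, K = p(Si(OH)4) / p(H2O)^2, so the
    equilibrium hydroxide partial pressure follows directly."""
    return k_eq * p_h2o_atm ** 2

# Illustrative, not measured, values: ΔG° = +200 kJ/mol at 1500 K, 0.5 atm H2O.
k = equilibrium_constant(200e3, 1500.0)
p = hydroxide_pressure(k, 0.5)
```

Even tiny equilibrium pressures matter in a turbine because the flowing gas continuously sweeps the volatile species away.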

  1. Study of the electronic structure of electron-accepting cyano-films: TCNQ versus TCNE.

    PubMed

    Capitán, Maria J; Álvarez, Jesús; Navio, Cristina

    2018-04-18

    In this article, we perform systematic research on the electronic structure of two closely related organic electron-acceptor molecules (TCNQ and TCNE), which are of technological interest due to their outstanding electronic properties. These studies have been performed experimentally by the use of electron spectroscopies (XPS and UPS) and supported theoretically by ab-initio DFT calculations. The cross-check between the two molecules allows us to identify the characteristic electronic features of each part of the molecules and their contribution to the final electronic structure. We describe the nature of the band gap of these materials, and we relate it to the appearance of the shake-up features in the core-level spectra. A band bending and an energy-gap reduction of the aforementioned electronic structure in contact with a metal surface are seen in the experimental results as well as in the theoretical calculations. This behavior implies that the TCNQ thin film accepts electrons from the metal substrate, becoming a Schottky n-junction.

  2. MALDI-MS analysis and theoretical evaluation of olanzapine as a UV laser desorption ionization (LDI) matrix.

    PubMed

    Musharraf, Syed Ghulam; Ameer, Mariam; Ali, Arslan

    2017-01-05

    Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS), being a soft ionization technique, has become a method of choice for high-throughput analysis of proteins and peptides. In this study, we have explored the potential of the atypical anti-psychotic drug olanzapine (OLZ) as a matrix for MALDI-MS analysis of peptides, aided by theoretical studies. Seven small peptides were employed as target analytes to check the performance of olanzapine against the conventional MALDI matrix α-cyano-4-hydroxycinnamic acid (HCCA). All peptides were successfully detected when olanzapine was used as a matrix. Moreover, the peptides angiotensin I and angiotensin II were detected with better S/N ratio and resolution with this method than in their analysis with HCCA. Computational studies were performed to determine the thermochemical properties of olanzapine in order to further evaluate its similarity to MALDI matrices; these properties were found to be in good agreement with the data for existing MALDI matrices.

  3. Exploring the theoretical pathways through which asthma app features can promote adolescent self-management.

    PubMed

    Carpenter, Delesha M; Geryk, Lorie L; Sage, Adam; Arrindell, Courtney; Sleath, Betsy L

    2016-12-01

    Asthma apps often lack strong theoretical underpinnings. We describe how specific features of asthma apps influenced adolescents' self-observation, self-judgment, and self-reactions, which are key constructs of Self-Regulation Theory (SRT). Adolescents (ages 12-16) with persistent asthma (n = 20) used two asthma self-management apps over a 1-week period. During semi-structured interviews, participants identified their asthma goals and the app features that best promoted self-observation and self-judgment and fostered positive self-reactions. Interviews were digitally recorded, transcribed verbatim, and analyzed thematically using MAXQDA. Adolescents' goals were to reduce the impact of asthma on their lives. Adolescents reported that self-check quizzes, reminders, and charting features increased their ability to self-observe and self-judge their asthma, which, in turn, helped them feel more confident that they could manage their asthma independently and keep it well-controlled. Asthma apps can positively influence adolescents' self-management behaviors via increased self-observation, self-judgment, and self-efficacy.

  4. Assembly of a check-patterned CuSx-TiO2 film with an electron-rich pool and its application for the photoreduction of carbon dioxide to methane

    NASA Astrophysics Data System (ADS)

    Lee, Homin; Kwak, Byeong Sub; Park, No-Kuk; Baek, Jeom-In; Ryu, Ho-Jung; Kang, Misook

    2017-01-01

    A new check-patterned CuSx-TiO2 film was designed to improve the photoreduction of CO2 to CH4. The check-patterned CuSx-TiO2 film with a 3D-network microstructure was fabricated by a facile squeeze method. The as-synthesized TiO2 and CuSx powders, as well as the patterned film, were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), X-ray photoelectron spectroscopy (XPS), UV-visible spectroscopy, cyclic voltammetry (CV), and photoluminescence (PL) spectroscopy, as well as photocurrent density and CO2 temperature-programmed desorption (TPD) measurements. Compared to pure CuSx and TiO2, the check-patterned CuSx-TiO2 film exhibited significantly increased adsorption of CO2 on its networked microstructure, attributed to the enlarged interfaces between the microparticles. The check-patterned CuSx-TiO2 film exhibited superior photocatalytic behavior, with 53.2 μmol gcat-1 L-1 of CH4 produced after 8 h of reaction, whereas 18.1 and 7.3 μmol gcat-1 L-1 of CH4 were produced from pure TiO2 and CuSx films under the same reaction conditions, respectively. A model for the enhanced photoactivity of the check-patterned CuSx-TiO2 film was proposed. The results indicate that the check-patterned CuSx-TiO2 material is quite promising as a photocatalyst for the reduction of CO2 to CH4.

  5. Formal Validation of Fault Management Design Solutions

    NASA Technical Reports Server (NTRS)

    Gibson, Corrina; Karban, Robert; Andolfato, Luigi; Day, John

    2013-01-01

    The work presented in this paper describes an approach used to develop SysML modeling patterns to express the behavior of fault protection, test the model's logic by performing fault injection simulations, and verify the fault protection system's logical design via model checking. A representative example, using a subset of the fault protection design for the Soil Moisture Active-Passive (SMAP) system, was modeled with SysML State Machines and JavaScript as Action Language. The SysML model captures interactions between relevant system components and system behavior abstractions (mode managers, error monitors, fault protection engine, and devices/switches). Development of a method to implement verifiable and lightweight executable fault protection models enables future missions to have access to larger fault test domains and verifiable design patterns. A tool-chain to transform the SysML model to jpf-Statechart compliant Java code and then verify the generated code via model checking was established. Conclusions and lessons learned from this work are also described, as well as potential avenues for further research and development.
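The behavior-testing step, driving the model through fault-injection scenarios, can be miniaturized as follows (the engine, events, and device names are invented for illustration and are unrelated to the actual SMAP design):

```python
# A toy fault-protection sketch: an error monitor raises a fault event,
# and the fault-protection engine responds by switching to a redundant
# device before returning to nominal operation.

class FaultProtectionEngine:
    def __init__(self):
        self.active_device = "PRIMARY"
        self.mode = "NOMINAL"

    def on_event(self, event):
        if event == "FAULT_DETECTED" and self.mode == "NOMINAL":
            self.mode = "SAFING"
        elif event == "SWITCH_COMPLETE" and self.mode == "SAFING":
            self.active_device = "REDUNDANT"
            self.mode = "NOMINAL"

def inject_fault_scenario(events):
    """Drive the engine through an event trace, a fault-injection
    simulation in miniature, and return its final configuration."""
    engine = FaultProtectionEngine()
    for event in events:
        engine.on_event(event)
    return engine.active_device, engine.mode

outcome = inject_fault_scenario(["FAULT_DETECTED", "SWITCH_COMPLETE"])
```

Model checking then generalizes such single traces to all interleavings of monitor, engine, and device events.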

  6. Using chemical organization theory for model checking

    PubMed Central

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-01-01

    Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for the derivation of valuable information. In formalisms like the Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations changes drastically when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT identifies more accurately [in 26 cases (14%)] the species and reactions that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a Java applet to check SBML models are available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053
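At the heart of OT is the notion of a closed species set. A minimal sketch of computing the closure of a set under a reaction network (self-maintenance, the second condition for an organization, additionally requires the stoichiometry and an LP test, and is omitted here):

```python
def closure(species, reactions):
    """Smallest closed superset of `species`: repeatedly add the products
    of every reaction whose reactants are already contained in the set,
    until a fixpoint is reached."""
    closed = set(species)
    changed = True
    while changed:
        changed = False
        for reactants, products in reactions:
            if set(reactants) <= closed and not set(products) <= closed:
                closed |= set(products)
                changed = True
    return closed

# Toy network: a + b -> c,  c -> c + d
network = [(("a", "b"), ("c",)), (("c",), ("c", "d"))]
result = closure({"a", "b"}, network)  # {"a", "b", "c", "d"}
```

Hidden stoichiometry matters here: a modifier that secretly acts as a reactant or product changes which reactions fire, and hence the closure.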

  7. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object-orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  8. Location Contexts of User Check-Ins to Model Urban Geo Life-Style Patterns

    PubMed Central

    Hasan, Samiul; Ukkusuri, Satish V.

    2015-01-01

    Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests in different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of users from New York City. The co-existence of several location contexts, and the corresponding probabilities in a given pattern, provide useful information about user interests and choices. It is found that geo life-style patterns have similar items: either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior. PMID:25970430
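The raw ingredient for such a topic model is each user's distribution over location-category contexts. A toy sketch with invented check-in records (a real analysis would fit a probabilistic topic model such as LDA over many users to discover shared patterns):

```python
from collections import Counter

# Invented check-in records: (user, location category).
checkins = [
    ("u1", "coffee shop"), ("u1", "bookstore"), ("u1", "coffee shop"),
    ("u1", "gym"), ("u2", "bar"), ("u2", "nightclub"), ("u2", "bar"),
]

def category_distribution(user):
    """Empirical distribution over location categories for one user: the
    per-user counts from which a topic model would infer shared latent
    'life-style patterns' across the whole population."""
    counts = Counter(cat for u, cat in checkins if u == user)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

dist = category_distribution("u1")  # coffee shop dominates for u1
```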

  9. Dynamic modeling and simulation of an integral bipropellant propulsion double-valve combined test system

    NASA Astrophysics Data System (ADS)

    Chen, Yang; Wang, Huasheng; Xia, Jixia; Cai, Guobiao; Zhang, Zhenpeng

    2017-04-01

    For the pressure-reducing-regulator and check-valve double-valve combined test system in an integral bipropellant propulsion system, a system model is established from modular models of various typical components. Simulation research is conducted on the whole working process of a 9 MPa working-condition experiment, from startup to rated working condition and finally to shutdown. Comparison of the simulation results with test data shows the following: five working conditions (standby, startup, rated pressurization, shutdown and halt) and nine stages of the combined test system are comprehensively disclosed, and valve-spool opening and closing details of the regulator and the two check valves are accurately revealed. The simulation also clarifies two phenomena that the test data alone cannot: one is the critical opening state, in which the check valve spools alternately open slightly and close around their fully closed positions; the other is the pronounced effect of flow-field temperature drop and rise in the pipeline network as helium gas flows. Moreover, simulation results that account for component wall heat transfer are closer to the test data than those obtained under the adiabatic-wall condition, and better reveal the dynamic characteristics of the system in its various working stages.

  10. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach that provides "instant experience" in looking at a graphical model validation plot, so that it becomes easier to judge whether any of the underlying assumptions are violated.
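One way to provide that instant experience is to simulate diagnostic plots from the fitted model itself, so the viewer learns what a residual plot looks like when every assumption holds. A minimal sketch for a simple linear model (data invented):

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return my - b * mx, b

def residuals(xs, ys, a, b):
    return [y - (a + b * x) for x, y in zip(xs, ys)]

def null_residual_sets(xs, a, b, sigma, n_sets, seed=1):
    """Simulate residual sets from the fitted model itself: each set shows
    what the diagnostic plot looks like when all assumptions hold, giving
    a reference collection against which the real plot can be judged."""
    rng = random.Random(seed)
    sets = []
    for _ in range(n_sets):
        ys_sim = [a + b * x + rng.gauss(0.0, sigma) for x in xs]
        a_s, b_s = fit_line(xs, ys_sim)
        sets.append(residuals(xs, ys_sim, a_s, b_s))
    return sets

# Invented example data: a straight line plus Gaussian noise.
rng = random.Random(7)
xs = list(range(20))
ys = [2.0 + 0.5 * x + rng.gauss(0.0, 1.0) for x in xs]
a, b = fit_line(xs, ys)
res = residuals(xs, ys, a, b)
sigma = (sum(r * r for r in res) / (len(xs) - 2)) ** 0.5
nulls = null_residual_sets(xs, a, b, sigma, n_sets=8)
```

Plotting `res` among the eight null sets (a "lineup") turns assumption checking into a comparison the eye can make immediately.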

  11. Bayesian truncation errors in chiral effective field theory: model checking and accounting for correlations

    NASA Astrophysics Data System (ADS)

    Melendez, Jordan; Wesolowski, Sarah; Furnstahl, Dick

    2017-09-01

    Chiral effective field theory (EFT) predictions are necessarily truncated at some order in the EFT expansion, which induces an error that must be quantified for robust statistical comparisons to experiment. A Bayesian model yields posterior probability distribution functions for these errors based on expectations of naturalness encoded in Bayesian priors and the observed order-by-order convergence pattern of the EFT. As a general example of a statistical approach to truncation errors, the model was applied to chiral EFT for neutron-proton scattering using various semi-local potentials of Epelbaum, Krebs, and Meißner (EKM). Here we discuss how our model can learn correlation information from the data and how to perform Bayesian model checking to validate that the EFT is working as advertised. Supported in part by NSF PHY-1614460 and DOE NUCLEI SciDAC DE-SC0008533.
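The convergence-pattern idea can be made concrete with a back-of-envelope version of the error model: extract dimensionless coefficients from the order-by-order shifts and size the first omitted term (the values below are invented; the paper's Bayesian model replaces this with a full posterior):

```python
def truncation_error_estimate(predictions, x_ref, q):
    """First-omitted-term estimate of the EFT truncation error.

    predictions : order-by-order partial sums X_0, X_1, ..., X_k
    x_ref       : natural reference scale of the observable
    q           : EFT expansion parameter (e.g. p / Lambda_b)

    Writing X_n - X_{n-1} = x_ref * c_n * q**n, naturalness (c_n of order
    one) suggests the error after order k is roughly
    c_bar * x_ref * q**(k+1), with c_bar the typical size of the observed
    coefficients.
    """
    coeffs = [(predictions[n] - predictions[n - 1]) / (x_ref * q ** n)
              for n in range(1, len(predictions))]
    c_bar = max(abs(c) for c in coeffs)
    k = len(predictions) - 1
    return c_bar * x_ref * q ** (k + 1)

# Illustrative convergence pattern with q = 0.3 (numbers invented):
X = [10.0, 11.2, 10.9, 10.95]
err = truncation_error_estimate(X, x_ref=10.0, q=0.3)
```

The Bayesian treatment promotes c_bar to a random variable with a naturalness prior, which is also what lets correlations between observables be learned from the data.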

  12. Automated Environment Generation for Software Model Checking

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.
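The most general environment for a component is a client that can invoke its interface in any order. A toy sketch of closing an open system this way and exhaustively checking an invariant over all bounded call sequences (the component and property are invented; BEG additionally abstracts real environment implementations):

```python
import itertools

# Invented component under analysis: a tiny account-like object with a
# four-method interface and the invariant "balance is never negative".

class Account:
    def __init__(self):
        self.is_open = False
        self.balance = 0

    def open_account(self):
        self.is_open = True

    def deposit(self):
        if self.is_open:
            self.balance += 1

    def withdraw(self):
        if self.is_open and self.balance > 0:
            self.balance -= 1

    def close(self):
        self.is_open = False

API = ["open_account", "deposit", "withdraw", "close"]

def explore(depth):
    """Run the component against the universal environment: every call
    sequence over its API up to the given length, collecting any
    sequences that break the invariant."""
    violations = []
    for seq in itertools.product(API, repeat=depth):
        account = Account()
        for method in seq:
            getattr(account, method)()
            if account.balance < 0:
                violations.append(seq)
                break
    return violations

violations = explore(4)  # empty: the invariant holds in every environment
```

Assumptions about realistic callers (e.g. "deposit only after open") shrink this universal environment, which is exactly what makes verification tractable for larger components.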

  13. Modelling of influential parameters on a continuous evaporation process by Doehlert shells

    PubMed Central

    Porte, Catherine; Havet, Jean-Louis; Daguet, David

    2003-01-01

    The modelling of the parameters that influence the continuous evaporation of an alcoholic extract was considered using Doehlert matrices. The work was performed with a wiped falling-film evaporator that allowed us to study the influence of the pressure, temperature, feed flow and dry matter content of the feed solution on the dry matter content of the resulting concentrate and on the productivity of the process. Doehlert shells were used to model the influential parameters. When the model obtained from the experimental results was checked, it revealed some dysfunction in the unit. The evaporator was modified and a new model applied; the experimental results were then in agreement with the equations. The model was finally determined and successfully checked in order to obtain an 8% dry matter concentrate with the best productivity; the results fit the industrial constraints of subsequent processes. PMID:18924887
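For reference, the coded coordinates of a Doehlert design are a centre point plus a uniform shell; the two-factor case is sketched below (the study varied four factors, and the factor centres and step sizes here are placeholders, not the paper's operating conditions):

```python
import math

def doehlert_two_factor():
    """Coded coordinates of a two-factor Doehlert design: a centre point
    plus the six vertices of a regular hexagon (a uniform shell)."""
    points = [(0.0, 0.0)]
    for k in range(6):
        angle = math.radians(60 * k)
        points.append((round(math.cos(angle), 3), round(math.sin(angle), 3)))
    return points

design = doehlert_two_factor()  # 7 runs in coded units

def scale(point, centers, steps):
    """Map coded coordinates onto real factor ranges, e.g. pressure and
    temperature (centres and step sizes are placeholders)."""
    return tuple(c + x * s for x, c, s in zip(point, centers, steps))

# e.g. pressure centred at 50 mbar (step 20), temperature at 60 °C (step 10)
runs = [scale(p, (50.0, 60.0), (20.0, 10.0)) for p in design]
```

A quadratic response surface fitted to the runs then predicts the settings that maximize concentrate dry matter and productivity.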

  14. Re-engineering pre-employment check-up systems: a model for improving health services.

    PubMed

    Rateb, Said Abdel Hakim; El Nouman, Azza Abdel Razek; Rateb, Moshira Abdel Hakim; Asar, Mohamed Naguib; El Amin, Ayman Mohammed; Gad, Saad abdel Aziz; Mohamed, Mohamed Salah Eldin

    2011-01-01

    The purpose of this paper is to develop a model for improving the health services provided by the pre-employment medical fitness check-up system affiliated to Egypt's Health Insurance Organization (HIO). Operations research, notably system re-engineering, is used in six randomly selected centers, and findings before and after re-engineering are compared. The re-engineering model follows a systems approach, focusing on three areas: structure, process and outcome. The model is based on six main components: electronic booking, standardized check-up processes, protected medical documents, advanced archiving through an electronic content management (ECM) system, infrastructure development, and capacity building. The model originates mainly from customer needs and expectations. The centers' monthly customer flow increased significantly after re-engineering. The mean time spent per customer cycle improved after re-engineering: 18.3 ± 5.5 minutes, compared with 48.8 ± 14.5 minutes before. Appointment delay also decreased significantly, from an average of 18 days to 6.2 days. Both beneficiaries and service providers were significantly more satisfied with the services after re-engineering. The model proves that re-engineering program costs are exceeded by the increased revenue. Re-engineering in this study involved multiple structure and process elements, and the literature review did not reveal similar re-engineering healthcare packages; therefore, each element was compared separately. This model is highly recommended for improving service effectiveness and efficiency. This research is the first in Egypt to apply the re-engineering approach to public health systems. Developing user-friendly models for service improvement is an added value.

  15. Development of a check sheet for collecting information necessary for occupational safety and health activities and building relevant systems in overseas business places.

    PubMed

    Kajiki, Shigeyuki; Kobayashi, Yuichi; Uehara, Masamichi; Nakanishi, Shigemoto; Mori, Koji

    2016-06-07

    This study aimed to develop an information-gathering check sheet to efficiently collect the information necessary for Japanese companies to build global occupational safety and health management systems in overseas business places. The study group consisted of 2 researchers with occupational physician careers in a foreign-affiliated company in Japan and 3 supervising occupational physicians who were engaged in occupational safety and health activities in overseas business places. After investigating the information, and the sources of information, necessary for implementing occupational safety and health activities and building the relevant systems, we conducted information acquisition using an information-gathering check sheet in the field, by visiting 10 regions in 5 countries (first phase). The accuracy of the information acquired and the appropriateness of the information sources were then verified in study group meetings to improve the check sheet. Next, the improved check sheet was used in another setting (3 regions in 1 country) to confirm its efficacy (second phase), and the information-gathering check sheet was thereby completed. The check sheet is composed of 9 major items (basic information on the local business place, safety and health overview, safety and health systems, safety and health staff, planning/implementation/evaluation/improvement, safety and health activities, laws and administrative organs, local medical care systems and public health, and medical support for resident personnel) and 61 medium items. We relied on the following eight information sources: the internet, the company (local business place and head office in Japan), the embassy/consulate, the ISO certification body, universities or other educational institutions, and medical institutions (aimed at Japanese people or at local workers). Through multiple study group meetings and a two-phase field survey (13 regions in 6 countries), the information-gathering check sheet was completed. We confirmed the possibility that this check sheet would enable the user to obtain the necessary information when expanding safety and health activities into a country or region that is new to the user. It is necessary in the future to evaluate safety and health systems and activities using this check sheet in local business places in any country in which a Japanese business will be established, and to verify its efficacy by conducting model programs to test specific approaches.

  16. Simulation of the MELiSSA closed loop system as a tool to define its integration strategy

    NASA Astrophysics Data System (ADS)

    Poughon, Laurent; Farges, Berangere; Dussap, Claude-Gilles; Godia, Francesc; Lasseur, Christophe

    Inspired by a terrestrial ecosystem, MELiSSA (Micro Ecological Life Support System Alternative) is a closed life support system project for future long-term manned missions (Moon and Mars bases). Started by ESA in 1989, this five-compartment concept has evolved following a mechanistic engineering approach to acquiring both theoretical and technical knowledge. In its current state of development, the project can now start to demonstrate the MELiSSA loop concept at pilot scale. Thus an integration strategy for a MELiSSA Pilot Plant (MPP) was defined, describing the different phases of tests and connections between compartments. The integration steps should start in 2008 and culminate in a complete operational loop in 2015, whose final objective is to achieve a closed liquid and gas loop with 100% recycling. Although the integration logic could start with the most advanced processes in terms of knowledge and hardware development, it needs to be complemented by an extensive simulation effort. Thanks to this simulation exercise, the effective demonstration of each independent process, and its progressive coupling with the others, can be performed in operational conditions as close as possible to the final configuration. The theoretical approach described in this paper is based on mass balance models of each of the MELiSSA biological compartments, which are used to simulate each integration step and the complete MPP loop itself. These simulations will help to identify the criticalities of each integration step and to check the consistency between objectives, flows, recycling efficiencies and sizing of the pilot reactors. An MPP scenario compatible with the current knowledge of the operation of the pilot reactors was investigated, and the theoretical performances of the system were compared to the objectives of the MPP. From this scenario the most important milestone steps in the integration are highlighted and their behaviour can be simulated.
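The simplest mass-balance view of such a loop is a recirculating flow passed through compartments with per-compartment recycling efficiencies, topped up by a make-up stream that covers the losses. A sketch with invented efficiency values (not MELiSSA design figures):

```python
def loop_steady_state(make_up, recycle_efficiencies, tol=1e-9, max_iter=10000):
    """Iterate a single mass flow around a closed loop of compartments
    until it converges.  Each pass around the loop retains the product of
    the compartment efficiencies; the make-up flow covers the losses.  At
    steady state the flow equals make_up / (1 - product of efficiencies),
    which the fixed-point iteration should reproduce."""
    eta = 1.0
    for e in recycle_efficiencies:
        eta *= e
    flow = 0.0
    for _ in range(max_iter):
        new_flow = make_up + eta * flow
        if abs(new_flow - flow) < tol:
            return new_flow
        flow = new_flow
    raise RuntimeError("loop did not converge")

# Five compartments (as in MELiSSA) with illustrative per-compartment
# recycling efficiencies; a make-up of 1.0 kg/day closes the balance.
flow = loop_steady_state(1.0, [0.99, 0.98, 0.97, 0.99, 0.96])
expected = 1.0 / (1.0 - 0.99 * 0.98 * 0.97 * 0.99 * 0.96)
```

Checking that such balances close for every integration step is precisely the consistency test the simulations are meant to provide.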

  17. An algorithm for hyperspectral remote sensing of aerosols: 1. Development of theoretical framework

    NASA Astrophysics Data System (ADS)

    Hou, Weizhen; Wang, Jun; Xu, Xiaoguang; Reid, Jeffrey S.; Han, Dong

    2016-07-01

    This paper describes the first part of a series of investigations to develop algorithms for simultaneous retrieval of aerosol parameters and surface reflectance from a newly developed hyperspectral instrument, the GEOstationary Trace gas and Aerosol Sensor Optimization (GEO-TASO), by taking full advantage of available hyperspectral measurement information in the visible bands. We describe the theoretical framework of an inversion algorithm for the hyperspectral remote sensing of aerosol optical properties, in which the major principal components (PCs) of surface reflectance are assumed known, and the spectrally dependent aerosol refractive indices are assumed to follow a power-law approximation with four unknown parameters (two for the real and two for the imaginary part of the refractive index). New capabilities for computing the Jacobians of the four Stokes parameters of reflected solar radiation at the top of the atmosphere with respect to these unknown aerosol parameters and to the weighting coefficients for each PC of surface reflectance are added to the UNified Linearized Vector Radiative Transfer Model (UNL-VRTM), which in turn facilitates the optimization in the inversion process. Theoretical derivations of the formulas for these new capabilities are provided, and the analytical Jacobians are validated against finite-difference calculations with relative errors of less than 0.2%. Finally, a self-consistency check of the inversion algorithm is conducted for idealized green-vegetation and rangeland surfaces that were spectrally characterized by the U.S. Geological Survey digital spectral library. It shows that the first six PCs can yield a reconstruction of the spectral surface reflectance with errors of less than 1%.
Assuming that aerosol properties can be accurately characterized, the inversion yields a retrieval of hyperspectral surface reflectance with an uncertainty of 2% (and root-mean-square error of less than 0.003), which suggests self-consistency in the inversion framework. The next step of using this framework to study the aerosol information content in GEO-TASO measurements is also discussed.
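The PC-based reconstruction of surface reflectance described above can be sketched in a few lines. This is a minimal illustration with synthetic spectra and a plain SVD, not the USGS spectral library or the authors' code; the basis shapes, wavelength grid, and sample counts are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 700, 50)
# Synthetic smooth "surface reflectance" spectra: random mixtures of three
# Gaussian basis shapes, so the data truly live in a low-dimensional subspace.
basis = np.stack([np.exp(-((wavelengths - c) / 60.0) ** 2) for c in (450, 550, 650)])
spectra = rng.uniform(0, 1, (200, 3)) @ basis          # 200 spectra x 50 bands

# PCA via SVD of the mean-centred data matrix.
mean = spectra.mean(axis=0)
X = spectra - mean
U, s, Vt = np.linalg.svd(X, full_matrices=False)

def reconstruct(k):
    """Project onto the first k principal components and reconstruct."""
    coeffs = X @ Vt[:k].T
    return mean + coeffs @ Vt[:k]

err3 = np.abs(reconstruct(3) - spectra).max()  # enough PCs: near-exact
err1 = np.abs(reconstruct(1) - spectra).max()  # too few PCs: visible error
```

Because the synthetic spectra span only three directions, three PCs recover them almost exactly, mirroring the paper's finding that a small number of PCs suffices for realistic surface spectra.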

  18. Directed Bak-Sneppen Model for Food Chains

    NASA Astrophysics Data System (ADS)

    Stauffer, D.; Jan, N.

    A modification of the Bak-Sneppen model to include simple elements of Darwinian evolution is used to check the survival of prey and predators in long food chains. Mutations, selection, and starvation resulting from depleted prey are incorporated in this model.
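A generic directed variant of the Bak-Sneppen dynamics can be sketched as follows. This is an illustrative toy (chain length, step count, and the "disturbance propagates only to the predator" rule are our own choices, not the authors' exact modification):

```python
import random

def directed_bak_sneppen(n=50, steps=20000, seed=1):
    """Directed Bak-Sneppen sketch on a ring of n species: at each step the
    least-fit species mutates (gets a fresh random fitness), and the
    disturbance propagates only downstream, to its predator."""
    random.seed(seed)
    fitness = [random.random() for _ in range(n)]
    for _ in range(steps):
        i = min(range(n), key=fitness.__getitem__)   # weakest species
        fitness[i] = random.random()                 # mutation / replacement
        fitness[(i + 1) % n] = random.random()       # predator loses its prey
    return fitness

f = directed_bak_sneppen()
```

After many steps, most fitness values sit above a self-organized threshold, the hallmark behavior that such models are used to probe for long food chains.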

  19. Examining Moderation Analyses in Propensity Score Methods: Application to Depression and Substance Use

    PubMed Central

    Green, Kerry M.; Stuart, Elizabeth A.

    2014-01-01

    Objective: This study provides guidance on how propensity score methods can be combined with moderation analyses (i.e., effect modification) to examine subgroup differences in potential causal effects in non-experimental studies. As a motivating example, we focus on how depression may affect subsequent substance use differently for men and women. Method: Using data from a longitudinal community cohort study (N=952) of urban African Americans with assessments in childhood, adolescence, young adulthood and midlife, we estimate the influence of depression by young adulthood on substance use outcomes in midlife, and whether that influence varies by gender. We illustrate and compare five different techniques for estimating subgroup effects using propensity score methods, including separate propensity score models and matching for men and women, a joint propensity score model for men and women with matching separately and together by gender, and a joint male/female propensity score model that includes theoretically important gender interactions with matching separately and together by gender. Results: Analyses showed that estimating separate models for men and women yielded the best balance and is therefore the preferred technique when subgroup analyses are of interest, at least in these data. Results also showed substance use consequences of depression but no significant gender differences. Conclusions: It is critical to prespecify subgroup effects before the estimation of propensity scores and to check balance within subgroups regardless of the type of propensity score model used. Results also suggest that depression may affect multiple substance use outcomes in midlife for both men and women relatively equally. PMID:24731233
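The "separate propensity models per subgroup, then check balance" technique that the abstract favors can be sketched on synthetic data. Everything here is invented for illustration (one confounder, a hand-rolled logistic fit, 1-nearest-neighbour matching with replacement); it is not the study's data or analysis code:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_logistic(X, y, iters=300, lr=0.5):
    """Plain gradient-ascent logistic regression: a minimal propensity model."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return 1.0 / (1.0 + np.exp(-Xb @ w))        # fitted propensity scores

def smd(a, b):
    """Standardized mean difference, a standard covariate balance diagnostic."""
    return abs(a.mean() - b.mean()) / np.sqrt((a.var() + b.var()) / 2)

# Synthetic cohort: one confounder x; exposure probability depends on x.
n = 2000
gender = rng.integers(0, 2, n)                   # 0 = men, 1 = women
x = rng.normal(0.0, 1.0, n)
exposed = (rng.random(n) < 1.0 / (1.0 + np.exp(-0.8 * x))).astype(float)

# Separate propensity model and matching within each gender subgroup,
# then balance checked within that subgroup, before vs. after matching.
balance = {}
for g in (0, 1):
    m = gender == g
    ps = fit_logistic(x[m].reshape(-1, 1), exposed[m])
    treated = np.where(exposed[m] == 1)[0]
    control = np.where(exposed[m] == 0)[0]
    nearest = np.abs(ps[control][None, :] - ps[treated][:, None]).argmin(axis=1)
    matched = control[nearest]                   # matching with replacement
    balance[g] = (smd(x[m][treated], x[m][control]),   # before matching
                  smd(x[m][treated], x[m][matched]))   # after matching
```

Within each subgroup, the matched standardized mean difference should drop well below the unmatched one, which is exactly the balance check the abstract says must be done within subgroups.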

  20. Detecting Inconsistencies in Multi-View Models with Variability

    NASA Astrophysics Data System (ADS)

    Lopez-Herrejon, Roberto Erick; Egyed, Alexander

    Multi-View Modeling (MVM) is a common modeling practice that advocates the use of multiple, different and yet related models to represent the needs of diverse stakeholders. Of crucial importance in MVM is consistency checking - the description and verification of semantic relationships amongst the views. Variability is the capacity of software artifacts to vary, and its effective management is a core tenet of the research in Software Product Lines (SPL). MVM has proven useful for developing one-of-a-kind systems; however, to reap the potential benefits of MVM in SPL it is vital to provide consistency checking mechanisms that cope with variability. In this paper we describe how to address this need by applying Safe Composition - the guarantee that all programs of a product line are type safe. We evaluate our approach with a case study.
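The safe-composition idea, checking that every product of a product line is well-formed, can be illustrated with a toy feature model. This brute-force enumeration is only a sketch of the concept (the features, views, and constraint are hypothetical; the paper's approach scales via constraint solving rather than enumeration):

```python
from itertools import product

# Hypothetical product line: two optional features and a cross-view relation.
features = ["Logging", "Encryption"]

# View 1: classes each feature contributes; View 2: classes it references.
provides = {"Logging": {"Logger"}, "Encryption": {"Crypto"}}
requires = {"Logging": set(), "Encryption": {"Logger"}}   # Crypto logs its use

def safe_composition(features, provides, requires):
    """Check every product (feature combination): each selected feature's
    requirements must be provided by some feature in the same product."""
    violations = []
    for bits in product([False, True], repeat=len(features)):
        sel = {f for f, b in zip(features, bits) if b}
        have = set().union(*(provides[f] for f in sel)) if sel else set()
        need = set().union(*(requires[f] for f in sel)) if sel else set()
        if not need <= have:
            violations.append((sel, need - have))
    return violations

bad = safe_composition(features, provides, requires)
```

Here the product that selects Encryption without Logging references a class no view provides, so safe composition flags exactly that product.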

  1. Approximate Model Checking of PCTL Involving Unbounded Path Properties

    NASA Astrophysics Data System (ADS)

    Basu, Samik; Ghosh, Arka P.; He, Ru

    We study the problem of applying statistical methods for approximate model checking of probabilistic systems against properties encoded as PCTL formulas. Such approximate methods have been proposed primarily to deal with the state-space explosion that makes exact model checking by numerical methods practically infeasible for large systems. However, the existing statistical methods either consider a restricted subset of PCTL, specifically, the subset that can only express bounded until properties, or rely on a user-specified finite bound on the sample path length. We propose a new method that does not have such restrictions and can be effectively used to reason about unbounded until properties. We approximate the probabilistic characteristics of an unbounded until property by those of a bounded until property for a suitably chosen value of the bound. In essence, our method is a two-phase process: (a) the first phase is concerned with identifying the bound k0; (b) the second phase computes the probability of satisfying the k0-bounded until property as an estimate for the probability of satisfying the corresponding unbounded until property. In both phases, it is sufficient to verify bounded until properties, which can be effectively done using existing statistical techniques. We prove the correctness of our technique and present its prototype implementations. We empirically show the practical applicability of our method by considering different case studies including a simple infinite-state model, and large finite-state models such as the IPv4 zeroconf protocol and the dining philosophers protocol modeled as Discrete Time Markov Chains.
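The two-phase idea can be sketched on a toy DTMC. The model, the bound-doubling rule, the stopping tolerance, and the sample sizes below are our own illustrative choices, not the authors' algorithm in detail:

```python
import random

# Toy DTMC: state 3 = goal (absorbing), state 2 = failure (absorbing).
P = {0: [(0.5, 0), (0.3, 1), (0.2, 2)],
     1: [(0.6, 3), (0.4, 0)],
     2: [(1.0, 2)],
     3: [(1.0, 3)]}

def step(s, rng):
    r, acc = rng.random(), 0.0
    for p, t in P[s]:
        acc += p
        if r < acc:
            return t
    return P[s][-1][1]

def est_bounded_until(k, n, rng):
    """Estimate P(true U<=k goal) from n sampled paths of length <= k."""
    hits = 0
    for _ in range(n):
        s = 0
        for _ in range(k):
            if s == 3:
                break
            s = step(s, rng)
        hits += (s == 3)
    return hits / n

rng = random.Random(7)
# Phase (a): double the bound k until the bounded estimate stops growing;
# Phase (b): the final bounded estimate approximates the unbounded property.
k, est = 4, est_bounded_until(4, 5000, rng)
while True:
    nxt = est_bounded_until(2 * k, 5000, rng)
    if nxt - est < 0.005:
        break
    k, est = 2 * k, nxt

exact = 0.18 / 0.38   # P(eventually goal) from state 0, solved by hand
```

For this chain the unbounded reachability probability can be solved exactly (p0 = 0.5 p0 + 0.3 p1, p1 = 0.6 + 0.4 p0 gives p0 = 0.18/0.38), so the bounded estimate can be checked against it.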

  2. Sediment depositions upstream of open check dams: new elements from small scale models

    NASA Astrophysics Data System (ADS)

    Piton, Guillaume; Le Guern, Jules; Carbonari, Costanza; Recking, Alain

    2015-04-01

    Torrent hazard mitigation remains a major issue in mountainous regions. In steep streams, and especially on their fans, torrential floods mainly result from abrupt and massive sediment deposits. To curtail such phenomena, soil conservation measures as well as torrent control works have been undertaken for decades. Since the 1950s, open check dams have complemented other structural and non-structural measures in watershed-scale mitigation plans [1]. They are often built to trap sediments near the fan apexes. The development of earthmoving machinery after WWII facilitated the dredging operations of open check dams, and hundreds of these structures have been built over the past 60 years. Their design evolved with the improving comprehension of torrential hydraulics and sediment transport; however, this kind of structure has a general tendency to trap most of the sediments supplied by the headwaters. Secondary effects such as channel incision downstream of the traps often followed the creation of an open check dam. This sediment starvation tends to propagate to the main valley rivers and to disrupt past geomorphic equilibria. To take this into account and to reduce unnecessary dredging operations, better selectivity of sediment trapping must be sought in open check dams, i.e. an optimal open check dam would trap sediments during dangerous floods and flush them during normal small floods. An accurate description of the hydraulic and deposition processes that occur in sediment traps is needed to optimize existing structures and to design better-adjusted new ones. A literature review [2] showed that while design criteria exist for the structure itself, little information is available on the dynamics of the sediment deposits upstream of open check dams: what geomorphic patterns occur during deposition? Which friction laws and sediment transport formulas best describe massive deposition in sediment traps? What ranges of Froude and Shields numbers do the flows tend to adopt? New small-scale model experiments have been undertaken focusing on deposition processes and their related hydraulics. Accurate photogrammetric measurements allowed us to better describe the deposition processes [3]. Large Scale Particle Image Velocimetry (LS-PIV) was performed to determine surface velocity fields in highly active channels with low grain submersion [4]. We will present preliminary results of our experiments showing the new elements we observed in massive deposit dynamics. REFERENCES: 1. Armanini, A., Dellagiacoma, F. & Ferrari, L. From the check dam to the development of functional check dams. Fluvial Hydraulics of Mountain Regions 37, 331-344 (1991). 2. Piton, G. & Recking, A. Design of sediment traps with open check dams: a review, part I: hydraulic and deposition processes. (Accepted by the) Journal of Hydraulic Engineering 1-23 (2015). 3. Le Guern, J. MS thesis: Modélisation physique des plages de dépôt : analyse de la dynamique de remplissage (2014). 4. Carbonari, C. MS thesis: Small scale experiments of deposition processes occurring in sediment traps, LS-PIV measurements and geomorphological descriptions (in preparation).
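The two dimensionless numbers the abstract asks about have standard definitions that are easy to compute. The sketch below uses the textbook formulas with illustrative values we chose ourselves (they are not measurements from the experiments):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def froude(v, h):
    """Froude number of a free-surface flow: Fr = v / sqrt(g h)."""
    return v / math.sqrt(G * h)

def shields(h, slope, d, rho_s=2650.0, rho=1000.0):
    """Shields number for uniform flow: tau / ((rho_s - rho) g d),
    with bed shear stress tau = rho g h S; g cancels out."""
    return (rho * G * h * slope) / ((rho_s - rho) * G * d)

fr = froude(2.0, 0.5)          # 2 m/s flow, 0.5 m deep
th = shields(0.5, 0.05, 0.1)   # 0.5 m deep, 5% slope, 10 cm grains
```

With these values the flow is just subcritical (Fr close to 0.9) and the Shields number is well above typical incipient-motion thresholds (around 0.05), i.e. the bed is mobile.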

  3. Experience Report: A Do-It-Yourself High-Assurance Compiler

    NASA Technical Reports Server (NTRS)

    Pike, Lee; Wegmann, Nis; Niller, Sebastian; Goodloe, Alwyn

    2012-01-01

    Embedded domain-specific languages (EDSLs) are an approach for quickly building new languages while maintaining the advantages of a rich metalanguage. We argue in this experience report that the "EDSL approach" can, surprisingly, ease the task of building a high-assurance compiler. We do not strive to build a fully formally-verified tool-chain, but take a "do-it-yourself" approach to increase our confidence in compiler correctness without too much effort. Copilot is an EDSL developed by Galois, Inc. and the National Institute of Aerospace under contract to NASA for the purpose of runtime monitoring of flight-critical avionics. We report our experience in using type-checking, QuickCheck, and model-checking "off-the-shelf" to quickly increase confidence in our EDSL tool-chain.
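The QuickCheck step, random differential testing of a compiler against a reference interpreter, can be shown on a toy expression language. This sketch is not Copilot or Haskell QuickCheck; it is a hand-rolled property test over a hypothetical two-operation EDSL:

```python
import random

# Tiny expression "EDSL": ('lit', n), ('add', a, b), ('mul', a, b).
def interp(e):
    if e[0] == 'lit':
        return e[1]
    a, b = interp(e[1]), interp(e[2])
    return a + b if e[0] == 'add' else a * b

def compile_to_stack(e, out):
    """Compile to a stack machine: ('push', n) / ('add',) / ('mul',)."""
    if e[0] == 'lit':
        out.append(('push', e[1]))
    else:
        compile_to_stack(e[1], out)
        compile_to_stack(e[2], out)
        out.append((e[0],))
    return out

def run(code):
    st = []
    for ins in code:
        if ins[0] == 'push':
            st.append(ins[1])
        else:
            b, a = st.pop(), st.pop()
            st.append(a + b if ins[0] == 'add' else a * b)
    return st[0]

def rand_expr(rng, depth=4):
    if depth == 0 or rng.random() < 0.3:
        return ('lit', rng.randint(-9, 9))
    return (rng.choice(['add', 'mul']),
            rand_expr(rng, depth - 1), rand_expr(rng, depth - 1))

# QuickCheck-style property: compiled code agrees with the interpreter.
rng = random.Random(0)
ok = all(interp(e) == run(compile_to_stack(e, []))
         for e in (rand_expr(rng) for _ in range(500)))
```

The property `interp(e) == run(compile(e))` over hundreds of random programs is exactly the kind of cheap semantics-preservation check the report advocates.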

  4. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool for models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. It was built to reduce the time each TEAMS modeler spends on the manual preparation of reports for model reviews. The software allows the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The user can selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and to generate an input/output report covering all of the components. Rules can be automatically validated against the model, with a report generated containing the resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models; such an automated tool should have a significant impact on the V&V process.
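The rule-validation step can be illustrated with a hypothetical miniature of such a model, components carrying ports and failure modes, plus simple well-formedness rules. This is not the TEAMS data model; the components and rules below are invented:

```python
# Hypothetical miniature of a testability model: components with ports
# and failure modes, plus simple validation rules run over every component.
model = {
    "Pump":   {"ports": ["in", "out"], "failure_modes": ["leak", "stuck"]},
    "Valve":  {"ports": ["in"],        "failure_modes": []},
    "Sensor": {"ports": [],            "failure_modes": ["drift"]},
}

rules = [
    ("has failure mode", lambda c: len(c["failure_modes"]) > 0),
    ("has ports",        lambda c: len(c["ports"]) > 0),
]

def validate(model, rules):
    """Run every rule on every component; return the inconsistency report."""
    return [(name, rule) for name, comp in model.items()
            for rule, check in rules if not check(comp)]

report = validate(model, rules)
```

Running all rules over all components and collecting the failures is the essence of the automated consistency report the abstract describes.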

  5. A finite element algorithm for high-lying eigenvalues with Neumann and Dirichlet boundary conditions

    NASA Astrophysics Data System (ADS)

    Báez, G.; Méndez-Sánchez, R. A.; Leyvraz, F.; Seligman, T. H.

    2014-01-01

    We present a finite element algorithm that computes eigenvalues and eigenfunctions of the Laplace operator for two-dimensional problems with homogeneous Neumann or Dirichlet boundary conditions, or combinations of either for different parts of the boundary. We use an inverse power plus Gauss-Seidel algorithm to solve the generalized eigenvalue problem. For Neumann boundary conditions the method is much more efficient than the equivalent finite difference algorithm. We checked the algorithm by comparing the cumulative level density of the spectrum obtained numerically with the theoretical prediction given by the Weyl formula. We found a systematic deviation due to the discretization, not to the algorithm itself.
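The Weyl-formula check used above can be reproduced exactly for a domain whose spectrum is known in closed form. The sketch below compares the counted Dirichlet levels of the unit square against the two-term Weyl estimate; it illustrates the diagnostic only and is not the authors' finite element code:

```python
import math

# Dirichlet eigenvalues of the Laplacian on the unit square are exact:
# lambda_{mn} = pi^2 (m^2 + n^2), with m, n >= 1.
R = 5000                                   # count levels with m^2 + n^2 <= R
lam_max = math.pi ** 2 * R

n_exact = sum(1 for m in range(1, 71)      # 70^2 = 4900 <= R < 71^2
                for n in range(1, 71)
                if m * m + n * n <= R)

def weyl(lam, area=1.0, perim=4.0):
    """Two-term Weyl estimate of the cumulative level count (Dirichlet)."""
    return area * lam / (4 * math.pi) - perim * math.sqrt(lam) / (4 * math.pi)

n_weyl = weyl(lam_max)
rel_err = abs(n_exact - n_weyl) / n_exact
```

For a numerically computed spectrum, a systematic drift of the counted levels away from this curve signals discretization error, which is precisely the deviation the abstract reports.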

  6. Theoretical study of optical activity of 1:1 hydrogen bond complexes of water with S-warfarin

    NASA Astrophysics Data System (ADS)

    Dadsetani, Mehrdad; Abdolmaleki, Ahmad; Zabardasti, Abedin

    2016-11-01

    The molecular interaction between S-warfarin (SW) and a single water molecule was investigated using the B3LYP method with the 6-311++G(d,p) basis set. The vibrational spectra of the optimized complexes were examined to confirm their stability. The Quantum Theory of Atoms in Molecules, natural bond orbital, molecular electrostatic potential and energy decomposition analysis methods were applied to analyze the intermolecular interactions. The intermolecular charge transfer in the most stable complex is in the opposite direction from those in the other complexes. The optical spectra and the hyperpolarizabilities of the SW-water hydrogen bond complexes have been computed.

  7. Development of Theoretical Foundations for Description and Analysis of Discrete Information Systems

    DTIC Science & Technology

    1975-05-07

    on work of M. Hack, M.W. Marean, J.M. Myers, and P.M. Shapiro. What is presented is an introduction to a body of methods related to Pragmatic... Program FP-910, which creates the Account Validation file (FFPFDS20) from input cards without contacting any other files in the system... Account Validation (used to check that charges

  8. FIBER AND INTEGRATED OPTICS: Optical anisotropy induced in a round trip through single-mode optical waveguides and methods for suppression of this anisotropy

    NASA Astrophysics Data System (ADS)

    Gelikonov, V. M.; Leonov, V. I.; Novikov, M. A.

    1989-09-01

    An analysis is made of the characteristics of the transformation of the polarization of light in the course of a round trip in a single-mode fiber waveguide. The Poincaré equivalence theorems are generalized for a round trip through such fibers. An investigation is reported of round-trip anisotropic properties which can be used to compensate for a regular and an irregular anisotropy of a fiber waveguide. A description is given of a compensation system containing a Faraday cell and an experimental check of the theoretical conclusions is reported.

  9. Control of the electromagnetic drag using fluctuating light fields

    NASA Astrophysics Data System (ADS)

    Pastor, Víctor J. López; Marqués, Manuel I.

    2018-05-01

    An expression for the electromagnetic drag force experienced by an electric dipole in a light field consisting of a monochromatic plane wave with polarization and phase randomly fluctuating is obtained. The expression explicitly considers the transformations of the field and frequency due to the Doppler shift and the change of the polarizability response of the electric dipole. The conditions to be fulfilled by the polarizability of the dipole in order to obtain a positive, a null, and a negative drag coefficient are analytically determined and checked against numerical simulations for the dynamics of a silver nanoparticle. The theoretically predicted diffusive, superdiffusive, and exponentially accelerated dynamical regimes are numerically confirmed.

  10. What can Robots Do? Towards Theoretical Analysis

    NASA Technical Reports Server (NTRS)

    Nogueira, Monica

    1997-01-01

    Robots have become more and more sophisticated. Every robot has its limits. If we face a task that existing robots cannot solve, then, before we start improving these robots, it is important to check whether it is, in principle, possible to design a robot for this task or not. For that, it is necessary to describe what exactly the robots can, in principle, do. A similar problem - to describe what exactly computers can do - has been solved as early as 1936, by Turing. In this paper, we describe a framework within which we can, hopefully, formalize and answer the question of what exactly robots can do.

  11. Test of Hadronic Interaction Models with the KASCADE Hadron Calorimeter

    NASA Astrophysics Data System (ADS)

    Milke, J.; KASCADE Collaboration

    The interpretation of extensive air shower (EAS) measurements often requires comparison with EAS simulations based on high-energy hadronic interaction models. These interaction models have to extrapolate into kinematical regions and energy ranges beyond the limits of present accelerators. Therefore, it is necessary to test whether these models are able to describe the EAS development in a consistent way. By measuring the hadronic, electromagnetic, and muonic parts of an EAS simultaneously, the KASCADE experiment offers excellent facilities for checking the models. For the EAS simulations the program CORSIKA is used, with several hadronic event generators implemented. Different hadronic observables, e.g. hadron number, energy spectrum, and lateral distribution, are investigated, as well as their correlations with the electromagnetic and muonic shower sizes. By comparing measurements and simulations, the consistency of the description of the EAS development is checked. First results with the new interaction model NEXUS and with version II.5 of the model DPMJET, both recently included in CORSIKA, are presented and compared with QGSJET simulations.

  12. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background: Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods: We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results: We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions: Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
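The version-comparison technique can be sketched in a few lines: implement the same calculation twice, inject a wrong-cell-reference bug into one version, and flag cells whose outputs differ materially. The toy "model" and its cell names are invented; only the ±5% materiality threshold follows the abstract:

```python
# Two "parallel versions" of the same toy model: a correct one and one
# with a deliberate wrong cell reference, compared cell by cell.
def model_named(inputs):
    rate, base = inputs["rate"], inputs["base"]
    linked = base * rate
    retained = linked * 0.8
    return {"linked": linked, "retained": retained}

def model_rowcol(inputs):
    rate, base = inputs["rate"], inputs["base"]
    linked = base * rate
    retained = base * 0.8          # bug: references base instead of linked
    return {"linked": linked, "retained": retained}

def compare(a, b, material=0.05):
    """Flag cells whose relative difference exceeds the material threshold."""
    return [k for k in a if abs(a[k] - b[k]) > material * abs(a[k])]

inputs = {"rate": 0.6, "base": 1000.0}
flagged = compare(model_named(inputs), model_rowcol(inputs))
```

Only the cell hit by the wrong reference is flagged, which is how discordant output localizes an unintentional error that a per-version check might miss.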

  13. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections.

    PubMed

    Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D

    2018-01-01

    Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). 
Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.

  14. The effects of an online basic life support course on undergraduate nursing students' learning.

    PubMed

    Tobase, Lucia; Peres, Heloisa H C; Gianotto-Oliveira, Renan; Smith, Nicole; Polastri, Thatiane F; Timerman, Sergio

    2017-08-25

    To describe learning outcomes of undergraduate nursing students following an online basic life support (BLS) course. An online BLS course was developed and administered to 94 nursing students. Pre- and post-tests were used to assess theoretical learning. Checklist simulations and feedback devices were used to assess the cardiopulmonary resuscitation (CPR) skills of the 62 students who completed the course. A paired t-test revealed a significant increase in learning [pre-test (6.4 ± 1.61) vs. post-test (9.3 ± 0.82), p < 0.001]. The increase in the average grade after taking the online course was significant (p < 0.001). No learning differences (p = 0.475) were observed between 1st- and 2nd-year (9.20 ± 1.60) and 3rd- and 4th-year (9.67 ± 0.61) students. A CPR simulation was performed after completing the course: students checked for a response (90%), exposed the chest (98%), checked for breathing (97%), called emergency services (76%), requested a defibrillator (92%), checked for a pulse (77%), positioned their hands properly (87%), performed 30 compressions per cycle (95%), performed compressions of at least 5 cm depth (89%), released the chest (90%), delivered two breaths (97%), used the automated external defibrillator (97%), and positioned the pads (100%). The online course was an effective method for teaching and learning key BLS skills, as students were able to accurately apply BLS procedures during the CPR simulation. This short-term online training, which likely improves learning and self-efficacy in BLS providers, can be used for the continuing education of health professionals.

  15. A Dual Frequency Carrier Phase Error Difference Checking Algorithm for the GNSS Compass.

    PubMed

    Liu, Shuo; Zhang, Lei; Li, Jian

    2016-11-24

    The performance of the Global Navigation Satellite System (GNSS) compass is related to the quality of the carrier phase measurement, so handling carrier phase errors properly is important for improving GNSS compass accuracy. In this work, we propose a dual frequency carrier phase error difference checking algorithm for the GNSS compass. The algorithm aims at eliminating large carrier phase errors in dual frequency double-differenced carrier phase measurements according to the error difference between the two frequencies. The advantage of the proposed algorithm is that it needs no additional environment information and performs well in the presence of multiple large errors, in contrast with previous research. The core of the proposed algorithm is removing the geometrical distance from the dual frequency carrier phase measurements, after which the carrier phase error is separated and detectable. We generate the Double-Differenced Geometry-Free (DDGF) measurement according to the fact that carrier phase measurements on different frequencies contain the same geometrical distance. We then propose the DDGF detection to detect large carrier phase error differences between the two frequencies, and the theoretical performance of the proposed DDGF detection is analyzed. An open-sky test, a man-made multipath test and an urban vehicle test were carried out to evaluate the performance of the proposed algorithm. The results show that the proposed DDGF detection is able to detect large errors in dual frequency carrier phase measurements by checking the error difference between the two frequencies. After the DDGF detection, the accuracy of the baseline vector in the GNSS compass is improved.
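The geometry-free idea can be sketched numerically: measurements on two frequencies share the same geometric distance, so differencing them cancels the geometry and leaves only the per-frequency errors. The simulation below is a heavily simplified illustration (measurements already in metres, ionospheric delay and integer ambiguities omitted, noise level and detection threshold invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated double-differenced carrier phase measurements (metres) on two
# frequencies: both contain the same geometric term rho, independent
# millimetre-level noise, and one injected large error on frequency 1.
n = 100
rho = 5.0 + 0.01 * np.arange(n)             # common geometric distance
dd1 = rho + 0.003 * rng.standard_normal(n)
dd2 = rho + 0.003 * rng.standard_normal(n)
dd1[40] += 0.5                              # large carrier phase error

# Geometry-free combination: rho cancels, so what remains is the error
# difference between the two frequencies, and the outlier stands out.
ddgf = dd1 - dd2
detected = np.where(np.abs(ddgf - np.median(ddgf)) > 0.1)[0]
```

Because the geometry is gone, a simple threshold on the geometry-free residual isolates the epoch carrying the large error with no environment information at all, which is the qualitative point of the DDGF detection.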

  16. Including diverging electrostatic potential in 3D-RISM theory: The charged wall case.

    PubMed

    Vyalov, Ivan; Rocchia, Walter

    2018-03-21

    Although three-dimensional site-site molecular integral equations of liquids are a powerful tool of modern theoretical chemistry, their applications to the problem of characterizing the electrical double layer originating at the solid-liquid interface with a macroscopic substrate are severely limited by the fact that an infinitely extended charged plane generates a divergent electrostatic potential. Such potentials cannot be treated within the standard 3D-Reference Interaction Site Model equation solution framework, since they lead to functions that are not Fourier transformable. In this paper, we apply a renormalization procedure to overcome this obstacle. We then check the validity and numerical accuracy of the proposed computational scheme on the prototypical gold (111) surface in contact with a water/alkali chloride solution. We observe that, although the proposed method requires a higher spatial resolution to achieve converged charge densities than that suited to estimating biomolecular solvation with either 3D-RISM or continuum electrostatics approaches, it is still computationally efficient. Introducing the electrostatic potential of an infinite wall, which is periodic in two dimensions, we avoid edge effects, permit a robust integration of Poisson's equation, and obtain the 3D electrostatic potential profile for the first time in such calculations. We show that the potential within the electrical double layer presents oscillations which are not grasped by the Debye-Hückel and Gouy-Chapman theories. This electrostatic potential deviates from its average by up to 1-2 V at small distances from the substrate along the lateral directions. Applications of this theoretical development are relevant, for example, for liquid scanning tunneling microscopy imaging.

  17. Including diverging electrostatic potential in 3D-RISM theory: The charged wall case

    NASA Astrophysics Data System (ADS)

    Vyalov, Ivan; Rocchia, Walter

    2018-03-01

    Although three-dimensional site-site molecular integral equations of liquids are a powerful tool of modern theoretical chemistry, their applications to the problem of characterizing the electrical double layer originating at the solid-liquid interface with a macroscopic substrate are severely limited by the fact that an infinitely extended charged plane generates a divergent electrostatic potential. Such potentials cannot be treated within the standard 3D-Reference Interaction Site Model equation solution framework, since they lead to functions that are not Fourier transformable. In this paper, we apply a renormalization procedure to overcome this obstacle. We then check the validity and numerical accuracy of the proposed computational scheme on the prototypical gold (111) surface in contact with a water/alkali chloride solution. We observe that, although the proposed method requires a higher spatial resolution to achieve converged charge densities than that suited to estimating biomolecular solvation with either 3D-RISM or continuum electrostatics approaches, it is still computationally efficient. Introducing the electrostatic potential of an infinite wall, which is periodic in two dimensions, we avoid edge effects, permit a robust integration of Poisson's equation, and obtain the 3D electrostatic potential profile for the first time in such calculations. We show that the potential within the electrical double layer presents oscillations which are not grasped by the Debye-Hückel and Gouy-Chapman theories. This electrostatic potential deviates from its average by up to 1-2 V at small distances from the substrate along the lateral directions. Applications of this theoretical development are relevant, for example, for liquid scanning tunneling microscopy imaging.

  18. Quality assurance of weather data for agricultural system model input

    USDA-ARS?s Scientific Manuscript database

    It is well known that crop production and hydrologic variation on watersheds are weather related. Rarely, however, are meteorological data quality checks reported for agricultural systems model research. We present quality assurance procedures for agricultural system model weather data input. Problems...

  19. Rat Models and Identification of Candidate Early Serum Biomarkers of Battlefield Traumatic Brain Injury

    DTIC Science & Technology

    2007-07-31

    brain injury) All surgeries were performed using aseptic technique. Animals were checked for pain/distress immediately prior to anesthesia/surgery... Pain/distress checks were performed at 3, 6, 12, 24, 36, 48, 60, and 72 hours post-injury. Fluid Percussion Injury (FPI) For animals in the...NIH), and Neurobehavioral Scale (NBS). The criteria used to obtain the scores are detailed in Tables 2 and 3. As an additional endpoint, we also

  20. Employing the Intelligence Cycle Process Model Within the Homeland Security Enterprise

    DTIC Science & Technology

    2013-12-01

    the Iraq anti-war movement, a former U.S. Congresswoman, the U.S. Treasury Department and hip hop bands to spread Sharia law in the U.S. A Virginia...challenges remain with threat notification, access to information, and database management of information that may have contributed the 2013 Boston...The FBI said it took a number of investigative steps to check on the request, including looking at his travel history, checking databases for

  1. Modeling and analysis of cell membrane systems with probabilistic model checking

    PubMed Central

    2011-01-01

    Background Recently there has been a growing interest in the application of Probabilistic Model Checking (PMC) for the formal specification of biological systems. PMC is able to exhaustively explore all states of a stochastic model and can provide valuable insight into its behavior, which is more difficult to obtain using only traditional methods for system analysis such as deterministic and stochastic simulation. In this work we propose a stochastic model for the description and analysis of the sodium-potassium exchange pump. The sodium-potassium pump is a membrane transport system present in all animal cells and capable of moving sodium and potassium ions against their concentration gradients. Results We present a quantitative formal specification of the pump mechanism in the PRISM language, taking into consideration a discrete chemistry approach and the Law of Mass Action. We also present an analysis of the system using quantitative properties in order to verify the pump reversibility and understand the pump behavior using trend labels for the transition rates of the pump reactions. Conclusions Probabilistic model checking can be used along with other well established approaches such as simulation and differential equations to better understand pump behavior. Using PMC we can determine whether specific events happen, such as the potassium outside the cell being exhausted in all model traces. We can also gain a more detailed perspective on its behavior, such as determining its reversibility and why its normal operation becomes slow over time. This knowledge can be used to direct experimental research and make it more efficient, leading to faster and more accurate scientific discoveries. PMID:22369714

  2. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high-reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.

  3. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search for a needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
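
    The "thousand chickens" idea can be sketched in a few lines: instead of one exhaustive search, launch many small, independently randomized searches and take the first hit. This is an illustration of the diversification idea only, not the SPIN-based swarm tool described in the paper.

```python
import random

def swarm_search(init, successors, is_bug, n_workers=8, max_steps=200):
    """Sketch of swarm verification: many small, independently
    randomized (hence diversified) searches instead of one exhaustive
    run. Each worker performs a bounded random walk; the first worker
    to reach a bug state wins."""
    for seed in range(n_workers):       # sequential stand-in for parallel workers
        rng = random.Random(seed)       # a different seed diversifies each walk
        state, steps = init, 0
        while steps <= max_steps:
            if is_bug(state):
                return seed, state      # which worker found it, and where
            options = successors(state)
            if not options:
                break                   # dead end; this worker gives up
            state = rng.choice(options)
            steps += 1
    return None

# Toy state space over integers: each walk eventually crosses 50.
found = swarm_search(1, lambda s: [s + 1, s + 3], lambda s: s >= 50)
```

    In a real swarm, the workers run in parallel and are diversified by search order, hash functions, and depth bounds rather than by random walks alone.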

  4. 75 FR 53857 - Airworthiness Directives; Eurocopter France Model SA330J Helicopters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-02

    ... Airworthiness Directives; Eurocopter France Model SA330J Helicopters AGENCY: Federal Aviation Administration... known U.S. owners and operators of Eurocopter France (Eurocopter) Model SA330J helicopters by individual...'' rather than checking for ``play.'' This helicopter model is manufactured in France and is type...

  5. [Staffing in medical radiation physics in Germany--summary of a questionnaire].

    PubMed

    Leetz, Hans-Karl; Eipper, Hermann Hans; Gfirtner, Hans; Schneider, Peter; Welker, Klaus

    2003-01-01

    To obtain an overview of the actual staffing levels in Medical Radiation Physics, a survey was carried out in 1999 by the task-group "Staffing requirements" of the Deutsche Gesellschaft für Medizinische Physik (DGMP; German Society of Medical Physics) among all DGMP members active in this field. The main components for equipment and activities are defined as in Reports 8 and 10 of the DGMP for staffing requirements in Medical Radiation Physics. The survey focused on these main components. Of 322 forms sent out, 173 answers could be evaluated. From the answers regarding equipment and activities, theoretical staff requirements were calculated on the basis of this spot-check target and compared with the effective staffing levels documented in the survey. The spot-check data were then extrapolated to the whole of Germany. The calculation revealed a deficit of 72% for the whole physics staff and of 58% for the number of physicists. Considering the age distribution of the DGMP members and the calculated staffing deficit, a training need of approximately 100 medical physicists per year in Germany was calculated, provided that the goal is set of cutting back the deficit in 10 years.

  6. State background checks for gun purchase and firearm deaths: an exploratory study.

    PubMed

    Sen, Bisakha; Panjamapirom, Anantachai

    2012-10-01

    This study examines the relationship between the types of background-information checks required by states prior to firearm purchases, and firearm homicide and suicide deaths. Negative binomial models are used to analyze state-level data for homicides and suicides in the U.S. from 1996 to 2005. Data on types of background information are retrieved from the Surveys of State Procedures Related to Firearm Sales, and the violent death data are from the WISQARS. Several other state-level factors were controlled for. More background checks are associated with fewer homicide (IRR:0.93, 95% CI:0.91-0.96) and suicide (IRR:0.98, 95% CI:0.96-1.00) deaths. Firearm homicide deaths are lower when states have checks for restraining orders (IRR:0.87, 95% CI:0.79-0.95) and fugitive status (IRR:0.79, 95% CI:0.72-0.88). Firearm suicide deaths are lower when states have background checks for mental illness (IRR:0.96, 95% CI:0.92-0.99), fugitive status (IRR:0.95, 95% CI:0.90-0.99) and misdemeanors (IRR:0.95, 95% CI:0.92-1.00). It does not appear that reductions in firearm deaths are offset by increases in non-firearm violent deaths. More extensive background checks prior to gun purchase are mostly associated with reductions in firearm homicide and suicide deaths. Several study limitations are acknowledged, and further research is called for to ascertain causality. Copyright © 2012 Elsevier Inc. All rights reserved.
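
    The incidence-rate ratios quoted above come from fitted count-model coefficients. As a minimal sketch of that relationship (the coefficient value below is hypothetical, chosen only to land near the reported homicide IRR of 0.93), the IRR and its Wald confidence interval are obtained by exponentiating the coefficient and its interval endpoints:

```python
import math

def irr_with_ci(beta, se, z=1.96):
    """IRR and Wald 95% CI from a count-model (e.g. negative binomial)
    coefficient: IRR = exp(beta), CI = exp(beta +/- z*se)."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient beta = -0.073 (se = 0.014) lands near the
# homicide IRR of 0.93 (95% CI roughly 0.90-0.96) reported above.
irr, lo, hi = irr_with_ci(-0.073, 0.014)
```

    An IRR below 1 with an upper confidence bound below 1 is what makes results such as IRR:0.93, 95% CI:0.91-0.96 read as a statistically significant reduction.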

  7. Theoretical models of parental HIV disclosure: a critical review.

    PubMed

    Qiao, Shan; Li, Xiaoming; Stanton, Bonita

    2013-01-01

    This study critically examined three major theoretical models related to parental HIV disclosure (i.e., the Four-Phase Model [FPM], the Disclosure Decision Making Model [DDMM], and the Disclosure Process Model [DPM]), and the existing studies that could provide empirical support to these models or their components. For each model, we briefly reviewed its theoretical background, described its components and/or mechanisms, and discussed its strengths and limitations. The existing empirical studies supported most theoretical components of these models. However, hypotheses related to the mechanisms proposed in the models have not yet been tested due to a lack of empirical evidence. This study also synthesized alternative theoretical perspectives and new issues in disclosure research and clinical practice that may challenge the existing models. The current study underscores the importance of including components related to social and cultural contexts in theoretical frameworks, and calls for more adequately designed empirical studies in order to test and refine existing theories and to develop new ones.

  8. Towards a Semantically-Enabled Control Strategy for Building Simulations: Integration of Semantic Technologies and Model Predictive Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.

    State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend toward enabling control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information based on the models through use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.

  9. 'Constraint consistency' at all orders in cosmological perturbation theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nandi, Debottam; Shankaranarayanan, S., E-mail: debottam@iisertvm.ac.in, E-mail: shanki@iisertvm.ac.in

    2015-08-01

    We study the equivalence of two approaches to cosmological perturbation theory, order-by-order Einstein's equations and the reduced action, at all orders for different models of inflation. We point out a crucial consistency check, which we refer to as the 'Constraint consistency' condition, that needs to be satisfied in order for the two approaches to lead to an identical single-variable equation of motion. The method we propose here is a quick and efficient way to check this consistency for any model, including modified gravity models. Our analysis points out an important feature which is crucial for inflationary model building: all 'constraint'-inconsistent models have higher-order Ostrogradsky instabilities, but the reverse is not true. In other words, a model can have a consistent constrained Lapse function and Shift vector and yet still have Ostrogradsky instabilities. We also obtain the single-variable equation for a non-canonical scalar field in the limit of power-law inflation for the second-order perturbed variables.

  10. Model Checking JAVA Programs Using Java Pathfinder

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Pressburger, Thomas

    2000-01-01

    This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen as part of a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze the program. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.
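
    The core of SPIN-style verification after such a translation is an exhaustive search of the reachable state space. The following minimal explicit-state checker is a generic sketch of that idea (not JPF's actual translation or SPIN itself): it breadth-first-searches the state graph and returns the shortest counter-example trace to an invariant violation.

```python
from collections import deque

def check_invariant(init, successors, invariant):
    """Minimal explicit-state model checker: breadth-first search over
    all reachable states, returning the shortest counter-example trace
    to a state violating the invariant, or None if it holds everywhere."""
    parent = {init: None}
    queue = deque([init])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            trace = []                  # rebuild the path back to the start
            while state is not None:
                trace.append(state)
                state = parent[state]
            return trace[::-1]
        for nxt in successors(state):
            if nxt not in parent:       # visited-set check avoids re-exploration
                parent[nxt] = state
                queue.append(nxt)
    return None

# Toy model: a counter mod 5. The claimed invariant "counter < 4" is
# violated, and BFS returns the shortest counter-example trace.
trace = check_invariant(0, lambda s: [(s + 1) % 5], lambda s: s < 4)
```

    Counter-example traces like this are what make model checking useful for debugging: they show not just that a property fails, but how the program gets there.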

  11. Rewriting Modulo SMT

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.

    2013-01-01

    Combining symbolic techniques such as: (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism due to the system, and external non-determinism due to the environment. They are not amenable to finite-state model checking analysis because they typically are infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed automata methods.

  12. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    NASA Technical Reports Server (NTRS)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed plan of action that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.

  13. Development of flank wear model of cutting tool by using adaptive feedback linear control system on machining AISI D2 steel and AISI 4340 steel

    NASA Astrophysics Data System (ADS)

    Orra, Kashfull; Choudhury, Sounak K.

    2016-12-01

    The purpose of this paper is to build an adaptive feedback linear control system that checks the variation of the cutting force signal in order to improve tool life. The paper discusses the use of a transfer function approach to improve the mathematical modelling and adaptively control the process dynamics of the turning operation. The experimental results are in agreement with the simulation model, with an error of less than 3%. The state space model used in this paper successfully checks the adequacy of the control system through the controllability and observability test matrices, and the system can be transferred from one state to another by an appropriate input control in finite time. The proposed system can be implemented for other machining processes under a varying range of cutting conditions to improve the efficiency and observability of the system.
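
    The controllability test mentioned above is the standard Kalman rank test. As a generic illustration of that check (plain-float Gaussian elimination on small matrices; this is not the paper's machining model), a pair (A, B) is controllable exactly when the controllability matrix has full row rank:

```python
def controllable(A, B):
    """Kalman rank test: (A, B) is controllable iff the controllability
    matrix [B, AB, A^2 B, ..., A^(n-1) B] has full row rank n."""
    n, m = len(A), len(B[0])
    cols = [list(col) for col in zip(*B)]            # columns of B
    for _ in range(n - 1):
        # Multiply the most recent batch of m columns by A.
        new = [[sum(A[i][k] * c[k] for k in range(n)) for i in range(n)]
               for c in cols[-m:]]
        cols.extend(new)
    # Row-reduce the stacked column vectors to count independent ones.
    rows, rank = [c[:] for c in cols], 0
    for j in range(n):
        piv = next((r for r in range(rank, len(rows))
                    if abs(rows[r][j]) > 1e-9), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for r in range(len(rows)):
            if r != rank and abs(rows[r][j]) > 1e-9:
                f = rows[r][j] / rows[rank][j]
                rows[r] = [x - f * y for x, y in zip(rows[r], rows[rank])]
        rank += 1
    return rank == n
```

    The observability test is the dual check on the pair (A^T, C^T).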

  14. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short given the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident, since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  15. Modelling and Analysis of the Excavation Phase by the Theory of Blocks Method of Tunnel 4 Kherrata Gorge, Algeria

    NASA Astrophysics Data System (ADS)

    Boukarm, Riadh; Houam, Abdelkader; Fredj, Mohammed; Boucif, Rima

    2017-12-01

    The aim of our work is to check stability during tunnel excavation work in the rock mass of Kherrata, on the route connecting the cities of Bejaia and Setif. The characterization methods based on the Q system (Barton's method) and RMR (Bieniawski's classification) allowed us to conclude that the quality of the rock mass is average in limestone and poor in fractured limestone. Modelling of the excavation phase using the theory of blocks method (the UNWEDGE software), with parameters taken from the recommendations of the classifications, then allowed us to check stability and to conclude that the use of geomechanical classifications and the theory of blocks can be considered reliable in preliminary design.

  16. Automated Assume-Guarantee Reasoning by Abstraction Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2008-01-01

    Current automated approaches for compositional model checking in the assume-guarantee style are based on learning of assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative and not necessarily deterministic abstractions of some of the components, and refines those abstractions using counter-examples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves similar or better performance than a previous learning-based implementation.

  17. Portable Wireless LAN Device and Two-Way Radio Threat Assessment for Aircraft VHF Communication Radio Band

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  18. Cross-sectional review of the response and treatment uptake from the NHS Health Checks programme in Stoke on Trent.

    PubMed

    Cochrane, Thomas; Gidlow, Christopher J; Kumar, Jagdish; Mawby, Yvonne; Iqbal, Zafar; Chambers, Ruth M

    2013-03-01

    As part of national policy to manage the increasing burden of chronic diseases, the Department of Health in England has launched the NHS Health Checks programme, which aims to reduce the burden of the major vascular diseases on the health service. A cross-sectional review of response, attendance and treatment uptake over the first year of the programme in Stoke on Trent was carried out. Patients aged between 32 and 74 years and estimated to be at ≥20% risk of developing cardiovascular disease were identified from electronic medical records. Multi-level regression modelling was used to evaluate the influence of individual- and practice-level factors on health check outcomes. Overall 63.3% of patients responded, 43.7% attended and 29.8% took up a treatment following their health check invitation. The response was higher for older age and more affluent areas; attendance and treatment uptake were higher for males and older age. Variance between practices was significant (P < 0.001) for response (13.4%), attendance (12.7%) and uptake (23%). The attendance rate of 43.7% following invitation to a health check was considerably lower than the benchmark of 75%. The lack of public interest and the prevalence of significant comorbidity are challenges to this national policy innovation.

  19. Model Development for EHR Interdisciplinary Information Exchange of ICU Common Goals

    PubMed Central

    Collins, Sarah A.; Bakken, Suzanne; Vawdrey, David K.; Coiera, Enrico; Currie, Leanne

    2010-01-01

    Purpose Effective interdisciplinary exchange of patient information is an essential component of safe, efficient, and patient-centered care in the intensive care unit (ICU). Frequent handoffs of patient care, high acuity of patient illness, and the increasing amount of available data complicate information exchange. Verbal communication can be affected by interruptions and time limitations. To supplement verbal communication, many ICUs rely on documentation in electronic health records (EHRs) to reduce errors of omission and information loss. The purpose of this study was to develop a model of EHR interdisciplinary information exchange of ICU common goals. Methods The theoretical frameworks of distributed cognition and the clinical communication space were integrated and a previously published categorization of verbal information exchange was used. 59.5 hours of interdisciplinary rounds in a Neurovascular ICU were observed and five interviews and one focus group with ICU nurses and physicians were conducted. Results Current documentation tools in the ICU were not sufficient to capture the nurses' and physicians' collaborative decision-making and verbal communication of goal-directed actions and interactions. Clinicians perceived the EHR to be inefficient for information retrieval, leading to a further reliance on verbal information exchange. Conclusion The model suggests that EHRs should support: 1) Information tools for the explicit documentation of goals, interventions, and assessments with synthesized and summarized information outputs of events and updates; and 2) Messaging tools that support collaborative decision-making and patient safety double checks that currently occur between nurses and physicians in the absence of EHR support. PMID:20974549

  20. Spray algorithm without interface construction

    NASA Astrophysics Data System (ADS)

    Al-Kadhem Majhool, Ahmed Abed; Watkins, A. P.

    2012-05-01

    This research aims to create a new and robust family of convective schemes to capture the interface between the dispersed and carrier phases of a spray without the need to build up the interface boundary. The Weighted Average Flux (WAF) scheme was selected because it is designed to deal with random flux schemes and is second-order accurate in space and time. The convective flux at each cell face utilizes the WAF scheme blended with the Switching Technique for Advection and Capturing of Surfaces (STACS) scheme for high-resolution flux limiters. In the next step, the high-resolution scheme is blended with the WAF scheme to provide sharpness and boundedness of the interface by using a switching strategy. In this work, the Eulerian-Eulerian framework of a non-reactive turbulent spray is set in terms of the proposed methodology of spray moments of the drop size distribution, presented by Beck and Watkins [1]. The computational spray model avoids the need to segregate the local droplet number distribution into parcels of identical droplets. The proposed scheme is tested on capturing the spray edges in modelling hollow cone sprays without the need to reconstruct the two-phase interface. A simple comparison is made between the TVD and WAF schemes using the same flux limiter on a convective-flow hollow cone spray. Results show that the WAF scheme gives a better prediction than the TVD scheme. The only way to check the accuracy of the presented models is by evaluating the spray sheet thickness.
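
    Both the TVD and WAF schemes compared above evaluate a flux limiter at each cell face as a function of the ratio r of consecutive solution gradients. As a generic illustration of the kind of limiter involved (two standard TVD limiters; the paper's STACS blending is not reproduced here):

```python
def minmod(r):
    """Minmod limiter: the most diffusive classical TVD limiter."""
    return max(0.0, min(1.0, r))

def superbee(r):
    """Superbee limiter: the least diffusive classical TVD limiter."""
    return max(0.0, min(2.0 * r, 1.0), min(r, 2.0))
```

    Both limiters vanish for r <= 0 (local extrema, where the scheme falls back to first order) and equal 1 at r = 1 (smooth data), which is what keeps the captured interface bounded and free of spurious oscillations.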

  1. A Self-Stabilizing Distributed Clock Synchronization Protocol for Arbitrary Digraphs

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2011-01-01

    This report presents a self-stabilizing distributed clock synchronization protocol in the absence of faults in the system. It is focused on the distributed clock synchronization of an arbitrary, non-partitioned digraph ranging from fully connected to 1-connected networks of nodes while allowing for differences in the network elements. This protocol does not rely on assumptions about the initial state of the system, other than the presence of at least one node, and no central clock or a centrally generated signal, pulse, or message is used. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the node is that the interactions with other nodes are restricted to defined links and interfaces. We present an outline of a deductive proof of the correctness of the protocol. A model of the protocol was mechanically verified using the Symbolic Model Verifier (SMV) for a variety of topologies. Results of the mechanical proof of the correctness of the protocol are provided. The model checking results have verified the correctness of the protocol as they apply to the networks with unidirectional and bidirectional links. In addition, the results confirm the claims of determinism and linear convergence. As a result, we conjecture that the protocol solves the general case of this problem. We also present several variations of the protocol and discuss that this synchronization protocol is indeed an emergent system.

  2. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
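
    The probabilistic matrices mentioned above can be read as per-check detection probabilities for each fault. As a hedged sketch of that style of evaluation (independent checks are assumed here; this is not the paper's exact matrix-based model), the probability that at least one on-line check catches a given fault is:

```python
def detection_probability(check_probs):
    """P(at least one check detects the fault) = 1 - prod_j (1 - p_j),
    where p_j is check j's probability of catching this particular
    fault and checks are assumed to miss independently."""
    p_miss = 1.0
    for p in check_probs:
        p_miss *= 1.0 - p
    return 1.0 - p_miss

# Two checks covering the same fault at 0.9 and 0.5 jointly detect it
# with probability 0.95.
combined = detection_probability([0.9, 0.5])
```

    Repeating this per fault and per check set yields estimates of system-level detection and location coverage, in the spirit of the probabilistic evaluation described above.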

  3. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A Service Workflow is an aggregation of distributed services to fulfill specific functionalities. With ever increasing available services, the methodologies for the selections of the services against the given requirements become main research subjects in multiple disciplines. A few of researchers have contributed to the formal specification languages and the methods for model checking; however, existing methods have the difficulties to tackle with the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of the workflow composition. To this end, we extend a specification language with the consideration of formal logic, so that some effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec has been demonstrated by the examples with the addressed soundness, completeness, and consistency.

  4. From Care to Cure: Demonstrating a Model of Clinical Patient Navigation for Hepatitis C Care and Treatment in High-Need Patients.

    PubMed

    Ford, Mary M; Johnson, Nirah; Desai, Payal; Rude, Eric; Laraque, Fabienne

    2017-03-01

    The NYC Department of Health implemented a patient navigation program, Check Hep C, to address patient and provider barriers to HCV care and potentially lifesaving treatment. Services were delivered at two clinical care sites and two sites that linked patients to off-site care. Working with a multidisciplinary care team, patient navigators provided risk assessment, health education, treatment readiness and medication adherence counseling, and medication coordination. Between March 2014 and January 2015, 388 participants enrolled in Check Hep C, 129 (33%) initiated treatment, and 119 (91% of initiators) had sustained virologic response (SVR). Participants receiving on-site clinical care had higher odds of initiating treatment than those linked to off-site care. Check Hep C successfully supported high-need participants through HCV care and treatment, and SVR rates demonstrate the real-world feasibility of achieving high cure rates using patient navigation care models.

  5. A full-dimensional multilayer multiconfiguration time-dependent Hartree study on the ultraviolet absorption spectrum of formaldehyde oxide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Qingyong, E-mail: mengqingyong@dicp.ac.cn; Meyer, Hans-Dieter, E-mail: hans-dieter.meyer@pci.uni-heidelberg.de

    2014-09-28

    Employing the multilayer multiconfiguration time-dependent Hartree (ML-MCTDH) method in conjunction with the multistate multimode vibronic coupling Hamiltonian (MMVCH) model, we perform a full-dimensional (9D) quantum dynamical study on the simplest Criegee intermediate, formaldehyde oxide, in its five lower-lying singlet electronic states. The ultraviolet (UV) spectrum is then simulated by a Fourier transform of the auto-correlation function. The MMVCH model is built from extensive MRCI(8e,8o)/aug-cc-pVTZ calculations. To ensure fast convergence of the final calculations, a large number of ML-MCTDH test calculations is performed to find appropriate multilayer separations (ML-trees) of the ML-MCTDH nuclear wave functions, and the dynamical calculations are carefully checked to ensure that they are well converged. To compare computational efficiency, standard MCTDH simulations using the same Hamiltonian are also performed. A comparison of the MCTDH and ML-MCTDH calculations shows that even for the present not-too-large system (9D here) the ML-MCTDH calculations can save a considerable amount of computational resources while producing spectra identical to those of the MCTDH calculations. Furthermore, the present theoretical B̃ ¹A′ ← X̃ ¹A′ UV spectral band and the corresponding experimental measurements [J. M. Beames, F. Liu, L. Lu, and M. I. Lester, J. Am. Chem. Soc. 134, 20045–20048 (2012); L. Sheps, J. Phys. Chem. Lett. 4, 4201–4205 (2013); W.-L. Ting, Y.-H. Chen, W. Chao, M. C. Smith, and J. J.-M. Lin, Phys. Chem. Chem. Phys. 16, 10438–10443 (2014)] are discussed. To the best of our knowledge, this is the first theoretical UV spectrum simulated for this molecule that includes nuclear motion beyond an adiabatic harmonic approximation.

  6. NASTRAN data generation and management using interactive graphics

    NASA Technical Reports Server (NTRS)

    Smootkatow, M.; Cooper, B. M.

    1972-01-01

    A method of using an interactive graphics device to generate a large portion of the input bulk data with visual checks of the structure and the card images is described. The generation starts from GRID and PBAR cards. The visual checks result from a three-dimensional display of the model in any rotated position. By detailing the steps, the time saving and cost effectiveness of this method may be judged, and its potential as a useful tool for the structural analyst may be established.

  7. Development of Subischial Prosthetic Sockets with Vacuum-Assisted Suspension for Highly Active Persons with Transfemoral Amputations

    DTIC Science & Technology

    2016-12-01

    of the frame from the combined image files and ensure total contact between the frame geometry, ultimately modeled independently as a solid, and...fitting with a rigid PETG check socket to ensure correct volumes and total contact at the distal end has been achieved, a second check socket can be...from dynamically conforming to changes in residual limb shape and volume during gait (Sanders, 2009). The ensuing separation (i.e. loss of contact

  8. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check the consistency of programs with respect to variables. Dynamic analyzers, in contrast, depend on input and the execution of a program, providing the ability to find errors that cannot be detected through static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input/output data.

  9. Comparison of ANN and RKS approaches to model SCC strength

    NASA Astrophysics Data System (ADS)

    Prakash, Aravind J.; Sathyan, Dhanya; Anand, K. B.; Aravind, N. R.

    2018-02-01

    Self-compacting concrete (SCC) is a high-performance concrete that has high flowability and can be used in heavily reinforced concrete members with minimal compaction while resisting segregation and bleeding. The mix proportioning of SCC is highly complex, and a large number of trials is required to obtain a mix with the desired properties, resulting in wasted materials and time. Research on SCC has been highly empirical, and no theoretical relationships have been developed between the mixture proportioning and the engineering properties of SCC. In this work, the effectiveness of an artificial neural network (ANN) and of the random kitchen sink algorithm (RKS) with the regularized least squares algorithm (RLS) in predicting the split tensile strength of SCC is analysed. The random kitchen sink algorithm is used to map data to a higher dimension, and this data is then classified using the regularized least squares algorithm. The training and testing data for the algorithms were obtained experimentally using standard test procedures and available materials. A total of 40 trials was carried out and used as training and testing data. Trials were performed by varying the amounts of fine aggregate, coarse aggregate, and water, and the dosage and type of superplasticizer. The prediction accuracy of the ANN and RKS models is checked by comparing their RMSE values. The analysis shows that although the RKS model is well suited to large data sets, its prediction accuracy is as good as that of a conventional prediction method such as ANN, so the split tensile strength model developed by RKS can be used in industry for proportioning SCC with tailor-made properties.
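The random-kitchen-sink-plus-RLS pipeline described above can be sketched in a few lines: map the inputs through random cosine features, then fit a ridge (regularized least squares) model on those features. The data below is synthetic, standing in for the 40 experimental SCC trials, and the feature count, bandwidth, and regularization constant are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 40 SCC trial mixes with 5 proportioning variables.
X = rng.uniform(0, 1, size=(40, 5))
y = np.sin(X @ np.array([1.0, 2.0, 0.5, 1.5, 1.0])) + 0.05 * rng.standard_normal(40)

# Random kitchen sinks: project inputs through random cosine features.
D, gamma = 200, 1.0                      # feature count and bandwidth (assumed)
W = rng.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], D))
b = rng.uniform(0, 2 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Regularized least squares on the random features.
lam = 1e-3
w = np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y)

rmse = np.sqrt(np.mean((Z @ w - y) ** 2))
print(f"training RMSE: {rmse:.4f}")
```

The same fitted weights `w` would then be applied to the random features of held-out trials to estimate prediction RMSE, which is the quantity compared against the ANN in the abstract.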

  10. Retrospective checking of compliance with practice guidelines for acute stroke care: a novel experiment using openEHR’s Guideline Definition Language

    PubMed Central

    2014-01-01

    Background Providing scalable clinical decision support (CDS) across institutions that use different electronic health record (EHR) systems has been a challenge for medical informatics researchers. The lack of commonly shared EHR models and terminology bindings has been recognised as a major barrier to sharing CDS content among different organisations. The openEHR Guideline Definition Language (GDL) expresses CDS content based on openEHR archetypes and can support any clinical terminologies or natural languages. Our aim was to explore in an experimental setting the practicability of GDL and its underlying archetype formalism. A further aim was to report on the artefacts produced by this new technological approach in this particular experiment. We modelled and automatically executed compliance checking rules from clinical practice guidelines for acute stroke care. Methods We extracted rules from the European clinical practice guidelines as well as from treatment contraindications for acute stroke care and represented them using GDL. Then we executed the rules retrospectively on 49 mock patient cases to check the cases’ compliance with the guidelines, and manually validated the execution results. We used openEHR archetypes, GDL rules, the openEHR reference information model, reference terminologies and the Data Archetype Definition Language. We utilised the open-sourced GDL Editor for authoring GDL rules, the international archetype repository for reusing archetypes, the open-sourced Ocean Archetype Editor for authoring or modifying archetypes and the CDS Workbench for executing GDL rules on patient data. Results We successfully represented clinical rules about 14 out of 19 contraindications for thrombolysis and other aspects of acute stroke care with 80 GDL rules. These rules are based on 14 reused international archetypes (one of which was modified), 2 newly created archetypes and 51 terminology bindings (to three terminologies). 
Our manual compliance checks for the 49 mock patients matched the automated compliance results completely. Conclusions Shareable guideline knowledge for use in automated retrospective checking of guideline compliance may be achievable using GDL. Whether the same GDL rules can be used for at-the-point-of-care CDS remains unknown. PMID:24886468
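Retrospective rule execution of this kind can be illustrated with a toy sketch: each guideline rule is a named predicate over a patient record, and compliance checking reports which rules a record violates. The rule names, thresholds, and patient fields below are invented stand-ins for illustration, not actual GDL rules or openEHR archetypes.

```python
# Toy guideline-compliance checker (all rule content is invented).
RULES = [
    ("age_over_18", lambda p: p["age"] >= 18),
    ("onset_within_4_5_hours", lambda p: p["hours_since_onset"] <= 4.5),
    ("no_recent_bleeding", lambda p: not p["recent_bleeding"]),
]

def check_compliance(patient):
    """Return the names of guideline rules the record violates."""
    return [name for name, rule in RULES if not rule(patient)]

patient = {"age": 67, "hours_since_onset": 3.0, "recent_bleeding": True}
print(check_compliance(patient))  # -> ['no_recent_bleeding']
```

Running such a checker over a batch of mock patient records and comparing the output against manual review is the retrospective validation pattern the study describes.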

  11. SimCheck: An Expressive Type System for Simulink

    NASA Technical Reports Server (NTRS)

    Roy, Pritam; Shankar, Natarajan

    2010-01-01

    MATLAB Simulink is a member of a class of visual languages that are used for modeling and simulating physical and cyber-physical systems. A Simulink model consists of blocks with input and output ports connected using links that carry signals. We extend the type system of Simulink with annotations and dimensions/units associated with ports and links. These types can capture invariants on signals as well as relations between signals. We define a type-checker that checks the well-formedness of Simulink blocks with respect to these type annotations. The type checker generates proof obligations that are solved by SRI's Yices solver for satisfiability modulo theories (SMT). This translation can be used to detect type errors, demonstrate counterexamples, generate test cases, or prove the absence of type errors. Our work is an initial step toward the symbolic analysis of MATLAB Simulink models.
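The idea of checking dimensions/units on ports and links can be sketched in a few lines: each port carries a unit expressed as base-dimension exponents, and a link is well-formed only when its two ends agree. This is a minimal illustration of the general technique, not SimCheck's actual annotation language or its SMT translation; all unit and block names are assumptions.

```python
# Minimal dimension checker: units as {base_dimension: exponent} maps.

def unit_mul(u, v):
    """Unit of a product block's output: add exponents, drop zeros."""
    out = dict(u)
    for dim, exp in v.items():
        out[dim] = out.get(dim, 0) + exp
        if out[dim] == 0:
            del out[dim]
    return out

METRE_PER_SEC = {"m": 1, "s": -1}
SECOND = {"s": 1}

# A product block multiplying velocity by time should yield metres.
assert unit_mul(METRE_PER_SEC, SECOND) == {"m": 1}

def check_link(src_unit, dst_unit):
    """A link type-checks only when both ends carry the same unit."""
    return src_unit == dst_unit

print(check_link({"m": 1}, {"m": 1}))   # well-typed link
print(check_link({"m": 1}, {"s": 1}))   # unit mismatch: type error
```

An SMT-based checker like the one in the abstract goes further, discharging invariants and signal relations as proof obligations, but the unit-agreement check above is the base case of that analysis.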

  12. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions for solving linear algebraic equations in one variable. A total of 350 working schemes comprising 1385 responses was collected using a marking engine prototype, which has been developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
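The tokenize-and-compare idea behind such stepwise checking can be sketched as follows: split each equation step into tokens, treat the steps as multisets of tokens, and score their overlap. The token grammar and the multiset-Jaccard score used here are illustrative assumptions, not the actual SCCS scoring formula.

```python
import re
from collections import Counter

def tokenize(eq):
    """Split an equation string into number, variable and operator tokens."""
    return re.findall(r"\d+|[a-zA-Z]+|[=+\-*/()]", eq)

def similarity(step, reference):
    """Multiset Jaccard overlap between two equation steps (an assumption)."""
    a, b = Counter(tokenize(step)), Counter(tokenize(reference))
    inter = sum((a & b).values())
    union = sum((a | b).values())
    return inter / union if union else 1.0

print(similarity("2*x+3=7", "2*x + 3 = 7"))  # -> 1.0 (same structure)
print(similarity("2*x=4", "2*x + 3 = 7"))    # -> 0.5 (partial credit)
```

A quantitative per-step score like this is what distinguishes the approach from a plain equivalence check, which would only return right or wrong.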

  13. Specification and Verification of Web Applications in Rewriting Logic

    NASA Astrophysics Data System (ADS)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.

  14. Experimental Evaluation of a Planning Language Suitable for Formal Verification

    NASA Technical Reports Server (NTRS)

    Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2008-01-01

    The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired by the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking to search for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.

  15. Formal Verification of the Runway Safety Monitor

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu; Ciardo, Gianfranco

    2006-01-01

    The Runway Safety Monitor (RSM) designed by Lockheed Martin is part of NASA's effort to reduce runway accidents. We developed a Petri net model of the RSM protocol and used the model checking functions of our tool SMART to investigate a number of safety properties in RSM. To mitigate the impact of state-space explosion, we built a highly discretized model of the system, obtained by partitioning the monitored runway zone into a grid of smaller volumes and by considering scenarios involving only two aircraft. The model also assumes that there are no communication failures, such as bad input from radar or lack of incoming data, thus it relies on a consistent view of reality by all participants. In spite of these simplifications, we were able to expose potential problems in the RSM conceptual design. Our findings were forwarded to the design engineers, who undertook corrective action. Additionally, the results stress the efficiency attained by the new model checking algorithms implemented in SMART, and demonstrate their applicability to real-world systems.

  16. Bases of creation of new concept in global tectonics

    NASA Astrophysics Data System (ADS)

    Anokhin, Vladimir

    2014-05-01

    As new facts about the structure of the Earth accumulate, the existing plate paradigm becomes increasingly doubtful. In practice it is sustained by the opinion of the majority of theoretical specialists interested in its preservation and by the substantial use of administrative resources. The author knows totalitarianism well and regretfully sees its signs in the monopolistic domination of world geotectonics by the «only correct» plate tectonics theory. Scientists who gather factual material in the field are mostly skeptical of plate theory, to the extent that they trust their own eyes more than books. Believing that science is a search for truth, not only for grants, the author proposes to critically reconsider the state of modern geotectonics and to look for a way out of the impasse. Obviously, if we are not satisfied with the existing paradigm, we should not limit ourselves to criticizing it; we must seek an alternative concept while avoiding the errors for which we criticize plate tectonics. The new concept should be based on all the facts, using only the necessary minimum of modeling. The author proposes the following methodological principles for creating the concept: strict adherence to scientific logic; constant application of the principle of Occam's razor; ranking of existing tectonic information into groups, in descending order of reliability: 1) established facts, 2) facts to be checked, 3) empirical generalizations, 4) physical and other models, including the facts and their generalizations, 5) theoretical constructions based on empirical generalizations and models, 6) hypotheses arising from grounded theoretical constructions, 7) concepts, 8) ideas (a professor's theory or idea can be worth less than a fact from a student);
    generalization and rethinking of the information according to the indicated ranking, including information outside the bounds of the paradigm; and establishment of boundary conditions on the applicability and admissible consequences of every newly created construct, with strict adherence to these restrictions. In the new geotectonics there may be room for some synthesis with certain provisions of plate tectonics, provided they are consistent with the above principles.

  17. Laser-induced retinal damage thresholds for annular retinal beam profiles

    NASA Astrophysics Data System (ADS)

    Kennedy, Paul K.; Zuclich, Joseph A.; Lund, David J.; Edsall, Peter R.; Till, Stephen; Stuck, Bruce E.; Hollins, Richard C.

    2004-07-01

    The dependence of retinal damage thresholds on laser spot size, for annular retinal beam profiles, was measured in vivo for 3 μs, 590 nm pulses from a flashlamp-pumped dye laser. Minimum Visible Lesion (MVL) ED50 thresholds in rhesus were measured for annular retinal beam profiles covering 5, 10, and 20 mrad of visual field, which correspond to outer beam diameters of roughly 70, 160, and 300 μm, respectively, on the primate retina. Annular beam profiles at the retinal plane were achieved using a telescopic imaging system, with the focal properties of the eye represented as an equivalent thin lens, and all annular beam profiles had a 37% central obscuration. As a check on the experimental data, theoretical MVL ED50 thresholds for annular beam exposures were calculated using the Thompson-Gerstman granular model of laser-induced thermal damage to the retina. Threshold calculations were performed for the three experimental beam diameters and for an intermediate case with an outer beam diameter of 230 μm. Results indicate that the threshold vs. spot size trends for annular beams are similar to the trends for top hat beams determined in a previous study; i.e., the threshold dose varies with the retinal image area for larger image sizes. The model correctly predicts the threshold vs. spot size trends seen in the biological data, for both annular and top hat retinal beam profiles.

  18. Magnetic Properties of nickel hydroxide layers 30 Å apart obtained by intercalation with dodecyl sulfate ion

    NASA Astrophysics Data System (ADS)

    Shmavonyan, Gagik; Zadoyan, Ovsanna

    2013-03-01

    Magnetic systems with reduced dimensionality make good test beds for checks on theoretical models. Here, changes in the nature of magnetic ordering in a quasi-2D system of layered Ni hydroxides (LH-Ni-) with variations in the interlayer spacing c are investigated. Magnetic properties of LH-Ni-DS with c ~ 30 Å, synthesized by intercalating the dodecyl sulfate ion (C12H25OSO3)- between the layers, are compared with those of LH-Ni-Ac (c ~ 8.5 Å) containing the acetate (Ac) ligand. Measurements included magnetization M vs. T and H, ac susceptibilities (f = 0.1 Hz - 1000 Hz), and EMR (Electron Magnetic Resonance) spectra at 9.28 GHz. Results show that, just like LH-Ni-Ac, LH-Ni-DS also orders ferromagnetically, but with Tc ~ 23 K, about 45% larger than the Tc ~ 16 K reported for LH-Ni-Ac. In the EMR studies, the linewidth is strongly temperature-dependent, decreasing with decreasing T from 300 K, reaching a minimum near 45 K, and then increasing sharply for T < 45 K, the latter due to short-range magnetic ordering. These results differ from the model of Drillon et al., in which interlayer dipolar interaction between clusters of correlated spins in the layers yields a TC nearly independent of c. The roles of magnetic anisotropy and exchange constants in determining TC in the LH-Ni systems are discussed.

  19. Wave-vortex interactions in the nonlinear Schrödinger equation

    NASA Astrophysics Data System (ADS)

    Guo, Yuan; Bühler, Oliver

    2014-02-01

    This is a theoretical study of wave-vortex interaction effects in the two-dimensional nonlinear Schrödinger equation, which is a useful conceptual model for the limiting dynamics of superfluid quantum condensates at zero temperature. The particular wave-vortex interaction effects are associated with the scattering and refraction of small-scale linear waves by the straining flows induced by quantized point vortices and, crucially, with the concomitant nonlinear back-reaction, the remote recoil, that these scattered waves exert on the vortices. Our detailed model is a narrow, slowly varying wavetrain of small-amplitude waves refracted by one or two vortices. Weak interactions are studied using a suitable perturbation method in which the nonlinear recoil force on the vortex then arises at second order in wave amplitude, and is computed in terms of a Magnus-type force expression for both finite and infinite wavetrains. In the case of an infinite wavetrain, an explicit asymptotic formula for the scattering angle is also derived and cross-checked against numerical ray tracing. Finally, under suitable conditions a wavetrain can be so strongly refracted that it collapses all the way onto a zero-size point vortex. This is a strong wave-vortex interaction by definition. The conditions for such a collapse are derived and the validity of ray tracing theory during the singular collapse is investigated.

  20. Local electronic structure and nanolevel hierarchical organization of bone tissue: theory and NEXAFS study

    NASA Astrophysics Data System (ADS)

    Pavlychev, A. A.; Avrunin, A. S.; Vinogradov, A. S.; Filatova, E. O.; Doctorov, A. A.; Krivosenko, Yu S.; Samoilenko, D. O.; Svirskiy, G. I.; Konashuk, A. S.; Rostov, D. A.

    2016-12-01

    Theoretical and experimental investigations of native bone are carried out to understand the relationships between its hierarchical organization and the local electronic and atomic structure of the mineralized phase. A 3D superlattice model of a coplanar assembly of hydroxyapatite (HAP) nanocrystallites separated by hydrated nanolayers is introduced to account for the interplay of short-, long- and super-range order parameters in bone tissue. The model is applied to (i) predict and rationalize the HAP-to-bone spectral changes in the electronic structure and (ii) describe the mechanisms linking the hierarchical organization with the electronic structure of the mineralized phase in bone. To check the predictions, the near-edge x-ray absorption fine structure (NEXAFS) at the Ca 2p, P 2p and O 1s thresholds is measured for native bone and compared with NEXAFS for reference compounds. The NEXAFS analysis has demonstrated essential hierarchy-induced HAP-to-bone red shifts of the Ca and P 2p-to-valence transitions. The lowest O 1s excitation line at 532.2 eV in bone is assigned to a superposition of core transitions in the hydroxide OH-(H2O)m anions, Ca2+(H2O)n cations, the carboxyl groups inside the collagen, and [PO4]2- and [PO4]- anions with unsaturated P-O bonds.
