A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.
Grando, M Adela; Glasspool, David; Fox, John
2012-01-01
To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PNP requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results to a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
An object-oriented description method of EPMM process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Yang, Fan
2017-06-01
In order to use the object-oriented mature tools and language in software process model, make the software process model more accord with the industrial standard, it’s necessary to study the object-oriented modelling of software process. Based on the formal process definition in EPMM, considering the characteristics that Petri net is mainly formal modelling tool and combining the Petri net modelling with the object-oriented modelling idea, this paper provides this implementation method to convert EPMM based on Petri net into object models based on object-oriented description.
An Evaluation of the Preceptor Model versus the Formal Teaching Model.
ERIC Educational Resources Information Center
Shamian, Judith; Lemieux, Suzanne
1984-01-01
This study evaluated the effectiveness of two teaching methods to determine which is more effective in enhancing the knowledge base of participating nurses: the preceptor model embodies decentralized instruction by a member of the nursing staff, and the formal teaching model uses centralized teaching by the inservice education department. (JOW)
Landau's statistical mechanics for quasi-particle models
NASA Astrophysics Data System (ADS)
Bannur, Vishnu M.
2014-04-01
Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develop all thermodynamics. It is a general formalism and consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for energy density and develop thermodynamics. Both the formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called thermodynamic consistent relation, we recover other formalism of quasi-particle system, like in M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.
Formal reasoning about systems biology using theorem proving
Hasan, Osman; Siddique, Umair; Tahar, Sofiène
2017-01-01
System biology provides the basis to understand the behavioral properties of complex biological organisms at different levels of abstraction. Traditionally, analysing systems biology based models of various diseases have been carried out by paper-and-pencil based proofs and simulations. However, these methods cannot provide an accurate analysis, which is a serious drawback for the safety-critical domain of human medicine. In order to overcome these limitations, we propose a framework to formally analyze biological networks and pathways. In particular, we formalize the notion of reaction kinetics in higher-order logic and formally verify some of the commonly used reaction based models of biological networks using the HOL Light theorem prover. Furthermore, we have ported our earlier formalization of Zsyntax, i.e., a deductive language for reasoning about biological networks and pathways, from HOL4 to the HOL Light theorem prover to make it compatible with the above-mentioned formalization of reaction kinetics. To illustrate the usefulness of the proposed framework, we present the formal analysis of three case studies, i.e., the pathway leading to TP53 Phosphorylation, the pathway leading to the death of cancer stem cells and the tumor growth based on cancer stem cells, which is used for the prognosis and future drug designs to treat cancer patients. PMID:28671950
Modeling formalisms in Systems Biology
2011-01-01
Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
ERIC Educational Resources Information Center
Goldratt, Miri; Cohen, Eric H.
2016-01-01
This article explores encounters between formal, informal, and non-formal education and the role of mentor-educators in creating values education in which such encounters take place. Mixed-methods research was conducted in Israeli public schools participating in the Personal Education Model, which combines educational modes. Ethnographic and…
ERIC Educational Resources Information Center
Mars, Matthew M.; Ball, Anna L.
2016-01-01
The mainstream agricultural literacy movement has been mostly focused on school-based learning through formal curricula and standardized non-formal models (e.g., FFA, 4-H). The purpose of the current study is to qualitatively explore through a grounded theory approach, the development, sharing, and translation of diverse forms of agricultural…
Verification of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.
NASA Technical Reports Server (NTRS)
Hall, Brendan; Driscoll, Kevin; Schweiker, Kevin; Dutertre, Bruno
2013-01-01
Within distributed fault-tolerant systems the term force-fight is colloquially used to describe the level of command disagreement present at redundant actuation interfaces. This report details an investigation of force-fight using three distributed system case-study architectures. Each case study architecture is abstracted and formally modeled using the Symbolic Analysis Laboratory (SAL) tool chain from the Stanford Research Institute (SRI). We use the formal SAL models to produce k-induction based proofs of a bounded actuation agreement property. We also present a mathematically derived bound of redundant actuation agreement for sine-wave stimulus. The report documents our experiences and lessons learned developing the formal models and the associated proofs.
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. 15. SUBJECT TERMS Formal Methods, Software Verification , Model-Based...Domain specific languages (DSLs) drive both implementation and formal verification
Epiphany? A Case Study of Learner-Centredness in Educational Supervision
ERIC Educational Resources Information Center
Talbot, Martin
2009-01-01
Graduate medical trainees in the UK appreciate mentors who demonstrate learner-centredness as modelled by Rogers. This case study was undertaken to examine how, in one instance, learner-centred may be supervision within the tight confines of a formal, competency-based programme of training. Four formal interviews (in 18 months), were analysed to…
Properties of a Formal Method to Model Emergence in Swarm-Based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.
New method of contour image processing based on the formalism of spiral light beams
NASA Astrophysics Data System (ADS)
Volostnikov, Vladimir G.; Kishkin, S. A.; Kotova, S. P.
2013-07-01
The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented.
A Pilot Study on Modeling of Diagnostic Criteria Using OWL and SWRL.
Hong, Na; Jiang, Guoqian; Pathak, Jyotishiman; Chute, Christopher G
2015-01-01
The objective of this study is to describe our efforts in a pilot study on modeling diagnostic criteria using a Semantic Web-based approach. We reused the basic framework of the ICD-11 content model and refined it into an operational model in the Web Ontology Language (OWL). The refinement is based on a bottom-up analysis method, in which we analyzed data elements (including value sets) in a collection (n=20) of randomly selected diagnostic criteria. We also performed a case study to formalize rule logic in the diagnostic criteria of metabolic syndrome using the Semantic Web Rule Language (SWRL). The results demonstrated that it is feasible to use OWL and SWRL to formalize the diagnostic criteria knowledge, and to execute the rules through reasoning.
A general U-block model-based design procedure for nonlinear polynomial control systems
NASA Astrophysics Data System (ADS)
Zhu, Q. M.; Zhao, D. Y.; Zhang, Jianhua
2016-10-01
The proposition of U-model concept (in terms of 'providing concise and applicable solutions for complex problems') and a corresponding basic U-control design algorithm was originated in the first author's PhD thesis. The term of U-model appeared (not rigorously defined) for the first time in the first author's other journal paper, which established a framework for using linear polynomial control system design approaches to design nonlinear polynomial control systems (in brief, linear polynomial approaches → nonlinear polynomial plants). This paper represents the next milestone work - using linear state-space approaches to design nonlinear polynomial control systems (in brief, linear state-space approaches → nonlinear polynomial plants). The overall aim of the study is to establish a framework, defined as the U-block model, which provides a generic prototype for using linear state-space-based approaches to design the control systems with smooth nonlinear plants/processes described by polynomial models. For analysing the feasibility and effectiveness, sliding mode control design approach is selected as an exemplary case study. Numerical simulation studies provide a user-friendly step-by-step procedure for the readers/users with interest in their ad hoc applications. In formality, this is the first paper to present the U-model-oriented control system design in a formal way and to study the associated properties and theorems. The previous publications, in the main, have been algorithm-based studies and simulation demonstrations. In some sense, this paper can be treated as a landmark for the U-model-based research from intuitive/heuristic stage to rigour/formal/comprehensive studies.
Formal Specification of Information Systems Requirements.
ERIC Educational Resources Information Center
Kampfner, Roberto R.
1985-01-01
Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)
Smith, Rebekah E; Bayen, Ute J; Martin, Claudia
2010-01-01
Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory performance. The formal modeling results demonstrate that adults differed significantly from the 7-year-olds and the 10-year-olds on both the prospective component and the retrospective component of the task. The 7-year-olds and the 10-year-olds differed only in the ability to recognize prospective memory target events. The prospective memory task imposed a cost to ongoing activities in all 3 age groups. Copyright 2009 APA, all rights reserved.
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem determining that the model accurately captures the customer's high-level requirements has received little attention and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the users needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
Model Selection in Historical Research Using Approximate Bayesian Computation
Rubio-Campillo, Xavier
2016-01-01
Formal Models and History Computational models are increasingly being used to study historical dynamics. This new trend, which could be named Model-Based History, makes use of recently published datasets and innovative quantitative methods to improve our understanding of past societies based on their written sources. The extensive use of formal models allows historians to re-evaluate hypotheses formulated decades ago and still subject to debate due to the lack of an adequate quantitative framework. The initiative has the potential to transform the discipline if it solves the challenges posed by the study of historical dynamics. These difficulties are based on the complexities of modelling social interaction, and the methodological issues raised by the evaluation of formal models against data with low sample size, high variance and strong fragmentation. Case Study This work examines an alternate approach to this evaluation based on a Bayesian-inspired model selection method. The validity of the classical Lanchester’s laws of combat is examined against a dataset comprising over a thousand battles spanning 300 years. Four variations of the basic equations are discussed, including the three most common formulations (linear, squared, and logarithmic) and a new variant introducing fatigue. Approximate Bayesian Computation is then used to infer both parameter values and model selection via Bayes Factors. Impact Results indicate decisive evidence favouring the new fatigue model. The interpretation of both parameter estimations and model selection provides new insights into the factors guiding the evolution of warfare. At a methodological level, the case study shows how model selection methods can be used to guide historical research through the comparison between existing hypotheses and empirical evidence. PMID:26730953
2009-01-01
Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes
NASA Technical Reports Server (NTRS)
Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.
2000-01-01
Formal capture and analysis of the required behavior of control systems have many advantages. For instance, it encourages rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time consuming process and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms-one commercial mobile platform and one build in-house. We use our language RSML (-e) to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.
Evaluating a Control System Architecture Based on a Formally Derived AOCS Model
NASA Astrophysics Data System (ADS)
Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas
2010-08-01
Attitude & Orbit Control System (AOCS) refers to a wider class of control systems which are used to determine and control the attitude of the spacecraft while in orbit, based on the information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development - based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.
Verification of NASA Emergent Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise of 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm type missions make them powerful, but at the same time are more difficult to design and assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANT mission is being used as an example and case study for swarm-based missions for which to experiment and test current formal methods with intelligent swam. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.
Lindgren, Helena; Lundin-Olsson, Lillemor; Pohl, Petra; Sandlund, Marlene
2014-01-01
Five physiotherapists organised a user-centric design process of a knowledge-based support system for promoting exercise and preventing falls. The process integrated focus group studies with 17 older adults and prototyping. The transformation of informal medical and rehabilitation expertise and older adults' experiences into formal information and process models during the development was studied. As tool they used ACKTUS, a development platform for knowledge-based applications. The process became agile and incremental, partly due to the diversity of expectations and preferences among both older adults and physiotherapists, and the participatory approach to design and development. In addition, there was a need to develop the knowledge content alongside with the formal models and their presentations, which allowed the participants to test hands-on and evaluate the ideas, content and design. The resulting application is modular, extendable, flexible and adaptable to the individual end user. Moreover, the physiotherapists are able to modify the information and process models, and in this way further develop the application. The main constraint was found to be the lack of support for the initial phase of concept modelling, which lead to a redesigned user interface and functionality of ACKTUS.
Offshore safety case approach and formal safety assessment of ships.
Wang, J
2002-01-01
Tragic marine and offshore accidents have caused serious consequences including loss of lives, loss of property, and damage of the environment. A proactive, risk-based "goal setting" regime is introduced to the marine and offshore industries to increase the level of safety. To maximize marine and offshore safety, risks need to be modeled and safety-based decisions need to be made in a logical and confident way. Risk modeling and decision-making tools need to be developed and applied in a practical environment. This paper describes both the offshore safety case approach and formal safety assessment of ships in detail with particular reference to the design aspects. The current practices and the latest development in safety assessment in both the marine and offshore industries are described. The relationship between the offshore safety case approach and formal ship safety assessment is described and discussed. Three examples are used to demonstrate both the offshore safety case approach and formal ship safety assessment. The study of risk criteria in marine and offshore safety assessment is carried out. The recommendations on further work required are given. This paper gives safety engineers in the marine and offshore industries an overview of the offshore safety case approach and formal ship safety assessment. The significance of moving toward a risk-based "goal setting" regime is given.
Assessment of parametric uncertainty for groundwater reactive transport modeling,
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
Protege Career Aspirations: The Influence of Formal E-Mentor Networks and Family-Based Role Models
ERIC Educational Resources Information Center
DiRenzo, Marco S.; Weer, Christy H.; Linnehan, Frank
2013-01-01
Using longitudinal data from a nine-month e-mentoring program, we analyzed the influence of formal e-mentor networks and family-based role models on increases in both psychosocial and career-related outcomes. Findings indicate that e-mentor network relationship quality positively influenced general- and career-based self-efficacy which, in turn,…
Dependability modeling and assessment in UML-based software development.
Bernardi, Simona; Merseguer, José; Petriu, Dorina C
2012-01-01
Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.
Dependability Modeling and Assessment in UML-Based Software Development
Bernardi, Simona; Merseguer, José; Petriu, Dorina C.
2012-01-01
Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results. PMID:22988428
A Formal Valuation Framework for Emotions and Their Control.
Huys, Quentin J M; Renz, Daniel
2017-09-15
Computational psychiatry aims to apply mathematical and computational techniques to help improve psychiatric care. To achieve this, the phenomena under scrutiny should be within the scope of formal methods. As emotions play an important role across many psychiatric disorders, such computational methods must encompass emotions. Here, we consider formal valuation accounts of emotions. We focus on the fact that the flexibility of emotional responses and the nature of appraisals suggest the need for a model-based valuation framework for emotions. However, resource limitations make plain model-based valuation impossible and require metareasoning strategies to apportion cognitive resources adaptively. We argue that emotions may implement such metareasoning approximations by restricting the range of behaviors and states considered. We consider the processes that guide the deployment of the approximations, discerning between innate, model-free, heuristic, and model-based controllers. A formal valuation and metareasoning framework may thus provide a principled approach to examining emotions. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
LinkEHR-Ed: a multi-reference model archetype editor based on formal semantics.
Maldonado, José A; Moner, David; Boscá, Diego; Fernández-Breis, Jesualdo T; Angulo, Carlos; Robles, Montserrat
2009-08-01
To develop a powerful archetype editing framework capable of handling multiple reference models and oriented towards the semantic description and standardization of legacy data. The main prerequisite for implementing tools providing enhanced support for archetypes is the clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data. It covers the specialization of archetypes, the relationship between reference models and archetypes and conformance of data instances to archetypes. LinkEHR-Ed, a visual archetype editor based on the former formalization with advanced processing capabilities that supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts, is developed. LinkEHR-Ed is a useful tool for building, processing and validating archetypes based on any reference model.
Formal Analysis of Self-Efficacy in Job Interviewee’s Mental State Model
NASA Astrophysics Data System (ADS)
Ajoge, N. S.; Aziz, A. A.; Yusof, S. A. Mohd
2017-08-01
This paper presents a formal analysis approach for self-efficacy model of interviewee’s mental state during a job interview session. Self-efficacy is a construct that has been hypothesised to combine with motivation and interviewee anxiety to define state influence of interviewees. The conceptual model was built based on psychological theories and models related to self-efficacy. A number of well-known relations between events and the course of self-efficacy are summarized from the literature and it is shown that the proposed model exhibits those patterns. In addition, this formal model has been mathematically analysed to find out which stable situations exist. Finally, it is pointed out how this model can be used in a software agent or robot-based platform. Such platform can provide an interview coaching approach where support to the user is provided based on their individual metal state during interview sessions.
Guideline validation in multiple trauma care through business process modeling.
Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen
2003-07-01
Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery a specific guideline is available paper-based as a set of flowcharts. This format is appropriate for the use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version represented with a standardized meta-model is necessary for automatic processing. In our project we transferred the paper-based into an electronic format and analyzed the structure with respect to formal errors. Several errors were detected in seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process modeling tools, which check the content in comparison to a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure sustainability of guideline development a representation independent of specific applications or specific provider is necessary. Then, clinical guidelines could be used for eLearning, process optimization and workflow management additionally.
A Model-based Approach to Reactive Self-Configuring Systems
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Nayak, P. Pandurang
1996-01-01
This paper describes Livingstone, an implemented kernel for a self-reconfiguring autonomous system, that is reactive and uses component-based declarative models. The paper presents a formal characterization of the representation formalism used in Livingstone, and reports on our experience with the implementation in a variety of domains. Livingstone's representation formalism achieves broad coverage of hybrid software/hardware systems by coupling the concurrent transition system models underlying concurrent reactive languages with the discrete qualitative representations developed in model-based reasoning. We achieve a reactive system that performs significant deductions in the sense/response loop by drawing on our past experience at building fast prepositional conflict-based algorithms for model-based diagnosis, and by framing a model-based configuration manager as a prepositional, conflict-based feedback controller that generates focused, optimal responses. Livingstone automates all these tasks using a single model and a single core deductive engine, thus making significant progress towards achieving a central goal of model-based reasoning. Livingstone, together with the HSTS planning and scheduling engine and the RAPS executive, has been selected as the core autonomy architecture for Deep Space One, the first spacecraft for NASA's New Millennium program.
Study of the 190Hg Nucleus: Testing the Existence of U(5) Symmetry
NASA Astrophysics Data System (ADS)
Jahangiri Tazekand, Z.; Mohseni, M.; Mohammadi, M. A.; Sabri, H.
2018-06-01
In this paper, we have considered the energy spectra, quadrupole transition probabilities, energy surface, charge radii, and quadrupole moment of the190Hg nucleus to describe the interplay between phase transitions and configuration mixing of intruder excitations. To this aim, we have used four different formalisms: (i) interacting boson model including configuration mixing, (ii) Z(5) critical symmetry, (iii) U(6)-based transitional Hamiltonian, and (iv) a transitional interacting boson model Hamiltonian in both interacting boson model (IBM)-1 and IBM-2 versions which are based on affine \\widehat{SU(1,1)} Lie algebra. Results show the advantages of configuration mixing and transitional Hamiltonians, in particular IBM-2 formalism, to reproduce the experimental counterparts when the weight of spherical symmetry increased.
Matching biomedical ontologies based on formal concept analysis.
Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei
2018-03-19
The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much as ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign demonstrates the effectiveness of FCA-Map and its competitiveness with the top-ranked systems. FCA-Map can achieve a better balance between precision and recall for large-scale domain ontologies through constructing multiple FCA structures, whereas it performs unsatisfactorily for smaller-sized ontologies with less lexical and semantic expressions. Compared with other FCA-based OM systems, the study in this paper is more comprehensive as an attempt to push the envelope of the Formal Concept Analysis formalism in ontology matching tasks. Five types of formal contexts are constructed incrementally, and their derived concept lattices are used to cluster the commonalities among classes at lexical and structural level, respectively. Experiments on large, real-world domain ontologies show promising results and reveal the power of FCA.
Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment
NASA Astrophysics Data System (ADS)
Zeigler, Bernard P.; Lee, J. S.
1998-08-01
In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user- friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
A Tool for Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John
2005-01-01
Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., costs to gear up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always have some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer s requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, will be briefly described in this paper, and a prototype tool to support it will be described. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for NASA mission is presented.
A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture
NASA Technical Reports Server (NTRS)
Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.
2005-01-01
Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
NASA Astrophysics Data System (ADS)
Vassiliev, Oleg N.; Grosshans, David R.; Mohan, Radhe
2017-10-01
We propose a new formalism for calculating parameters α and β of the linear-quadratic model of cell survival. This formalism, primarily intended for calculating relative biological effectiveness (RBE) for treatment planning in hadron therapy, is based on a recently proposed microdosimetric revision of the single-target multi-hit model. The main advantage of our formalism is that it reliably produces α and β that have correct general properties with respect to their dependence on physical properties of the beam, including the asymptotic behavior for very low and high linear energy transfer (LET) beams. For example, in the case of monoenergetic beams, our formalism predicts that, as a function of LET, (a) α has a maximum and (b) the α/β ratio increases monotonically with increasing LET. No prior models reviewed in this study predict both properties (a) and (b) correctly, and therefore, these prior models are valid only within a limited LET range. We first present our formalism in a general form, for polyenergetic beams. A significant new result in this general case is that parameter β is represented as an average over the joint distribution of energies E 1 and E 2 of two particles in the beam. This result is consistent with the role of the quadratic term in the linear-quadratic model. It accounts for the two-track mechanism of cell kill, in which two particles, one after another, damage the same site in the cell nucleus. We then present simplified versions of the formalism, and discuss predicted properties of α and β. Finally, to demonstrate consistency of our formalism with experimental data, we apply it to fit two sets of experimental data: (1) α for heavy ions, covering a broad range of LETs, and (2) β for protons. In both cases, good agreement is achieved.
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three cases studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Software Validation via Model Animation
NASA Technical Reports Server (NTRS)
Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.
2015-01-01
This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
Formal implementation of a performance evaluation model for the face recognition system.
Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young
2008-01-01
Due to usability features, practical applications, and its lack of intrusiveness, face recognition technology, based on information, derived from individuals' facial features, has been attracting considerable attention recently. Reported recognition rates of commercialized face recognition systems cannot be admitted as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for the biometric recognition system, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.
A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes
NASA Astrophysics Data System (ADS)
Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria
In this paper we discuss the importance of ensuring that business processes are label robust and agile at the same time robust and agile. To this end, we consider reviewing the way business processes are managed. For instance we consider offering a flexible way to model processes so that changes in regulations are handled through some self-healing mechanisms. These changes may raise exceptions at run-time if not properly reflected on these processes. To this end we propose a new rule based model that adopts the ECA rules and is built upon formal tools. The business logic of a process can be summarized with a set of rules that implement an organization’s policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Post condition- post Event). This formalism allows translating a process into a graph of rules that is analyzed in terms of reliably and flexibility.
Model-based approach to study the impact of biofuels on the sustainability of an ecological system
The importance and complexity of sustainability have been well recognized and a formal study of sustainability based on system theory approaches is imperative as many of the relationships between various components of the ecosystem could be nonlinear, intertwined and non-intuitiv...
Model based approach to Study the Impact of Biofuels on the Sustainability of an Ecological System
The importance and complexity of sustainability has been well recognized and a formal study of sustainability based on system theory approaches is imperative as many of the relationships between various components of the ecosystem could be nonlinear, intertwined and non intuitive...
A robot sets a table: a case for hybrid reasoning with different types of knowledge
NASA Astrophysics Data System (ADS)
Mansouri, Masoumeh; Pecora, Federico
2016-09-01
An important contribution of AI to Robotics is the model-centred approach, whereby competent robot behaviour stems from automated reasoning in models of the world which can be changed to suit different environments, physical capabilities and tasks. However models need to capture diverse (and often application-dependent) aspects of the robot's environment and capabilities. They must also have good computational properties, as robots need to reason while they act in response to perceived context. In this article, we investigate the use of a meta-CSP-based technique to interleave reasoning in diverse knowledge types. We reify the approach through a robotic waiter case study, for which a particular selection of spatial, temporal, resource and action KR formalisms is made. Using this case study, we discuss general principles pertaining to the selection of appropriate KR formalisms and jointly reasoning about them. The resulting integration is evaluated both formally and experimentally on real and simulated robotic platforms.
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Improved formalism for precision Higgs coupling fits
NASA Astrophysics Data System (ADS)
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; Karl, Robert; List, Jenny; Ogawa, Tomohisa; Peskin, Michael E.; Tian, Junping
2018-03-01
Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. We apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for R.W using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of RW specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)
NASA Technical Reports Server (NTRS)
Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)
2004-01-01
These preceedings contain 18 papers and 4 poster presentation, covering topics such as: multi-agent systems, agent-based control, formalism, norms, as well as physical and biological models of agent-based systems. Some applications presented in the proceedings include systems analysis, software engineering, computer networks and robot control.
Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence for which to apply the formal methods. This paper will give the evaluation of these formal methods and give partial specifications of the ANTS mission using four selected methods. We then give an evaluation of the methods and the needed properties of a formal method for effective specification and prediction of emergent behavior in swarm-based systems.
F-Nets and Software Cabling: Deriving a Formal Model and Language for Portable Parallel Programming
NASA Technical Reports Server (NTRS)
DiNucci, David C.; Saini, Subhash (Technical Monitor)
1998-01-01
Parallel programming is still being based upon antiquated sequence-based definitions of the terms "algorithm" and "computation", resulting in programs which are architecture dependent and difficult to design and analyze. By focusing on obstacles inherent in existing practice, a more portable model is derived here, which is then formalized into a model called Soviets which utilizes a combination of imperative and functional styles. This formalization suggests more general notions of algorithm and computation, as well as insights into the meaning of structured programming in a parallel setting. To illustrate how these principles can be applied, a very-high-level graphical architecture-independent parallel language, called Software Cabling, is described, with many of the features normally expected from today's computer languages (e.g. data abstraction, data parallelism, and object-based programming constructs).
Formal Methods for Automated Diagnosis of Autosub 6000
NASA Technical Reports Server (NTRS)
Ernits, Juhan; Dearden, Richard; Pebody, Miles
2009-01-01
This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
Logic-Based Models for the Analysis of Cell Signaling Networks†
2010-01-01
Computational models are increasingly used to analyze the operation of complex biochemical networks, including those involved in cell signaling networks. Here we review recent advances in applying logic-based modeling to mammalian cell biology. Logic-based models represent biomolecular networks in a simple and intuitive manner without describing the detailed biochemistry of each interaction. A brief description of several logic-based modeling methods is followed by six case studies that demonstrate biological questions recently addressed using logic-based models and point to potential advances in model formalisms and training procedures that promise to enhance the utility of logic-based methods for studying the relationship between environmental inputs and phenotypic or signaling state outputs of complex signaling networks. PMID:20225868
Formalization of an environmental model using formal concept analysis - FCA
NASA Astrophysics Data System (ADS)
Bourdon-García, Rubén D.; Burgos-Salcedo, Javier D.
2016-08-01
Nowadays, there is a huge necessity to generate novel strategies for social-ecological systems analyses for resolving global sustainability problems. This paper has as main purpose the application of the formal concept analysis to formalize the theory of Augusto Ángel Maya, who without a doubt, was one of the most important environmental philosophers in South America; Ángel Maya proposed and established that Ecosystem-Culture relations, instead Human-Nature ones, are determinants in our understanding and management of natural resources. Based on this, a concept lattice, formal concepts, subconcept-superconcept relations, partially ordered sets, supremum and infimum of the lattice and implications between attributes (Duquenne-Guigues base), were determined for the ecosystem-culture relations.
Symbolic discrete event system specification
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.; Chi, Sungdo
1992-01-01
Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
Improved formalism for precision Higgs coupling fits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon
Future e +e – colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e +e – data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e +e – colliders.
Improved formalism for precision Higgs coupling fits
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; ...
2018-03-20
Future e +e – colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e +e – data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e +e – colliders.
Leading the Teacher Team--Balancing between Formal and Informal Power in Program Leadership
ERIC Educational Resources Information Center
Högfeldt, Anna-Karin; Malmi, Lauri; Kinnunen, Päivi; Jerbrant, Anna; Strömberg, Emma; Berglund, Anders; Villadsen, Jørgen
2018-01-01
This continuous research within Nordic engineering institutions targets the contexts and possibilities for leadership among engineering education program directors. The IFP-model, developed based on analysis of interviews with program leaders in these institutions, visualizes the program director's informal and formal power. The model is presented…
Formal Semantics and Implementation of BPMN 2.0 Inclusive Gateways
NASA Astrophysics Data System (ADS)
Christiansen, David Raymond; Carbone, Marco; Hildebrandt, Thomas
We present the first direct formalization of the semantics of inclusive gateways as described in the Business Process Modeling Notation (BPMN) 2.0 Beta 1 specification. The formal semantics is given for a minimal subset of BPMN 2.0 containing just the inclusive and exclusive gateways and the start and stop events. By focusing on this subset we achieve a simple graph model that highlights the particular non-local features of the inclusive gateway semantics. We sketch two ways of implementing the semantics using algorithms based on incrementally updated data structures and also discuss distributed communication-based implementations of the two algorithms.
Machine Learning-based Intelligent Formal Reasoning and Proving System
NASA Astrophysics Data System (ADS)
Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia
2018-03-01
The reasoning system can be used in many fields. How to improve reasoning efficiency is the core of the design of system. Through the formal description of formal proof and the regular matching algorithm, after introducing the machine learning algorithm, the system of intelligent formal reasoning and verification has high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse the propositional logical reasoning results, so as to obtain the implicit knowledge in the knowledge base and provide the basic reasoning model for the construction of intelligent system.
Stopping power of dense plasmas: The collisional method and limitations of the dielectric formalism.
Clauser, C F; Arista, N R
2018-02-01
We present a study of the stopping power of plasmas using two main approaches: the collisional (scattering theory) and the dielectric formalisms. In the former case, we use a semiclassical method based on quantum scattering theory. In the latter case, we use the full description given by the extension of the Lindhard dielectric function for plasmas of all degeneracies. We compare these two theories and show that the dielectric formalism has limitations when it is used for slow heavy ions or atoms in dense plasmas. We present a study of these limitations and show the regimes where the dielectric formalism can be used, with appropriate corrections to include the usual quantum and classical limits. On the other hand, the semiclassical method shows the correct behavior for all plasma conditions and projectile velocity and charge. We consider different models for the ion charge distributions, including bare and dressed ions as well as neutral atoms.
Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.
2006-01-01
NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nan0 Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make it a potentially powerful concept, but at the same time more difficult to design and ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.
C code generation from Petri-net-based logic controller specification
NASA Astrophysics Data System (ADS)
Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei
2017-08-01
The article focuses on programming of logic controllers. It is important that a programming code of a logic controller is executed flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Formal Techniques for Synchronized Fault-Tolerant Systems
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Butler, Ricky W.
1992-01-01
We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors clocks.
Goel, Purva; Bapat, Sanket; Vyas, Renu; Tambe, Amruta; Tambe, Sanjeev S
2015-11-13
The development of quantitative structure-retention relationships (QSRR) aims at constructing an appropriate linear/nonlinear model for the prediction of the retention behavior (such as Kovats retention index) of a solute on a chromatographic column. Commonly, multi-linear regression and artificial neural networks are used in the QSRR development in the gas chromatography (GC). In this study, an artificial intelligence based data-driven modeling formalism, namely genetic programming (GP), has been introduced for the development of quantitative structure based models predicting Kovats retention indices (KRI). The novelty of the GP formalism is that given an example dataset, it searches and optimizes both the form (structure) and the parameters of an appropriate linear/nonlinear data-fitting model. Thus, it is not necessary to pre-specify the form of the data-fitting model in the GP-based modeling. These models are also less complex, simple to understand, and easy to deploy. The effectiveness of GP in constructing QSRRs has been demonstrated by developing models predicting KRIs of light hydrocarbons (case study-I) and adamantane derivatives (case study-II). In each case study, two-, three- and four-descriptor models have been developed using the KRI data available in the literature. The results of these studies clearly indicate that the GP-based models possess an excellent KRI prediction accuracy and generalization capability. Specifically, the best performing four-descriptor models in both the case studies have yielded high (>0.9) values of the coefficient of determination (R(2)) and low values of root mean squared error (RMSE) and mean absolute percent error (MAPE) for training, test and validation set data. The characteristic feature of this study is that it introduces a practical and an effective GP-based method for developing QSRRs in gas chromatography that can be gainfully utilized for developing other types of data-driven models in chromatography science. Copyright © 2015 Elsevier B.V. All rights reserved.
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast
Pang, Wei; Coghill, George M.
2015-01-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. PMID:25864377
Towards Formal Implementation of PUS Standard
NASA Astrophysics Data System (ADS)
Ilić, D.
2009-05-01
As an effort to promote the reuse of on-board and ground systems ESA developed a standard for packet telemetry and telecommand - PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions then can choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete application. Our formal models allow us to formally express and verify specific service properties including various telecommand and telemetry packet structure validation.
Elayavilli, Ravikumar Komandur; Liu, Hongfang
2016-01-01
Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source for quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling as it involves huge manual effort. While automatically extracting such quantitative information from bio-medical text may offer some relief, lack of ontological representation for a subdomain serves as impedance in normalizing textual extractions to a standard representation. This may render textual extractions less meaningful to the domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining to a formal representation that may help in constructing ontology for ion channel events using a rule based approach. We have developed Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies such as, Cell Physiology Ontology (CPO), and Cardiac Electro Physiology Ontology (CPEO) and the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting the quantitative data assertions system on an independently annotated blind data set. We further made an initial attempt in formalizing the quantitative data assertions extracted from the biomedical text into a formal representation that offers potential to facilitate the integration of text mining into ontological workflow, a novel aspect of this study. This work is a case study where we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
Applying Evidence-Based Medicine in Telehealth: An Interactive Pattern Recognition Approximation
Fernández-Llatas, Carlos; Meneu, Teresa; Traver, Vicente; Benedi, José-Miguel
2013-01-01
Born in the early nineteen nineties, evidence-based medicine (EBM) is a paradigm intended to promote the integration of biomedical evidence into the physicians daily practice. This paradigm requires the continuous study of diseases to provide the best scientific knowledge for supporting physicians in their diagnosis and treatments in a close way. Within this paradigm, usually, health experts create and publish clinical guidelines, which provide holistic guidance for the care for a certain disease. The creation of these clinical guidelines requires hard iterative processes in which each iteration supposes scientific progress in the knowledge of the disease. To perform this guidance through telehealth, the use of formal clinical guidelines will allow the building of care processes that can be interpreted and executed directly by computers. In addition, the formalization of clinical guidelines allows for the possibility to build automatic methods, using pattern recognition techniques, to estimate the proper models, as well as the mathematical models for optimizing the iterative cycle for the continuous improvement of the guidelines. However, to ensure the efficiency of the system, it is necessary to build a probabilistic model of the problem. In this paper, an interactive pattern recognition approach to support professionals in evidence-based medicine is formalized. PMID:24185841
ERIC Educational Resources Information Center
Jarosz, Gaja
2010-01-01
This study examines the interacting roles of implicational markedness and frequency from the joint perspectives of formal linguistic theory, phonological acquisition and computational modeling. The hypothesis that child grammars are rankings of universal constraints, as in Optimality Theory (Prince & Smolensky, 1993/2004), that learning involves a…
Application of Lightweight Formal Methods to Software Security
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt
2005-01-01
Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach which species security properties in a library that can be reused by 2 instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL) are described herein The Flexible Modeling Framework (FMF) is a model based verijkation instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.
Kulczycki, Emanuel; Rozkosz, Ewa A
2017-01-01
This article discusses the Polish Journal Ranking, which is used in the research evaluation system in Poland. In 2015, the ranking, which represents all disciplines, allocated 17,437 journals into three lists: A, B, and C. The B list constitutes a ranking of Polish journals that are indexed neither in the Web of Science nor the European Reference Index for the Humanities. This ranking was built by evaluating journals in three dimensions: formal, bibliometric, and expert-based. We have analysed data on 2035 Polish journals from the B list. Our study aims to determine how an expert-based evaluation influenced the results of final evaluation. In our study, we used structural equation modelling, which is regression based, and we designed three pairs of theoretical models for three fields of science: (1) humanities, (2) social sciences, and (3) engineering, natural sciences, and medical sciences. Each pair consisted of the full model and the reduced model (i.e., the model without the expert-based evaluation). Our analysis revealed that the multidimensional evaluation of local journals should not rely only on the bibliometric indicators, which are based on the Web of Science or Scopus. Moreover, we have shown that the expert-based evaluation plays a major role in all fields of science. We conclude with recommendations that the formal evaluation should be reduced to verifiable parameters and that the expert-based evaluation should be based on common guidelines for the experts.
Promoting Post-Formal Thinking in a U.S. History Survey Course: A Problem-Based Approach
ERIC Educational Resources Information Center
Wynn, Charles T.; Mosholder, Richard S.; Larsen, Carolee A.
2016-01-01
This article presents a problem-based learning (PBL) model for teaching a college U.S. history survey course (U.S. history since 1890) designed to promote postformal thinking skills and identify and explain thinking systems inherent in adult complex problem-solving. We also present the results of a study in which the outcomes of the PBL model were…
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides lack of methodical support for natural language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embeded Systems CTM/ES which applies a hardware verification language to define a verification environment.
Developing a Common Metadata Model for Competencies Description
ERIC Educational Resources Information Center
Sampson, Demetrios; Karampiperis, Pythagoras; Fytros, Demetrios
2007-01-01
Competence-based approaches are frequently adopted as the key paradigm in both formal or non-formal education and training. To support the provision of competence-based learning services, it is necessary to be able to maintain a record of an individual's competences in a persistent and standard way. In this paper, we investigate potential issues…
A High-Level Language for Modeling Algorithms and Their Properties
NASA Astrophysics Data System (ADS)
Akhtar, Sabina; Merz, Stephan; Quinson, Martin
Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically-oriented formalisms such as state transition systems. This conceptual gap contributes to hinder the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code, but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA + and verified using the model checker tlc.
A Methodology for Formal Hardware Verification, with Application to Microprocessors.
1993-08-29
concurrent programming lan- guages. Proceedings of the NATO Advanced Study Institute on Logics and Models of Concurrent Systems ( Colle - sur - Loup , France, 8-19...restricted class of formu- las . Bose and Fisher [26] developed a symbolic model checker based on a Cosmos switch-level model. Their modeling approach...verification using SDVS-the method and a case study. 17th Anuual Microprogramming Workshop (New Orleans, LA , 30 October-2 November 1984). Published as
Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified
NASA Technical Reports Server (NTRS)
Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.
2005-01-01
Automated planning systems (APS) are gaining acceptance for use on NASA missions as evidenced by APS flown On missions such as Orbiter and Deep Space 1 both of which were commanded by onboard planning systems. The planning system takes high level goals and expands them onboard into a detailed of action fiat the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN and express the properties of desirable plans formally.
An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi
NASA Astrophysics Data System (ADS)
Deng, D.-P.; Lemmens, R.
2011-08-01
The Web is changing the way people share and communicate information because of emergence of various Web technologies, which enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to the different production methods, UGGC often cannot fit in geographic information model. There is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. This ontology-based process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study implements this process on Twitter messages, which is relevant to Japan Earthquake disaster. By using this process, we extract disaster relief information from Twitter messages, and develop a knowledge base for GeoSPARQL queries in disaster relief information.
An Analysis of the Formal Features of "Reality-Based" Television Programs.
ERIC Educational Resources Information Center
Neapolitan, D. M.
Reality-based television programs showcase actual footage or recreate actual events, and include programs such as "America's Most Wanted" and "Rescue 911." To identify the features that typify reality-based television programs, this study conducted an analysis of formal features used in reality-based programs. Formal features…
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for high level of confidence and operational integrity in critical space (software) systems is well recognized in the Space industry and has been addressed so far through rigorous System and Software Development Processes and stringent Verification and Validation regimes. The Model Based Space System Engineering process (MBSSE) derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model based engineering technologies to support the space system and software development processes, from mission level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project where we aim at developing methodological, theoretical and technological support for a systematic approach to the space avionics system development, in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with the support for a Software Reference Architecture.
SS-HORSE method for studying resonances
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blokhintsev, L. D.; Mazur, A. I.; Mazur, I. A., E-mail: 008043@pnu.edu.ru
A new method for analyzing resonance states based on the Harmonic-Oscillator Representation of Scattering Equations (HORSE) formalism and analytic properties of partial-wave scattering amplitudes is proposed. The method is tested by applying it to the model problem of neutral-particle scattering and can be used to study resonance states on the basis of microscopic calculations performed within various versions of the shell model.
The VATES-Diamond as a Verifier's Best Friend
NASA Astrophysics Data System (ADS)
Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz
Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.
Pang, Wei; Coghill, George M
2015-05-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
1996-06-01
for Software Synthesis." KBSE . IEEE, 1993. 51. Kang, Kyo C., et al. Feature-Oriented Domain Analysis ( FODA ) Feasibility Study. Technical Report...and usefulness in domain analysis and modeling. Rumbaugh uses three distinct views to describe a domain: (1) the object model describes structural...Gibbons describe a methodology where Structured Analysis is used to build a hierarchical system structure chart. This structure chart is then translated
Dissipative transport in superlattices within the Wigner function formalism
Jonasson, O.; Knezevic, I.
2015-07-30
Here, we employ the Wigner function formalism to simulate partially coherent, dissipative electron transport in biased semiconductor superlattices. We introduce a model collision integral with terms that describe energy dissipation, momentum relaxation, and the decay of spatial coherences (localization). Based on a particle-based solution to the Wigner transport equation with the model collision integral, we simulate quantum electronic transport at 10 K in a GaAs/AlGaAs superlattice and accurately reproduce its current density vs field characteristics obtained in experiment.
Using VCL as an Aspect-Oriented Approach to Requirements Modelling
NASA Astrophysics Data System (ADS)
Amálio, Nuno; Kelsen, Pierre; Ma, Qin; Glodt, Christian
Software systems are becoming larger and more complex. By tackling the modularisation of crosscutting concerns, aspect orientation draws attention to modularity as a means to address the problems of scalability, complexity and evolution in software systems development. Aspect-oriented modelling (AOM) applies aspect-orientation to the construction of models. Most existing AOM approaches are designed without a formal semantics, and use multi-view partial descriptions of behaviour. This paper presents an AOM approach based on the Visual Contract Language (VCL): a visual language for abstract and precise modelling, designed with a formal semantics, and comprising a novel approach to visual behavioural modelling based on design by contract where behavioural descriptions are total. By applying VCL to a large case study of a car-crash crisis management system, the paper demonstrates how modularity of VCL's constructs, at different levels of granularity, help to tackle complexity. In particular, it shows how VCL's package construct and its associated composition mechanisms are key in supporting separation of concerns, coarse-grained problem decomposition and aspect-orientation. The case study's modelling solution has a clear and well-defined modular structure; the backbone of this structure is a collection of packages encapsulating local solutions to concerns.
Interpersonal Harmony and Conflict for Chinese People: A Yin-Yang Perspective.
Huang, Li-Li
2016-01-01
This article provides an overview on a series of original studies conducted by the author. The aim here is to present the ideas that the author reconstructed, based on the dialectics of harmonization, regarding harmony and conflict embodied in traditional Chinese thought, and to describe how a formal psychological theory/model on interpersonal harmony and conflict was developed based on the Yin-Yang perspective. The paper also details how essential theories on interpersonal harmony and conflict were constructed under this formal model by conducting a qualitative study involving in-depth interviews with 30 adults. Psychological research in Western society has, intriguingly, long been focused more on interpersonal conflict than on interpersonal harmony. By contrast, the author's work started from the viewpoint of a materialist conception of history and dialectics of harmonization in order to reinterpret traditional Chinese thought. Next, a "dynamic model of interpersonal harmony and conflict" was developed, as a formal psychological theory, based on the real-virtual notions in the Yin-Yang perspective. Under this model, interpersonal harmony and conflict can be classified into genuine versus superficial harmony and authentic versus virtual focus conflict, and implicit/hidden conflict is regarded as superficial harmony. Subsequently, the author conducted a series of quantitative studies on interpersonal harmony and conflict within parent-child, supervisor-subordinate, and friend-friend relationships in order to verify the construct validity and the predictive validity of the dynamic model of interpersonal harmony and conflict. The claim presented herein is that Chinese traditional thought and the psychological theory/model based on the Yin-Yang perspective can be combined. Accordingly, by combining qualitative and quantitative empirical research, the relative substantial theory can be developed and the concepts can be validated. Thus, this work represents the realization of a series of modern Chinese indigenous psychological research studies rooted in traditional cultural thought and the Yin-Yang perspective. The work also mirrors the current conflict-management research that has incorporated the Chinese notion of harmony and adopted the Yin-Yang perspective on culture.
Interpersonal Harmony and Conflict for Chinese People: A Yin–Yang Perspective
Huang, Li-Li
2016-01-01
This article provides an overview on a series of original studies conducted by the author. The aim here is to present the ideas that the author reconstructed, based on the dialectics of harmonization, regarding harmony and conflict embodied in traditional Chinese thought, and to describe how a formal psychological theory/model on interpersonal harmony and conflict was developed based on the Yin–Yang perspective. The paper also details how essential theories on interpersonal harmony and conflict were constructed under this formal model by conducting a qualitative study involving in-depth interviews with 30 adults. Psychological research in Western society has, intriguingly, long been focused more on interpersonal conflict than on interpersonal harmony. By contrast, the author’s work started from the viewpoint of a materialist conception of history and dialectics of harmonization in order to reinterpret traditional Chinese thought. Next, a “dynamic model of interpersonal harmony and conflict” was developed, as a formal psychological theory, based on the real-virtual notions in the Yin–Yang perspective. Under this model, interpersonal harmony and conflict can be classified into genuine versus superficial harmony and authentic versus virtual focus conflict, and implicit/hidden conflict is regarded as superficial harmony. Subsequently, the author conducted a series of quantitative studies on interpersonal harmony and conflict within parent–child, supervisor–subordinate, and friend–friend relationships in order to verify the construct validity and the predictive validity of the dynamic model of interpersonal harmony and conflict. The claim presented herein is that Chinese traditional thought and the psychological theory/model based on the Yin–Yang perspective can be combined. Accordingly, by combining qualitative and quantitative empirical research, the relative substantial theory can be developed and the concepts can be validated. Thus, this work represents the realization of a series of modern Chinese indigenous psychological research studies rooted in traditional cultural thought and the Yin–Yang perspective. The work also mirrors the current conflict-management research that has incorporated the Chinese notion of harmony and adopted the Yin–Yang perspective on culture. PMID:27375526
A Model for Developing and Assessing Youth-Based Environmental Engagement Programmes
ERIC Educational Resources Information Center
Riemer, Manuel; Lynes, Jennifer; Hickman, Gina
2014-01-01
In this paper, we argue that a fundamental cultural shift is needed to effectively address anthropogenic causes of climate change. Evidence suggests that youth are well positioned to create such transformation. While various studies have contributed empirical evidence to numerous youth-based non-formal environmental engagement programmes, what is…
Using Petri Net Tools to Study Properties and Dynamics of Biological Systems
Peleg, Mor; Rubin, Daniel; Altman, Russ B.
2005-01-01
Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
Weinkam, Patrick; Romesberg, Floyd E.; Wolynes, Peter G.
2010-01-01
A grand canonical formalism is developed to combine discrete simulations for chemically distinct species in equilibrium. Each simulation is based on a perturbed funneled landscape. The formalism is illustrated using the alkaline-induced transitions of cytochrome c as observed by FTIR spectroscopy and with various other experimental approaches. The grand canonical simulation method accounts for the acid/base chemistry of deprotonation, the inorganic chemistry of heme ligation and misligation, and the minimally frustrated folding energy landscape, thus elucidating the physics of protein folding involved with an acid/base titration of a protein. The formalism combines simulations for each of the relevant chemical species, varying by protonation and ligation states. In contrast to models based on perfectly funneled energy landscapes that contain only contacts found in the native structure, the current study introduces “chemical frustration” from deprotonation and misligation that gives rise to many intermediates at alkaline pH. While the nature of these intermediates cannot be easily inferred from available experimental data, the current study provides specific structural details of these intermediates thus extending our understanding of how cytochrome c changes with increasing pH. The results demonstrate the importance of chemical frustration for understanding biomolecular energy landscapes. PMID:19199810
Using Data-Driven Model-Brain Mappings to Constrain Formal Models of Cognition
Borst, Jelmer P.; Nijboer, Menno; Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John R.
2015-01-01
In this paper we propose a method to create data-driven mappings from components of cognitive models to brain regions. Cognitive models are notoriously hard to evaluate, especially based on behavioral measures alone. Neuroimaging data can provide additional constraints, but this requires a mapping from model components to brain regions. Although such mappings can be based on the experience of the modeler or on a reading of the literature, a formal method is preferred to prevent researcher-based biases. In this paper we used model-based fMRI analysis to create a data-driven model-brain mapping for five modules of the ACT-R cognitive architecture. We then validated this mapping by applying it to two new datasets with associated models. The new mapping was at least as powerful as an existing mapping that was based on the literature, and indicated where the models were supported by the data and where they have to be improved. We conclude that data-driven model-brain mappings can provide strong constraints on cognitive models, and that model-based fMRI is a suitable way to create such mappings. PMID:25747601
ERIC Educational Resources Information Center
Santally, Mohammad Issack; Cooshna-Naik, Dorothy; Conruyt, Noel; Wing, Caroline Koa
2015-01-01
This paper describes a social partnership model based on the living lab concept to promote the professional development of educators through formal and informal capacity-building initiatives. The aim is to have a broader impact on society through community outreach educational initiatives. A Living Lab is an environment for user-centered…
EFL Teachers' Formal Assessment Practices Based on Exam Papers
ERIC Educational Resources Information Center
Kiliçkaya, Ferit
2016-01-01
This study reports initial findings from a small-scale qualitative study aimed at gaining insights into English language teachers' assessment practices in Turkey by examining the formal exam papers. Based on the technique of content analysis, formal exam papers were analyzed in terms of assessment items, language skills tested as well as the…
Discrete-Slots Models of Visual Working-Memory Response Times
Donkin, Christopher; Nosofsky, Robert M.; Gold, Jason M.; Shiffrin, Richard M.
2014-01-01
Much recent research has aimed to establish whether visual working memory (WM) is better characterized by a limited number of discrete all-or-none slots or by a continuous sharing of memory resources. To date, however, researchers have not considered the response-time (RT) predictions of discrete-slots versus shared-resources models. To complement the past research in this field, we formalize a family of mixed-state, discrete-slots models for explaining choice and RTs in tasks of visual WM change detection. In the tasks under investigation, a small set of visual items is presented, followed by a test item in 1 of the studied positions for which a change judgment must be made. According to the models, if the studied item in that position is retained in 1 of the discrete slots, then a memory-based evidence-accumulation process determines the choice and the RT; if the studied item in that position is missing, then a guessing-based accumulation process operates. Observed RT distributions are therefore theorized to arise as probabilistic mixtures of the memory-based and guessing distributions. We formalize an analogous set of continuous shared-resources models. The model classes are tested on individual subjects with both qualitative contrasts and quantitative fits to RT-distribution data. The discrete-slots models provide much better qualitative and quantitative accounts of the RT and choice data than do the shared-resources models, although there is some evidence for “slots plus resources” when memory set size is very small. PMID:24015956
A rule-based approach to model checking of UML state machines
NASA Astrophysics Data System (ADS)
Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz
2016-12-01
In the paper a new approach to formal verification of control process specification expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use the abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker), but also for logical synthesis in form of rapid prototyping. Hence, a prototype implementation in hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in form of UML state machines. Presented approach allows to increase the assurance that implemented system meets the user-defined requirements.
Formal Methods for Verification and Validation of Partial Specifications: A Case Study
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
Chen, Chia-Ching; Yamada, Tetsuji; Nakashima, Taeko; Chiu, I-Ming
2017-01-01
The purposes of this study are: (1) to empirically identify decision-making preferences of long-term health-care use, especially informal and formal home care (FHC) service use; (2) to evaluate outcomes vs. costs based on substitutability of informal and FHC service use; and (3) to investigate health outcome disparity based on substitutability. The methods of ordinary least squares, a logit model, and a bivariate probit model are used by controlling for socioeconomic, demographic, and physical/mental health factors to investigate outcomes and costs based substitutability of informal and formal health-care use. The data come from the 2013 Japanese Study of Aging and Retirement (JSTAR), which is designed by Keizai-Sangyo Kenkyu-jo, Hitotsubashi University, and the University of Tokyo. The JSTAR is a globally comparable data survey of the elderly. There exists a complement relationship between the informal home care (IHC) and community-based FHC services, and the elasticity's ranges from 0.18 to 0.22. These are reasonable results, which show that unobservable factors are positively related to IHC and community-based FHC, but negatively related to nursing home (NH) services based on our bivariate probit model. Regarding health-care outcome efficiency issue, the IHC is the best one among three types of elderly care: IHC, community-based FHC, and NH services. Health improvement/outcome of elderly with the IHC is heavier concentrated on IHC services than the elderly care services by community-based FHC and NH care services. Policy makers need to address a diversity of health outcomes and efficiency of services based on providing services to elderly through resource allocation to the different types of long-term care. A provision of partial or full compensation for elderly care at home is recommendable and a viable option to improve their quality of lives.
Ontology for assessment studies of human-computer-interaction in surgery.
Machno, Andrej; Jannin, Pierre; Dameron, Olivier; Korb, Werner; Scheuermann, Gerik; Meixensberger, Jürgen
2015-02-01
New technologies improve modern medicine, but may result in unwanted consequences. Some occur due to inadequate human-computer-interactions (HCI). To assess these consequences, an investigation model was developed to facilitate the planning, implementation and documentation of studies for HCI in surgery. The investigation model was formalized in Unified Modeling Language and implemented as an ontology. Four different top-level ontologies were compared: Object-Centered High-level Reference, Basic Formal Ontology, General Formal Ontology (GFO) and Descriptive Ontology for Linguistic and Cognitive Engineering, according to the three major requirements of the investigation model: the domain-specific view, the experimental scenario and the representation of fundamental relations. Furthermore, this article emphasizes the distinction of "information model" and "model of meaning" and shows the advantages of implementing the model in an ontology rather than in a database. The results of the comparison show that GFO fits the defined requirements adequately: the domain-specific view and the fundamental relations can be implemented directly, only the representation of the experimental scenario requires minor extensions. The other candidates require wide-ranging extensions, concerning at least one of the major implementation requirements. Therefore, the GFO was selected to realize an appropriate implementation of the developed investigation model. The ensuing development considered the concrete implementation of further model aspects and entities: sub-domains, space and time, processes, properties, relations and functions. The investigation model and its ontological implementation provide a modular guideline for study planning, implementation and documentation within the area of HCI research in surgery. This guideline helps to navigate through the whole study process in the form of a kind of standard or good clinical practice, based on the involved foundational frameworks. Furthermore, it allows to acquire the structured description of the applied assessment methods within a certain surgical domain and to consider this information for own study design or to perform a comparison of different studies. The investigation model and the corresponding ontology can be used further to create new knowledge bases of HCI assessment in surgery. Copyright © 2014 Elsevier B.V. All rights reserved.
Formal modeling of a system of chemical reactions under uncertainty.
Ghosh, Krishnendu; Schlipf, John
2014-10-01
We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.
Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism
NASA Astrophysics Data System (ADS)
Parish, Eric; Duraisamy, Karthk
2017-11-01
The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closures model within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z-formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project ''LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
On the Formal Verification of Conflict Detection Algorithms
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Butler, Ricky W.; Carreno, Victor A.; Dowek, Gilles
2001-01-01
Safety assessment of new air traffic management systems is a main issue for civil aviation authorities. Standard techniques such as testing and simulation have serious limitations in new systems that are significantly more autonomous than the older ones. In this paper, we present an innovative approach, based on formal verification, for establishing the correctness of conflict detection systems. Fundamental to our approach is the concept of trajectory, which is a continuous path in the x-y plane constrained by physical laws and operational requirements. From the Model of trajectories, we extract, and formally prove, high level properties that can serve as a framework to analyze conflict scenarios. We use the Airborne Information for Lateral Spacing (AILS) alerting algorithm as a case study of our approach.
ERIC Educational Resources Information Center
Brown, Ralph B.
Effective rural education depends on active community involvement. This ethnographic case study examines three models of community organization as an explanation of how community action occurs. The three models are: (1) individuals interacting in formal and informal groups; (2) networks of "weak ties" effective for diffusing information and…
Applications of a formal approach to decipher discrete genetic networks.
Corblin, Fabien; Fanchon, Eric; Trilling, Laurent
2010-07-20
A growing demand for tools to assist the building and analysis of biological networks exists in systems biology. We argue that the use of a formal approach is relevant and applicable to address questions raised by biologists about such networks. The behaviour of these systems being complex, it is essential to exploit efficiently every bit of experimental information. In our approach, both the evolution rules and the partial knowledge about the structure and the behaviour of the network are formalized using a common constraint-based language. In this article our formal and declarative approach is applied to three biological applications. The software environment that we developed allows to specifically address each application through a new class of biologically relevant queries. We show that we can describe easily and in a formal manner the partial knowledge about a genetic network. Moreover we show that this environment, based on a constraint algorithmic approach, offers a wide variety of functionalities, going beyond simple simulations, such as proof of consistency, model revision, prediction of properties, search for minimal models relatively to specified criteria. The formal approach proposed here deeply changes the way to proceed in the exploration of genetic and biochemical networks, first by avoiding the usual trial-and-error procedure, and second by placing the emphasis on sets of solutions, rather than a single solution arbitrarily chosen among many others. Last, the constraint approach promotes an integration of model and experimental data in a single framework.
Formal Assurance for Cognitive Architecture Based Autonomous Agent
NASA Technical Reports Server (NTRS)
Bhattacharyya, Siddhartha; Eskridge, Thomas; Neogi, Natasha; Carvalho, Marco
2017-01-01
Autonomous systems are designed and deployed in different modeling paradigms. These environments focus on specific concepts in designing the system. We focus our effort in the use of cognitive architectures to design autonomous agents to collaborate with humans to accomplish tasks in a mission. Our research focuses on introducing formal assurance methods to verify the behavior of agents designed in Soar, by translating the agent to the formal verification environment Uppaal.
FORMAL MODELING, MONITORING, AND CONTROL OF EMERGENCE IN DISTRIBUTED CYBER PHYSICAL SYSTEMS
2018-02-23
FORMAL MODELING, MONITORING, AND CONTROL OF EMERGENCE IN DISTRIBUTED CYBER- PHYSICAL SYSTEMS UNIVERSITY OF TEXAS AT ARLINGTON FEBRUARY 2018 FINAL...COVERED (From - To) APR 2015 – APR 2017 4. TITLE AND SUBTITLE FORMAL MODELING, MONITORING, AND CONTROL OF EMERGENCE IN DISTRIBUTED CYBER- PHYSICAL ...dated 16 Jan 09 13. SUPPLEMENTARY NOTES 14. ABSTRACT This project studied emergent behavior in distributed cyber- physical systems (DCPS). Emergent
Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Roberts, Larry W.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.
Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko
2015-10-30
Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.
A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems
Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko
2015-01-01
Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982
NASA Astrophysics Data System (ADS)
Branco, Nilton; Oliveira, Tharnier; Silveira, Jaylson
2012-02-01
The goal of this work is to study rural-urban migration in the early stages of industrialization. We use an agent-based model and take into account the existence of informal and formal workers on the urban sector and possible migration movements, dependent on the agents' social and private utilities. Our agents are place on vertices of a square lattice, such that each vertex has only one agent. Rural, urban informal and urban formal workers are represented by different states of a three-state Ising model. At every step, a fraction a of the agents may change sectors or migrate. The total utility of a given agent is then calculated and compared to a random utility, in order to check if this agent turns into an actual migrant or changes sector. The dynamics is carried out until an equilibrium state is reached and equilibrium variables are then calculated and compared to available data. We find that a generalized Harris-Todaro condition is satisfied [1] on these equilibrium regimes, i.e, the ratio between expected wages between any pair of sectors reach a constant value. [4pt] [1] J. J. Silveira, A. L. Esp'indola and T. J. Penna, Physica A, 364, 445 (2006).
Skwarchuk, Sheri-Lynn; Sowinski, Carla; LeFevre, Jo-Anne
2014-05-01
The purpose of this study was to propose and test a model of children's home numeracy experience based on Sénéchal and LeFevre's home literacy model (Child Development, 73 (2002) 445-460). Parents of 183 children starting kindergarten in the fall (median child age=58 months) completed an early home learning experiences questionnaire. Most of the children whose parents completed the questionnaire were recruited for numeracy and literacy testing 1 year later (along with 32 children from the inner city). Confirmatory factor analyses were used to reduce survey items, and hierarchical regression analyses were used to predict the relation among parents' attitudes, academic expectations for their children, reports of formal and informal numeracy, and literacy home practices on children's test scores. Parental reports of formal home numeracy practices (e.g., practicing simple sums) predicted children's symbolic number system knowledge, whereas reports of informal exposure to games with numerical content (measured indirectly through parents' knowledge of children's games) predicted children's non-symbolic arithmetic, as did numeracy attitudes (e.g., parents' enjoyment of numeracy). The home literacy results replicated past findings; parental reports of formal literacy practices (e.g., helping their children to read words) predicted children's word reading, whereas reports of informal experiences (i.e., frequency of shared reading measured indirectly through parents' storybook knowledge) predicted children's vocabulary. These findings support a multifaceted model of children's early numeracy environment, with different types of early home experiences (formal and informal) predicting different numeracy outcomes. Copyright © 2013 Elsevier Inc. All rights reserved.
Suka, Machi; Yamauchi, Takashi; Sugimori, Hiroki
2015-01-01
Objective Encouraging help-seeking for mental illness is essential for prevention of suicide. This study examined the relationship between individual characteristics, neighbourhood contexts and help-seeking intentions for mental illness for the purpose of elucidating the role of neighbourhood in the help-seeking process. Design, setting and participants A cross-sectional web-based survey was conducted among Japanese adults aged 20–59 years in June 2014. Eligible respondents who did not have a serious health condition were included in this study (n=3308). Main outcome measures Participants were asked how likely they would be to seek help from someone close to them (informal help) and medical professionals (formal help), respectively, if they were suffering from serious mental illness. Path analysis with structural equation modelling was performed to represent plausible connections between individual characteristics, neighbourhood contexts, and informal and formal help-seeking intentions. Results The acceptable fitting model indicated that those who had a tendency to consult about everyday affairs were significantly more likely to express an informal help-seeking intention that was directly associated with a formal help-seeking intention. Those living in a communicative neighbourhood, where neighbours say hello whenever they pass each other, were significantly more likely to express informal and formal help-seeking intentions. Those living in a supportive neighbourhood, where neighbours work together to solve neighbourhood problems, were significantly more likely to express an informal help-seeking intention. Adequate health literacy was directly associated with informal and formal help-seeking intentions, along with having an indirect effect on the formal help-seeking intention through developed positive perception of professional help. Conclusions The results of this study bear out the hypothesis that neighbourhood context contributes to help-seeking intentions for mental illness. Living in a neighbourhood with a communicative atmosphere and having adequate health literacy were acknowledged as possible facilitating factors for informal and formal help-seeking for mental illness. PMID:26264273
State Event Models for the Formal Analysis of Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles
2014-01-01
The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "fullcontrol" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.
Formal Requirements-Based Programming for Complex Systems
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis
2005-01-01
Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.
Le Bihan, Blanche
2012-05-01
This article investigates the impact of policy measures on the organisation of home-based care for older people in France, by examining the balance between formal and informal care and the redefinition of the initial familialist model. It focuses on the specific cash for care scheme (the Allocation personnalisée d'autonomie - Personalised allowance for autonomy) which is at the core of the French home-based care policy. The author argues that in a redefined context of 'welfare mix', the French public strategy for supporting home-based care in France is articulated around two major objectives, which can appear contradictory. It aims to formalise a professional care sector, with respect to the employment policy while allowing the development of new forms of informal care, which cannot be considered to be formal employment. The data collection is two-fold. Firstly, a detailed analysis was made of different policy documents and public reports, together with a systematic review of existing studies. Secondly, statistical analysis on home-based care resources were collected, which was not easy, as home-care services for older people in France are part of a larger sector of activity, 'personal services' (services à la personne). The article exposes three main findings. First, it highlights the complexity of the formalisation process related to the introduction of the French care allowance and demonstrates that formalisation, which facilitates the recognition of care as work, does not necessarily mean professionalisation. Second, it outlines the diversity of the resources available: heterogeneous professional care, semi-formal forms of care work with the possibility to employ a relative and informal family care. Finally, the analysis outlines the importance of the regulation of cash payments on the reshaping of formal and informal care and comments on its impact on the redefinition of informal caring activities. © 2012 Blackwell Publishing Ltd.
Development of structured ICD-10 and its application to computer-assisted ICD coding.
Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko
2010-01-01
This paper presents: (1) a framework of formal representation of ICD10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology to use formally described ICD10 for computer-assisted ICD coding. First, we analyzed and structurized the meanings of categories in 15 chapters of ICD10. Then we expanded the structured ICD10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model to describe formal representation was refined repeatedly. The resultant model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for formal description framework in the ICD11 revision.
Bio-Inspired Genetic Algorithms with Formalized Crossover Operators for Robotic Applications.
Zhang, Jie; Kang, Man; Li, Xiaojuan; Liu, Geng-Yang
2017-01-01
Genetic algorithms are widely adopted to solve optimization problems in robotic applications. In such safety-critical systems, it is vitally important to formally prove the correctness when genetic algorithms are applied. This paper focuses on formal modeling of crossover operations that are one of most important operations in genetic algorithms. Specially, we for the first time formalize crossover operations with higher-order logic based on HOL4 that is easy to be deployed with its user-friendly programing environment. With correctness-guaranteed formalized crossover operations, we can safely apply them in robotic applications. We implement our technique to solve a path planning problem using a genetic algorithm with our formalized crossover operations, and the results show the effectiveness of our technique.
Lim, Sunghoon; Tucker, Conrad S; Kumara, Soundar
2017-02-01
The authors of this work propose an unsupervised machine learning model that has the ability to identify real-world latent infectious diseases by mining social media data. In this study, a latent infectious disease is defined as a communicable disease that has not yet been formalized by national public health institutes and explicitly communicated to the general public. Most existing approaches to modeling infectious-disease-related knowledge discovery through social media networks are top-down approaches that are based on already known information, such as the names of diseases and their symptoms. In existing top-down approaches, necessary but unknown information, such as disease names and symptoms, is mostly unidentified in social media data until national public health institutes have formalized that disease. Most of the formalizing processes for latent infectious diseases are time consuming. Therefore, this study presents a bottom-up approach for latent infectious disease discovery in a given location without prior information, such as disease names and related symptoms. Social media messages with user and temporal information are extracted during the data preprocessing stage. An unsupervised sentiment analysis model is then presented. Users' expressions about symptoms, body parts, and pain locations are also identified from social media data. Then, symptom weighting vectors for each individual and time period are created, based on their sentiment and social media expressions. Finally, latent-infectious-disease-related information is retrieved from individuals' symptom weighting vectors. Twitter data from August 2012 to May 2013 are used to validate this study. Real electronic medical records for 104 individuals, who were diagnosed with influenza in the same period, are used to serve as ground truth validation. The results are promising, with the highest precision, recall, and F 1 score values of 0.773, 0.680, and 0.724, respectively. This work uses individuals' social media messages to identify latent infectious diseases, without prior information, quicker than when the disease(s) is formalized by national public health institutes. In particular, the unsupervised machine learning model using user, textual, and temporal information in social media data, along with sentiment analysis, identifies latent infectious diseases in a given location. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
Towards improving phenotype representation in OWL
2012-01-01
Background Phenotype ontologies are used in species-specific databases for the annotation of mutagenesis experiments and to characterize human diseases. The Entity-Quality (EQ) formalism is a means to describe complex phenotypes based on one or more affected entities and a quality. EQ-based definitions have been developed for many phenotype ontologies, including the Human and Mammalian Phenotype ontologies. Methods We analyze formalizations of complex phenotype descriptions in the Web Ontology Language (OWL) that are based on the EQ model, identify several representational challenges and analyze potential solutions to address these challenges. Results In particular, we suggest a novel, role-based approach to represent relational qualities such as concentration of iron in spleen, discuss its ontological foundation in the General Formal Ontology (GFO) and evaluate its representation in OWL and the benefits it can bring to the representation of phenotype annotations. Conclusion Our analysis of OWL-based representations of phenotypes can contribute to improving consistency and expressiveness of formal phenotype descriptions. PMID:23046625
Consumer Search, Rationing Rules, and the Consequence for Competition
NASA Astrophysics Data System (ADS)
Ruebeck, Christopher S.
Firms' conjectures about demand are consequential in oligopoly games. Through agent-based modeling of consumers' search for products, we can study the rationing of demand between capacity-constrained firms offering homogeneous products and explore the robustness of analytically solvable models' results. After algorithmically formalizing short-run search behavior rather than assuming a long-run average, this study predicts stronger competition in a two-stage capacity-price game.
Modeling Of Object- And Scene-Prototypes With Hierarchically Structured Classes
NASA Astrophysics Data System (ADS)
Ren, Z.; Jensch, P.; Ameling, W.
1989-03-01
The success of knowledge-based image analysis methodology and implementation tools depends largely on an appropriately and efficiently built model wherein the domain-specific context information about and the inherent structure of the observed image scene have been encoded. For identifying an object in an application environment a computer vision system needs to know firstly the description of the object to be found in an image or in an image sequence, secondly the corresponding relationships between object descriptions within the image sequence. This paper presents models of image objects scenes by means of hierarchically structured classes. Using the topovisual formalism of graph and higraph, we are currently studying principally the relational aspect and data abstraction of the modeling in order to visualize the structural nature resident in image objects and scenes, and to formalize. their descriptions. The goal is to expose the structure of image scene and the correspondence of image objects in the low level image interpretation. process. The object-based system design approach has been applied to build the model base. We utilize the object-oriented programming language C + + for designing, testing and implementing the abstracted entity classes and the operation structures which have been modeled topovisually. The reference images used for modeling prototypes of objects and scenes are from industrial environments as'well as medical applications.
NASA Astrophysics Data System (ADS)
Suárez, Abril; Chavanis, Pierre-Henri
2015-07-01
Using a generalization of the Madelung transformation, we derive the hydrodynamic representation of the Klein-Gordon-Einstein equations in the weak field limit. We consider a complex self-interacting scalar field with a λ |φ |4 potential. We study the evolution of the spatially homogeneous background in the fluid representation and derive the linearized equations describing the evolution of small perturbations in a static and in an expanding Universe. We compare the results with simplified models in which the gravitational potential is introduced by hand in the Klein-Gordon equation, and assumed to satisfy a (generalized) Poisson equation. Nonrelativistic hydrodynamic equations based on the Schrödinger-Poisson equations or on the Gross-Pitaevskii-Poisson equations are recovered in the limit c →+∞. We study the evolution of the perturbations in the matter era using the nonrelativistic limit of our formalism. Perturbations whose wavelength is below the Jeans length oscillate in time while perturbations whose wavelength is above the Jeans length grow linearly with the scale factor as in the cold dark matter model. The growth of perturbations in the scalar field model is substantially faster than in the cold dark matter model. When the wavelength of the perturbations approaches the cosmological horizon (Hubble length), a relativistic treatment is mandatory. In that case, we find that relativistic effects attenuate or even prevent the growth of perturbations. This paper exposes the general formalism and provides illustrations in simple cases. Other applications of our formalism will be considered in companion papers.
Software Safety Analysis of a Flight Guidance System
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
Planned Organizational Change; A Study in Change Dynamics.
ERIC Educational Resources Information Center
Jones, Garth N.
This study attempts to develop a broad model or concept, based largely on empirical evidence, which applies social science knowledge and methodology to the planning of change in corporations, armies, schools, hospitals, government, community groups, and other formal and informal organizations. Chapters 2,3, and 4 define and discuss the roles of…
Formal analysis of temporal dynamics in anxiety states and traits for virtual patients
NASA Astrophysics Data System (ADS)
Aziz, Azizi Ab; Ahmad, Faudziah; Yusof, Nooraini; Ahmad, Farzana Kabir; Yusof, Shahrul Azmi Mohd
2016-08-01
This paper presents a temporal dynamic model of anxiety states and traits for an individual. Anxiety is a natural part of life, and most of us experience it from time to time. But for some people, anxiety can be extreme. Based on several personal characteristics, traits, and a representation of events (i.e. psychological and physiological stressors), the formal model can represent whether a human that experience certain scenarios will fall into an anxiety states condition. A number of well-known relations between events and the course of anxiety are summarized from the literature and it is shown that the model exhibits those patterns. In addition, the formal model has been mathematically analyzed to find out which stable situations exist. Finally, it is pointed out how this model can be used in therapy, supported by a software agent.
Farnese, M L; Barbieri, B; Bellò, B; Bartone, P T
2017-01-01
Within a Job Demands-Resources Model framework, formal mentoring can be conceived as a job resource expressing the organization's support for new members, which may prevent their being at risk for burnout. This research aims at understanding the protective role of formal mentoring on burnout, through the effect of increasing learning personal resources. Specifically, we hypothesized that formal mentoring enhances newcomers' learning about job and social domains related to the new work context, thus leading to lower burnout. In order to test the hypotheses, a multiple regression analysis using the bootstrapping method was used. Based on a questionnaire administered to 117 correctional officer newcomers who had a formal mentor assigned, our results confirm that formal mentoring exerts a positive influence on newcomers' adjustment, and that this in turn exerts a protective influence against burnout onset by reducing cynicism and interpersonal stress and also enhancing the sense of personal accomplishment. Confirming previous literature's suggestions, supportive mentoring and effective socialization seem to represent job and personal resources that are protective against burnout. This study provides empirical support for this relation in the prison context.
ERIC Educational Resources Information Center
Grabinska, Teresa; Zielinska, Dorota
2010-01-01
The authors examine language from the perspective of models of empirical sciences, which discipline studies the relationship between reality, models, and formalisms. Such a perspective allows one to notice that linguistics approached within the classical framework share a number of problems with other experimental sciences studied initially…
Rapid Frequency Chirps of TAE mode due to Finite Orbit Energetic Particles
NASA Astrophysics Data System (ADS)
Berk, Herb; Wang, Ge
2013-10-01
The tip model for the TAE mode in the large aspect ratio limit, conceived by Rosenbluth et al. in the frequency domain, together with an interaction term in the frequency domain based on a map model, has been extended into the time domain. We present the formal basis for the model, starting with the Lagrangian for the particle wave interaction. We shall discuss the formal nonlinear time domain problem and the procedure that needs to obtain solutions in the adiabatic limit.
Cheema, Jitender Jit Singh; Sankpal, Narendra V; Tambe, Sanjeev S; Kulkarni, Bhaskar D
2002-01-01
This article presents two hybrid strategies for the modeling and optimization of the glucose to gluconic acid batch bioprocess. In the hybrid approaches, first a novel artificial intelligence formalism, namely, genetic programming (GP), is used to develop a process model solely from the historic process input-output data. In the next step, the input space of the GP-based model, representing process operating conditions, is optimized using two stochastic optimization (SO) formalisms, viz., genetic algorithms (GAs) and simultaneous perturbation stochastic approximation (SPSA). These SO formalisms possess certain unique advantages over the commonly used gradient-based optimization techniques. The principal advantage of the GP-GA and GP-SPSA hybrid techniques is that process modeling and optimization can be performed exclusively from the process input-output data without invoking the detailed knowledge of the process phenomenology. The GP-GA and GP-SPSA techniques have been employed for modeling and optimization of the glucose to gluconic acid bioprocess, and the optimized process operating conditions obtained thereby have been compared with those obtained using two other hybrid modeling-optimization paradigms integrating artificial neural networks (ANNs) and GA/SPSA formalisms. Finally, the overall optimized operating conditions given by the GP-GA method, when verified experimentally resulted in a significant improvement in the gluconic acid yield. The hybrid strategies presented here are generic in nature and can be employed for modeling and optimization of a wide variety of batch and continuous bioprocesses.
Understanding Patchy Landscape Dynamics: Towards a Landscape Language
Gaucherel, Cédric; Boudon, Frédéric; Houet, Thomas; Castets, Mathieu; Godin, Christophe
2012-01-01
Patchy landscapes driven by human decisions and/or natural forces are still a challenge to be understood and modelled. No attempt has been made up to now to describe them by a coherent framework and to formalize landscape changing rules. Overcoming this lacuna was our first objective here, and this was largely based on the notion of Rewriting Systems, also called Formal Grammars. We used complicated scenarios of agricultural dynamics to model landscapes and to write their corresponding driving rule equations. Our second objective was to illustrate the relevance of this landscape language concept for landscape modelling through various grassland managements, with the final aim to assess their respective impacts on biological conservation. For this purpose, we made the assumptions that a higher grassland appearance frequency and higher land cover connectivity are favourable to species conservation. Ecological results revealed that dairy and beef livestock production systems are more favourable to wild species than is hog farming, although in different ways. Methodological results allowed us to efficiently model and formalize these landscape dynamics. This study demonstrates the applicability of the Rewriting System framework to the modelling of agricultural landscapes and, hopefully, to other patchy landscapes. The newly defined grammar is able to explain changes that are neither necessarily local nor Markovian, and opens a way to analytical modelling of landscape dynamics. PMID:23049935
Bayesian Framework for Water Quality Model Uncertainty Estimation and Risk Management
A formal Bayesian methodology is presented for integrated model calibration and risk-based water quality management using Bayesian Monte Carlo simulation and maximum likelihood estimation (BMCML). The primary focus is on lucid integration of model calibration with risk-based wat...
NASA Astrophysics Data System (ADS)
Nomura, Takuji
2017-10-01
We study two-magnon excitations in resonant inelastic x-ray scattering (RIXS) at the transition-metal K edge. Instead of working with effective Heisenberg spin models, we work with a Hubbard-type model (d -p model) for a typical insulating cuprate La2CuO4 . For the antiferromagnetic ground state within the spin density wave (SDW) mean-field formalism, we calculate the dynamical correlation function within the random-phase approximation (RPA), and then obtain two-magnon excitation spectra by calculating the convolution of it. Coupling between the K -shell hole and the magnons in the intermediate state is calculated by means of diagrammatic perturbation expansion in the Coulomb interaction. The calculated momentum dependence of RIXS spectra agrees well with that of experiments. A notable difference from previous calculations based on the Heisenberg spin models is that RIXS spectra have a large two-magnon weight near the zone center, which may be confirmed by further careful high-resolution experiments.
Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James
2018-01-01
Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, no more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powerset by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE's understanding of how to formalize the classification pattern. The various proposals are assessed using the classical example of classification; the Linnaean taxonomy formalized using powersets as a benchmark for formal expressiveness. The broad conclusion of the survey is that (1) the ISE community is currently in the early stages of the process of understanding how to formalize the classification pattern, particularly in the requirements for expressiveness exemplified by powersets, and (2) that there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place that the classification pattern has in domain modeling, this intervention has the potential to lead to significant improvements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Conti, C.; Barbero, C.; Galeão, A. P.
In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of {sub Λ}{sup 5}He, {sub Λ}{sup 12}C and {sub Λ}{sup 13}C using a formalism based on the independent particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.
Ding, Hansheng; Wang, Changying; Xie, Chunyan; Yang, Yitong; Jin, Chunlin
2017-01-01
The need for formal care among the elderly population has been increasing due to their greater longevity and the evolution of family structure. We examined the determinants of the use and expenses of formal care among in-home elderly adults in Shanghai. A two-part model based on the data from the Shanghai Long-Term Care Needs Assessment Questionnaire was applied. A total of 8428 participants responded in 2014 and 7100 were followed up in 2015. The determinants of the probability of using formal care were analyzed in the first part of the model and the determinants of formal care expenses were analyzed in the second part. Demographic indicators, living arrangements, physical health status, and care type in 2014 were selected as independent variables. We found that individuals of older age; women; those with higher Activities of Daily Living (ADL) scores; those without spouse; those with higher income; those suffering from stroke, dementia, lower limb fracture, or advanced tumor; and those with previous experience of formal and informal care were more likely to receive formal care in 2015. Furthermore, age, income and formal care fee in 2014 were significant predictors of formal care expenses in 2015. Taken together, the results showed that formal care provision in Shanghai was not determined by ADL scores, but was instead more related to income. This implied an inappropriate distribution of formal care among elderly population in Shanghai. Additionally, it appeared difficult for the elderly to quit the formal care once they begun to use it. These results highlighted the importance of assessing the need for formal care, and suggested that the government offer guidance on formal care use for the elderly. PMID:28448628
Making a Virtue out of a Necessity: Part Time Work as a Site for Undergraduate Work-Based Learning
ERIC Educational Resources Information Center
Shaw, Sue; Ogilvie, Chrissy
2010-01-01
Purpose: This paper seeks to challenge the view that student part time employment detracts from academic attainment and presents evidence that when linked to formal undergraduate study provides rich learning experiences. It also explores the extent to which formerly accepted pre-requisites for work based learning (WBL) apply in this model and how…
NASA Astrophysics Data System (ADS)
Geslin, Pierre-Antoine; Gatti, Riccardo; Devincre, Benoit; Rodney, David
2017-11-01
We propose a framework to study thermally-activated processes in dislocation glide. This approach is based on an implementation of the nudged elastic band method in a nodal mesoscale dislocation dynamics formalism. Special care is paid to develop a variational formulation to ensure convergence to well-defined minimum energy paths. We also propose a methodology to rigorously parametrize the model on atomistic data, including elastic, core and stacking fault contributions. To assess the validity of the model, we investigate the homogeneous nucleation of partial dislocation loops in aluminum, recovering the activation energies and loop shapes obtained with atomistic calculations and extending these calculations to lower applied stresses. The present method is also applied to heterogeneous nucleation on spherical inclusions.
Problem-based learning: effects on student’s scientific reasoning skills in science
NASA Astrophysics Data System (ADS)
Wulandari, F. E.; Shofiyah, N.
2018-04-01
This research aimed to develop instructional package of problem-based learning to enhance student’s scientific reasoning from concrete to formal reasoning skills level. The instructional package was developed using the Dick and Carey Model. Subject of this study was instructional package of problem-based learning which was consisting of lesson plan, handout, student’s worksheet, and scientific reasoning test. The instructional package was tried out on 4th semester science education students of Universitas Muhammadiyah Sidoarjo by using the one-group pre-test post-test design. The data of scientific reasoning skills was collected by making use of the test. The findings showed that the developed instructional package reflecting problem-based learning was feasible to be implemented in classroom. Furthermore, through applying the problem-based learning, students could dominate formal scientific reasoning skills in terms of functionality and proportional reasoning, control variables, and theoretical reasoning.
Trust models in ubiquitous computing.
Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro
2008-10-28
We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.
Reciprocal relations between cognitive neuroscience and formal cognitive models: opposites attract?
Forstmann, Birte U; Wagenmakers, Eric-Jan; Eichele, Tom; Brown, Scott; Serences, John T
2011-06-01
Cognitive neuroscientists study how the brain implements particular cognitive processes such as perception, learning, and decision-making. Traditional approaches in which experiments are designed to target a specific cognitive process have been supplemented by two recent innovations. First, formal cognitive models can decompose observed behavioral data into multiple latent cognitive processes, allowing brain measurements to be associated with a particular cognitive process more precisely and more confidently. Second, cognitive neuroscience can provide additional data to inform the development of formal cognitive models, providing greater constraint than behavioral data alone. We argue that these fields are mutually dependent; not only can models guide neuroscientific endeavors, but understanding neural mechanisms can provide key insights into formal models of cognition. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Smith, Rebekah E.; Bayen, Ute J.; Martin, Claudia
2010-01-01
Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the…
Alves, Rui; Vilaprinyo, Ester; Hernádez-Bermejo, Benito; Sorribas, Albert
2008-01-01
There is a renewed interest in obtaining a systemic understanding of metabolism, gene expression and signal transduction processes, driven by the recent research focus on Systems Biology. From a biotechnological point of view, such a systemic understanding of how a biological system is designed to work can facilitate the rational manipulation of specific pathways in different cell types to achieve specific goals. Due to the intrinsic complexity of biological systems, mathematical models are a central tool for understanding and predicting the integrative behavior of those systems. Particularly, models are essential for a rational development of biotechnological applications and in understanding system's design from an evolutionary point of view. Mathematical models can be obtained using many different strategies. In each case, their utility will depend upon the properties of the mathematical representation and on the possibility of obtaining meaningful parameters from available data. In practice, there are several issues at stake when one has to decide which mathematical model is more appropriate for the study of a given problem. First, one needs a model that can represent the aspects of the system one wishes to study. Second, one must choose a mathematical representation that allows an accurate analysis of the system with respect to different aspects of interest (for example, robustness of the system, dynamical behavior, optimization of the system with respect to some production goal, parameter value determination, etc). Third, before choosing between alternative and equally appropriate mathematical representations for the system, one should compare representations with respect to easiness of automation for model set-up, simulation, and analysis of results. Fourth, one should also consider how to facilitate model transference and re-usability by other researchers and for distinct purposes. Finally, one factor that is important for all four aspects is the regularity in the mathematical structure of the equations because it facilitates computational manipulation. This regularity is a mark of kinetic representations based on approximation theory. The use of approximation theory to derive mathematical representations with regular structure for modeling purposes has a long tradition in science. In most applied fields, such as engineering and physics, those approximations are often required to obtain practical solutions to complex problems. In this paper we review some of the more popular mathematical representations that have been derived using approximation theory and are used for modeling in molecular systems biology. We will focus on formalisms that are theoretically supported by the Taylor Theorem. These include the Power-law formalism, the recently proposed (log)linear and Lin-log formalisms as well as some closely related alternatives. We will analyze the similarities and differences between these formalisms, discuss the advantages and limitations of each representation, and provide a tentative "road map" for their potential utilization for different problems.
2017-09-01
AVAILABILITY STATEMENT Approved for public release. Distribution is unlimited. 12b. DISTRIBUTION CODE 13. ABSTRACT (maximum 200 words) Test and...ambiguities and identify high -value decision points? This thesis explores how formalization of these experience-based decisions as a process model...representing a T&E event may reveal high -value decision nodes where certain decisions carry more weight or potential for impacts to a successful test. The
The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1995-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis help identifies some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed system present the worst case implications for today s dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirement-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.
Sedwards, Sean; Mazza, Tommaso
2007-10-15
Compartments and membranes are the basis of cell topology and more than 30% of the human genome codes for membrane proteins. While it is possible to represent compartments and membrane proteins in a nominal way with many mathematical formalisms used in systems biology, few, if any, explicitly model the topology of the membranes themselves. Discrete stochastic simulation potentially offers the most accurate representation of cell dynamics. Since the details of every molecular interaction in a pathway are often not known, the relationship between chemical species in not necessarily best described at the lowest level, i.e. by mass action. Simulation is a form of computer-aided analysis, relying on human interpretation to derive meaning. To improve efficiency and gain meaning in an automatic way, it is necessary to have a formalism based on a model which has decidable properties. We present Cyto-Sim, a stochastic simulator of membrane-enclosed hierarchies of biochemical processes, where the membranes comprise an inner, outer and integral layer. The underlying model is based on formal language theory and has been shown to have decidable properties (Cavaliere and Sedwards, 2006), allowing formal analysis in addition to simulation. The simulator provides variable levels of abstraction via arbitrary chemical kinetics which link to ordinary differential equations. In addition to its compact native syntax, Cyto-Sim currently supports models described as Petri nets, can import all versions of SBML and can export SBML and MATLAB m-files. Cyto-Sim is available free, either as an applet or a stand-alone Java program via the web page (http://www.cosbi.eu/Rpty_Soft_CytoSim.php). Other versions can be made available upon request.
Exploring the sustainability of industrial production and energy generation with a model system
The importance and complexity of sustainability has been well recognized and a formal study of sustainability based on system theory approaches is imperative as many of the relationships between the various components of the system could be non-linear, intertwined, and non-intuit...
Pilot interaction with automated airborne decision making systems
NASA Technical Reports Server (NTRS)
Rouse, W. B.
1981-01-01
The role of the pilot and crew for future aircraft is discussed. Fifteen formal experimental studies and the development of a variety of models of human behavior based on queueing history, pattern recognition methods, control theory, fuzzy set theory, and artificial intelligence concepts are presented. L.F.M.
Music Education in the Republic of Indonesia: A Model of Cultural Pluralism.
ERIC Educational Resources Information Center
Goolsby, Thomas W.
Based on formal and informal observations of classrooms at all levels of government and private Islamic schools, as well as formal and informal interviews with students, teachers, administrators, and parents, this paper was the result of experiences as an Fulbright-Hays lecturer in Indonesia. The Indonesian Department of Education and Culture…
Indicators of Informal and Formal Decision-Making about a Socioscientific Issue
ERIC Educational Resources Information Center
Dauer, Jenny M.; Lute, Michelle L.; Straka, Olivia
2017-01-01
We propose two contrasting types of student decision-making based on social and cognitive psychology models of separate mental processes for problem solving. Informal decision-making uses intuitive reasoning and is subject to cognitive biases, whereas formal decision-making uses effortful, logical reasoning. We explored indicators of students'…
Developing Formal Correctness Properties from Natural Language Requirements
NASA Technical Reports Server (NTRS)
Nikora, Allen P.
2006-01-01
This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation.Specifically, automate generation of Linear Temporal Logic (LTL)correctness properties from natural language temporal specifications. There are several reasons for this approach (1) Model-based techniques becoming more widely accepted, (2) Analytical verification techniques (e.g., model checking, theorem proving) significantly more effective at detecting types of specification design errors (e.g., race conditions, deadlock) than manual inspection, (3) Many requirements still written in natural language, which results in a high learning curve for specification languages, associated tools and increased schedule and budget pressure on projects reduce training opportunities for engineers, and (4) Formulation of correctness properties for system models can be a difficult problem. This has relevance to NASA in that it would simplify development of formal correctness properties, lead to more widespread use of model-based specification, design techniques, assist in earlier identification of defects and reduce residual defect content for space mission software systems. The presentation also discusses: potential applications, accomplishments and/or technological transfer potential and the next steps.
Formal Verification of Complex Systems based on SysML Functional Requirements
2014-12-23
Formal Verification of Complex Systems based on SysML Functional Requirements Hoda Mehrpouyan1, Irem Y. Tumer2, Chris Hoyle2, Dimitra Giannakopoulou3...requirements for design of complex engineered systems. The proposed ap- proach combines a SysML modeling approach to document and structure safety requirements...methods and tools to support the integration of safety into the design solution. 2.1. SysML for Complex Engineered Systems Traditional methods and tools
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Model-based analyses: Promises, pitfalls, and example applications to the study of cognitive control
Mars, Rogier B.; Shea, Nicholas J.; Kolling, Nils; Rushworth, Matthew F. S.
2011-01-01
We discuss a recent approach to investigating cognitive control, which has the potential to deal with some of the challenges inherent in this endeavour. In a model-based approach, the researcher defines a formal, computational model that performs the task at hand and whose performance matches that of a research participant. The internal variables in such a model might then be taken as proxies for latent variables computed in the brain. We discuss the potential advantages of such an approach for the study of the neural underpinnings of cognitive control and its pitfalls, and we make explicit the assumptions underlying the interpretation of data obtained using this approach. PMID:20437297
BRST Quantization of the Proca Model Based on the BFT and the BFV Formalism
NASA Astrophysics Data System (ADS)
Kim, Yong-Wan; Park, Mu-In; Park, Young-Jai; Yoon, Sean J.
The BRST quantization of the Abelian Proca model is performed using the Batalin-Fradkin-Tyutin and the Batalin-Fradkin-Vilkovisky formalism. First, the BFT Hamiltonian method is applied in order to systematically convert a second class constraint system of the model into an effectively first class one by introducing new fields. In finding the involutive Hamiltonian we adopt a new approach which is simpler than the usual one. We also show that in our model the Dirac brackets of the phase space variables in the original second class constraint system are exactly the same as the Poisson brackets of the corresponding modified fields in the extended phase space due to the linear character of the constraints comparing the Dirac or Faddeev-Jackiw formalisms. Then, according to the BFV formalism we obtain that the desired resulting Lagrangian preserving BRST symmetry in the standard local gauge fixing procedure naturally includes the Stückelberg scalar related to the explicit gauge symmetry breaking effect due to the presence of the mass term. We also analyze the nonstandard nonlocal gauge fixing procedure.
On the adequacy of current empirical evaluations of formal models of categorization.
Wills, Andy J; Pothos, Emmanuel M
2012-01-01
Categorization is one of the fundamental building blocks of cognition, and the study of categorization is notable for the extent to which formal modeling has been a central and influential component of research. However, the field has seen a proliferation of noncomplementary models with little consensus on the relative adequacy of these accounts. Progress in assessing the relative adequacy of formal categorization models has, to date, been limited because (a) formal model comparisons are narrow in the number of models and phenomena considered and (b) models do not often clearly define their explanatory scope. Progress is further hampered by the practice of fitting models with arbitrarily variable parameters to each data set independently. Reviewing examples of good practice in the literature, we conclude that model comparisons are most fruitful when relative adequacy is assessed by comparing well-defined models on the basis of the number and proportion of irreversible, ordinal, penetrable successes (principles of minimal flexibility, breadth, good-enough precision, maximal simplicity, and psychological focus).
Spanish Velar-Insertion and Analogy: A Usage-Based Diachronic Analysis
ERIC Educational Resources Information Center
Fondow, Steven Richard
2010-01-01
The theory of Analogical and Exemplar Modeling (AEM) suggests renewed discussion of the formalization of analogy and its possible incorporation in linguistic theory. AEM is a usage-based model founded upon Exemplar Modeling (Bybee 2007, Pierrehumbert 2001) that utilizes several principles of the Analogical Modeling of Language (Skousen 1992, 1995,…
One Giant Leap for Categorizers: One Small Step for Categorization Theory
Smith, J. David; Ell, Shawn W.
2015-01-01
We explore humans’ rule-based category learning using analytic approaches that highlight their psychological transitions during learning. These approaches confirm that humans show qualitatively sudden psychological transitions during rule learning. These transitions contribute to the theoretical literature contrasting single vs. multiple category-learning systems, because they seem to reveal a distinctive learning process of explicit rule discovery. A complete psychology of categorization must describe this learning process, too. Yet extensive formal-modeling analyses confirm that a wide range of current (gradient-descent) models cannot reproduce these transitions, including influential rule-based models (e.g., COVIS) and exemplar models (e.g., ALCOVE). It is an important theoretical conclusion that existing models cannot explain humans’ rule-based category learning. The problem these models have is the incremental algorithm by which learning is simulated. Humans descend no gradient in rule-based tasks. Very different formal-modeling systems will be required to explain humans’ psychology in these tasks. An important next step will be to build a new generation of models that can do so. PMID:26332587
2014-01-01
Background Scientific publications are documentary representations of defeasible arguments, supported by data and repeatable methods. They are the essential mediating artifacts in the ecosystem of scientific communications. The institutional “goal” of science is publishing results. The linear document publication format, dating from 1665, has survived transition to the Web. Intractable publication volumes; the difficulty of verifying evidence; and observed problems in evidence and citation chains suggest a need for a web-friendly and machine-tractable model of scientific publications. This model should support: digital summarization, evidence examination, challenge, verification and remix, and incremental adoption. Such a model must be capable of expressing a broad spectrum of representational complexity, ranging from minimal to maximal forms. Results The micropublications semantic model of scientific argument and evidence provides these features. Micropublications support natural language statements; data; methods and materials specifications; discussion and commentary; challenge and disagreement; as well as allowing many kinds of statement formalization. The minimal form of a micropublication is a statement with its attribution. The maximal form is a statement with its complete supporting argument, consisting of all relevant evidence, interpretations, discussion and challenges brought forward in support of or opposition to it. Micropublications may be formalized and serialized in multiple ways, including in RDF. They may be added to publications as stand-off metadata. An OWL 2 vocabulary for micropublications is available at http://purl.org/mp. A discussion of this vocabulary along with RDF examples from the case studies, appears as OWL Vocabulary and RDF Examples in Additional file 1. Conclusion Micropublications, because they model evidence and allow qualified, nuanced assertions, can play essential roles in the scientific communications ecosystem in places where simpler, formalized and purely statement-based models, such as the nanopublications model, will not be sufficient. At the same time they will add significant value to, and are intentionally compatible with, statement-based formalizations. We suggest that micropublications, generated by useful software tools supporting such activities as writing, editing, reviewing, and discussion, will be of great value in improving the quality and tractability of biomedical communications. PMID:26261718
NASA Astrophysics Data System (ADS)
Fitzpatrick, Matthew R. C.; Kennett, Malcolm P.
2018-05-01
We develop a formalism that allows the study of correlations in space and time in both the superfluid and Mott insulating phases of the Bose-Hubbard Model. Specifically, we obtain a two particle irreducible effective action within the contour-time formalism that allows for both equilibrium and out of equilibrium phenomena. We derive equations of motion for both the superfluid order parameter and two-point correlation functions. To assess the accuracy of this formalism, we study the equilibrium solution of the equations of motion and compare our results to existing strong coupling methods as well as exact methods where possible. We discuss applications of this formalism to out of equilibrium situations.
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
Ultrafast dynamics and decoherence of quasiparticles in surface bands: Development of the formalism
NASA Astrophysics Data System (ADS)
Gumhalter, Branko
2005-10-01
We describe a formalism suitable for studying the ultrafast dynamics and nonadiabatic effects associated with propagation of a single electron injected into an empty band. Within the band the electron is coupled to vibrational or electronic excitations that can be modeled by bosons. The formalism is based on the application of cumulant expansion to calculations of diagonal single particle propagators that are used in the interpretations of time resolved measurements of the surface electronic structure. Second and fourth order cumulants which arise from linear coupling to bosonic excitations and give leading contributions to the renormalization of propagators are explicitly calculated in the real time domain and their properties analyzed. This approach enables the assessment of transient effects and energy transfer associated with nonadiabatic response of the system to promotion of electrons into unoccupied bands, as well as of higher order corrections to the lifetimes and energy shifts of the initial electronic states that in the adiabatic regime are obtained from Fermi’s golden rule approach or its improvements such as the GW approximation. In the form presented the formalism is particularly suitable for studying the non-Markovian evolution and ultrafast decoherence of electronic states encountered in electron spectroscopies of quasi-two-dimensional bands on metal surfaces whose descriptions are inaccessible to the approaches based on the adiabatic hypothesis. The fast convergence of the results obtained by this procedure is demonstrated for a simple model system relevant to surface problems. On the basis of this and some general properties of cumulants it is argued that in the majority of surface problems involving electron-boson interactions the ultrafast dynamics of quasiparticles is accurately described by the second order cumulant, which can be calculated with the effort not exceeding those encountered in the standard GW approximation calculations.
KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.
Mathew, Joseph L
2011-04-01
Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely in such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.
Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.
Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk
2012-02-01
Therapy decision making and support in medicine deals with uncertainty and needs to take into account the patient's clinical parameters, the context of illness and the medical knowledge of the physician and guidelines to recommend a treatment therapy. This research study is focused on the formalization of medical knowledge using a cognitive process, called Fuzzy Cognitive Maps (FCMs) and semantic web approach. The FCM technique is capable of dealing with situations including uncertain descriptions using similar procedure such as human reasoning does. Thus, it was selected for the case of modeling and knowledge integration of clinical practice guidelines. The semantic web tools were established to implement the FCM approach. The knowledge base was constructed from the clinical guidelines as the form of if-then fuzzy rules. These fuzzy rules were transferred to FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in adult community was examined for the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings; 91% of the antibiotic treatments proposed by the implemented approach were in fully agreement with the guidelines and physicians' opinions. The results have shown that the suggested approach formalizes medical knowledge efficiently and gives a front-end decision on antibiotics' suggestion for cystitis. Concluding, modeling medical knowledge/therapeutic guidelines using cognitive methods and web semantic tools is both reliable and useful. Copyright © 2011 Elsevier Inc. All rights reserved.
Xiang, Junfeng; Xie, Lijing; Gao, Feinong; Zhang, Yu; Yi, Jie; Wang, Tao; Pang, Siqin; Wang, Xibin
2018-01-01
Discrepancies in capturing material behavior of some materials, such as Particulate Reinforced Metal Matrix Composites, by using conventional ad hoc strategy make the applicability of Johnson-Cook constitutive model challenged. Despites applicable efforts, its extended formalism with more fitting parameters would increase the difficulty in identifying constitutive parameters. A weighted multi-objective strategy for identifying any constitutive formalism is developed to predict mechanical behavior in static and dynamic loading conditions equally well. These varying weighting is based on the Gaussian-distributed noise evaluation of experimentally obtained stress-strain data in quasi-static or dynamic mode. This universal method can be used to determine fast and directly whether the constitutive formalism is suitable to describe the material constitutive behavior by measuring goodness-of-fit. A quantitative comparison of different fitting strategies on identifying Al6063/SiCp’s material parameters is made in terms of performance evaluation including noise elimination, correlation, and reliability. Eventually, a three-dimensional (3D) FE model in small-hole drilling of Al6063/SiCp composites, using multi-objective identified constitutive formalism, is developed. Comparison with the experimental observations in thrust force, torque, and chip morphology provides valid evidence on the applicability of the developed multi-objective identification strategy in identifying constitutive parameters. PMID:29324688
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yavari, M., E-mail: yavari@iaukashan.ac.ir
2016-06-15
We generalize the results of Nesterenko [13, 14] and Gogilidze and Surovtsev [15] for DNA structures. Using the generalized Hamiltonian formalism, we investigate solutions of the equilibrium shape equations for the linear free energy model.
Learning Goal Orientation, Formal Mentoring, and Leadership Competence in HRD: A Conceptual Model
ERIC Educational Resources Information Center
Kim, Sooyoung
2007-01-01
Purpose: The purpose of this paper is to suggest a conceptual model of formal mentoring as a leadership development initiative including "learning goal orientation", "mentoring functions", and "leadership competencies" as key constructs of the model. Design/methodology/approach: Some empirical studies, though there are not many, will provide…
Dual Rationality and Deliberative Agents
NASA Astrophysics Data System (ADS)
Debenham, John; Sierra, Carles
Human agents deliberate using models based on reason for only a minute proportion of the decisions that they make. In stark contrast, the deliberation of artificial agents is heavily dominated by formal models based on reason such as game theory, decision theory and logic—despite that fact that formal reasoning will not necessarily lead to superior real-world decisions. Further the Nobel Laureate Friedrich Hayek warns us of the ‘fatal conceit’ in controlling deliberative systems using models based on reason as the particular model chosen will then shape the system’s future and either impede, or eventually destroy, the subtle evolutionary processes that are an integral part of human systems and institutions, and are crucial to their evolution and long-term survival. We describe an architecture for artificial agents that is founded on Hayek’s two rationalities and supports the two forms of deliberation used by mankind.
Schwinger-Keldysh diagrammatics for primordial perturbations
NASA Astrophysics Data System (ADS)
Chen, Xingang; Wang, Yi; Xianyu, Zhong-Zhi
2017-12-01
We present a systematic introduction to the diagrammatic method for practical calculations in inflationary cosmology, based on Schwinger-Keldysh path integral formalism. We show in particular that the diagrammatic rules can be derived directly from a classical Lagrangian even in the presence of derivative couplings. Furthermore, we use a quasi-single-field inflation model as an example to show how this formalism, combined with the trick of mixed propagator, can significantly simplify the calculation of some in-in correlation functions. The resulting bispectrum includes the lighter scalar case (m<3H/2) that has been previously studied, and the heavier scalar case (m>3H/2) that has not been explicitly computed for this model. The latter provides a concrete example of quantum primordial standard clocks, in which the clock signals can be observably large.
The Design of Model-Based Training Programs
NASA Technical Reports Server (NTRS)
Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)
1997-01-01
This paper proposes a model-based training program for the skills necessary to operate advance avionics systems that incorporate advanced autopilots and fight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modem avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer aided instructional technology called cognitive tutors. They summarize numerous studies showing that training times to a specified level of competence can be achieved in one third the time of conventional class room instruction. We are developing a similar model-based training program for the skills necessary to operation the avionics. The model underlying the instructional program and that simulates the effects of pilots entries and the behavior of the avionics is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating the speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.
Closing the Gap between Formalism and Application--PBL and Mathematical Skills in Engineering
ERIC Educational Resources Information Center
Christensen, Ole Ravn
2008-01-01
A common problem in learning mathematics concerns the gap between, on the one hand, doing the formalisms and calculations of abstract mathematics and, on the other hand, applying these in a specific contextualized setting for example the engineering world. The skills acquired through problem-based learning (PBL), in the special model used at…
Terfve, Camille; Cokelaer, Thomas; Henriques, David; MacNamara, Aidan; Goncalves, Emanuel; Morris, Melody K; van Iersel, Martijn; Lauffenburger, Douglas A; Saez-Rodriguez, Julio
2012-10-18
Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context.
2012-01-01
Background Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Results Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Conclusions Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context. PMID:23079107
Model Checking JAVA Programs Using Java Pathfinder
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Pressburger, Thomas
2000-01-01
This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen in a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze the program. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep-Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.
A quasi-current representation for information needs inspired by Two-State Vector Formalism
NASA Astrophysics Data System (ADS)
Wang, Panpan; Hou, Yuexian; Li, Jingfei; Zhang, Yazhou; Song, Dawei; Li, Wenjie
2017-09-01
Recently, a number of quantum theory (QT)-based information retrieval (IR) models have been proposed for modeling session search task that users issue queries continuously in order to describe their evolving information needs (IN). However, the standard formalism of QT cannot provide a complete description for users' current IN in a sense that it does not take the 'future' information into consideration. Therefore, to seek a more proper and complete representation for users' IN, we construct a representation of quasi-current IN inspired by an emerging Two-State Vector Formalism (TSVF). With the enlightenment of the completeness of TSVF, a "two-state vector" derived from the 'future' (the current query) and the 'history' (the previous query) is employed to describe users' quasi-current IN in a more complete way. Extensive experiments are conducted on the session tracks of TREC 2013 & 2014, and show that our model outperforms a series of compared IR models.
A Policy Language for Modelling Recommendations
NASA Astrophysics Data System (ADS)
Abou El Kalam, Anas; Balbiani, Philippe
While current and emergent applications become more and more complex, most of existing security policies and models only consider a yes/no response to the access requests. Consequently, modelling, formalizing and implementing permissions, obligations and prohibitions do not cover the richness of all the possible scenarios. In fact, several applications have access rules with the recommendation access modality. In this paper we focus on the problem of formalizing security policies with recommendation needs. The aim is to provide a generic domain-independent formal system for modelling not only permissions, prohibitions and obligations, but also recommendations. In this respect, we present our logic-based language, the semantics, the truth conditions, our axiomatic as well as inference rules. We also give a representative use case with our specification of recommendation requirements. Finally, we explain how our logical framework could be used to query the security policy and to check its consistency.
NASA Technical Reports Server (NTRS)
Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
NASA Technical Reports Server (NTRS)
Callender, E. D.; Farny, A. M.
1983-01-01
Problem Statement Language/Problem Statement Analyzer (PSL/PSA) applications, which were once a one-step process in which product system information was immediately translated into PSL statements, have in light of experience been shown to result in inconsistent representations. These shortcomings have prompted the development of an intermediate step, designated the Product System Information Model (PSIM), which provides a basis for the mutual understanding of customer terminology and the formal, conceptual representation of that product system in a PSA data base. The PSIM is initially captured as a paper diagram, followed by formal capture in the PSL/PSA data base.
From Informal Safety-Critical Requirements to Property-Driven Formal Validation
NASA Technical Reports Server (NTRS)
Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano
2008-01-01
Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progresses in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automatized, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises of a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics), and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, by combining several, complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verify whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling the fault-tree analysis related to particular fault models; verifying whether the specification is realizable.
ERIC Educational Resources Information Center
Landolfi, Adrienne M.
2016-01-01
As accountability measures continue to increase within education, public school systems have integrated standards-based evaluation systems to formally assess professional practices among educators. The purpose of this study was to explore the extent in which the communication process between evaluators and teachers impacts teacher performance…
NASA Astrophysics Data System (ADS)
Towne, Forrest S.
Current domestic and international comparative studies of student achievement in science are demonstrating that the U.S. needs to improve science education if it wants to remain competitive in the global economy. One of the causes of the poor performance of U.S. science education is the lack of students who have developed the formal thinking skills that are necessary to obtain scientific literacy. Previous studies have demonstrated that formal thinking skills can be taught to adolescents, however only 25% of incoming college freshman have these necessary skills. There is some evidence that adolescence (girls aged 11-13, boys aged 12-14) is a critical period where students must learn formal thinking skills, similar to the critical period that exists for young children learning languages. It is not known whether it is more difficult for students to learn formal thinking skills either prior to or following adolescence. The purpose of this quantitative case study is to determine whether adolescence is a critical period for students to learn formal thinking skills. The study also investigates whether a formal thinking skills focused program can improve students' intelligence. In this study 32 students who had not developed any formal thinking skills, ranging in age from 10-16, underwent an intensive four-week, inquiry-based, formal thinking skill intervention program that focused on two formal thinking skills: (1) the ability to control and exclude variables; and (2) the ability to manipulate ratios and proportionalities. The students undergoing the training were matched with control students by age, gender, formal thinking skill ability, and intelligence. The control group attended their traditional science course during the intervention periods. The results of the study showed that the intervention program was successful in developing students' formal thinking skills. The pre-adolescents (males, age 10-11, females, age 10) were unable to learn formal thinking skills. The data indicated that there is not a significant difference between adolescents and post-adolescents (up to 16-years-old) ability to learn formal thinking skills. Both groups (adolescent and post-adolescent) showed improvement in their formal thinking skill ability after the intervention. The intervention also demonstrated evidence of improving students' intelligence scores.
Hyper-Book: A Formal Model for Electronic Books.
ERIC Educational Resources Information Center
Catenazzi, Nadia; Sommaruga, Lorenzo
1994-01-01
Presents a model for electronic books based on the paper book metaphor. Discussion includes how the book evolves under the effects of its functional components; the use and impact of the model for organizing and presenting electronic documents in the context of electronic publishing; and the possible applications of a system based on the model.…
Data-based Non-Markovian Model Inference
NASA Astrophysics Data System (ADS)
Ghil, Michael
2015-04-01
This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lokta-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close collaboration with M.D. Chekroun, D. Kondrashov, S. Kravtsov and A.W. Robertson.
Formal Methods Tool Qualification
NASA Technical Reports Server (NTRS)
Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain
2017-01-01
Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.
Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe
2012-01-01
The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high-level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems and thus enabling to use a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom. PMID:22670147
Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe
2012-01-01
The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were made based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high-level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data-structures (that are a common way to represent plants at several scales) into L-systems and thus enabling to use a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
The Stochastic Early Reaction, Inhibition, and late Action (SERIA) model for antisaccades
2017-01-01
The antisaccade task is a classic paradigm used to study the voluntary control of eye movements. It requires participants to suppress a reactive eye movement to a visual target and to concurrently initiate a saccade in the opposite direction. Although several models have been proposed to explain error rates and reaction times in this task, no formal model comparison has yet been performed. Here, we describe a Bayesian modeling approach to the antisaccade task that allows us to formally compare different models on the basis of their evidence. First, we provide a formal likelihood function of actions (pro- and antisaccades) and reaction times based on previously published models. Second, we introduce the Stochastic Early Reaction, Inhibition, and late Action model (SERIA), a novel model postulating two different mechanisms that interact in the antisaccade task: an early GO/NO-GO race decision process and a late GO/GO decision process. Third, we apply these models to a data set from an experiment with three mixed blocks of pro- and antisaccade trials. Bayesian model comparison demonstrates that the SERIA model explains the data better than competing models that do not incorporate a late decision process. Moreover, we show that the early decision process postulated by the SERIA model is, to a large extent, insensitive to the cue presented in a single trial. Finally, we use parameter estimates to demonstrate that changes in reaction time and error rate due to the probability of a trial type (pro- or antisaccade) are best explained by faster or slower inhibition and the probability of generating late voluntary prosaccades. PMID:28767650
On the Adequacy of Current Empirical Evaluations of Formal Models of Categorization
ERIC Educational Resources Information Center
Wills, Andy J.; Pothos, Emmanuel M.
2012-01-01
Categorization is one of the fundamental building blocks of cognition, and the study of categorization is notable for the extent to which formal modeling has been a central and influential component of research. However, the field has seen a proliferation of noncomplementary models with little consensus on the relative adequacy of these accounts.…
Short-Term Memory Scanning Viewed as Exemplar-Based Categorization
ERIC Educational Resources Information Center
Nosofsky, Robert M.; Little, Daniel R.; Donkin, Christopher; Fific, Mario
2011-01-01
Exemplar-similarity models such as the exemplar-based random walk (EBRW) model (Nosofsky & Palmeri, 1997b) were designed to provide a formal account of multidimensional classification choice probabilities and response times (RTs). At the same time, a recurring theme has been to use exemplar models to account for old-new item recognition and to…
A Multinomial Model of Event-Based Prospective Memory
ERIC Educational Resources Information Center
Smith, Rebekah E.; Bayen, Ute J.
2004-01-01
Prospective memory is remembering to perform an action in the future. The authors introduce the 1st formal model of event-based prospective memory, namely, a multinomial model that includes 2 separate parameters related to prospective memory processes. The 1st measures preparatory attentional processes, and the 2nd measures retrospective memory…
Medical Named Entity Recognition for Indonesian Language Using Word Representations
NASA Astrophysics Data System (ADS)
Rahman, Arief
2018-03-01
Nowadays, Named Entity Recognition (NER) system is used in medical texts to obtain important medical information, like diseases, symptoms, and drugs. While most NER systems are applied to formal medical texts, informal ones like those from social media (also called semi-formal texts) are starting to get recognition as a gold mine for medical information. We propose a theoretical Named Entity Recognition (NER) model for semi-formal medical texts in our medical knowledge management system by comparing two kinds of word representations: cluster-based word representation and distributed representation.
Formalizing New Navigation Requirements for NASA's Space Shuttle
NASA Technical Reports Server (NTRS)
DiVito, Ben L.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
Zhang, Wei; Shmuylovich, Leonid; Kovacs, Sandor J
2009-01-01
Using a simple harmonic oscillator model (PDF formalism), every early filling E-wave can be uniquely described by a set of parameters, (x(0), c, and k). Parameter c in the PDF formalism is a damping or relaxation parameter that measures the energy loss during the filling process. Based on Bernoulli's equation and kinematic modeling, we derived a causal correlation between the relaxation parameter c in the PDF formalism and a feature of the pressure contour during filling - the pressure recovery ratio defined by the left ventricular pressure difference between diastasis and minimum pressure, normalized to the pressure difference between a fiducial pressure and minimum pressure [PRR = (P(Diastasis)-P(Min))/(P(Fiducial)-P(Min))]. We analyzed multiple heart beats from one human subject to validate the correlation. Further validation among more patients is warranted. PRR is the invasive causal analogue of the noninvasive E-wave relaxation parameter c. PRR has the potential to be calculated using automated methodology in the catheterization lab in real time.
ERIC Educational Resources Information Center
Lawlor, John; Marshall, Kevin; Tangney, Brendan
2016-01-01
It is generally accepted that intrinsic student motivation is a critical requirement for effective learning but formal learning in school places a huge reliance on extrinsic motivation to focus the learner. This reliance on extrinsic motivation is driven by the pressure on formal schooling to "deliver to the test." The experience of the…
Non-Lipschitz Dynamics Approach to Discrete Event Systems
NASA Technical Reports Server (NTRS)
Zak, M.; Meyers, R.
1995-01-01
This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man- made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.
NASA Astrophysics Data System (ADS)
Fiorini, Rodolfo A.; Dacquino, Gianfranco
2005-03-01
GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented in previous conferences elsewhere recently. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application is presented. Progressive model automatic generation is discussed. GEOGINE can be used as an efficient computational kernel for fast reliable application development and delivery in advanced biomedical engineering, biometric, intelligent computing, target recognition, content image retrieval, data mining technological areas mainly. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization features, for reliable automated object learning and discrimination can deeply benefit from GEOGINE progressive automated model generation computational kernel performance. Main operational advantages over previous, similar approaches are: 1) Progressive Automated Invariant Model Generation, 2) Invariant Minimal Complete Description Set for computational efficiency, 3) Arbitrary Model Precision for robust object description and identification.
Implementation of a method for calculating temperature-dependent resistivities in the KKR formalism
NASA Astrophysics Data System (ADS)
Mahr, Carsten E.; Czerner, Michael; Heiliger, Christian
2017-10-01
We present a method to calculate the electron-phonon induced resistivity of metals in scattering-time approximation based on the nonequilibrium Green's function formalism. The general theory as well as its implementation in a density-functional theory based Korringa-Kohn-Rostoker code are described and subsequently verified by studying copper as a test system. We model the thermal expansion by fitting a Debye-Grüneisen curve to experimental data. Both the electronic and vibrational structures are discussed for different temperatures, and employing a Wannier interpolation of these quantities we evaluate the scattering time by integrating the electron linewidth on a triangulation of the Fermi surface. Based thereupon, the temperature-dependent resistivity is calculated and found to be in good agreement with experiment. We show that the effect of thermal expansion has to be considered in the whole calculation regime. Further, for low temperatures, an accurate sampling of the Fermi surface becomes important.
The Factor Structure of Concrete and Formal Operations: A Confirmation of Piaget.
ERIC Educational Resources Information Center
Gray, William M.
Piaget has hypothesized that concrete and formal operations can be described by specific logical models. The present study focused on assessing various aspects of four concrete operational groupings and two variations of two formal operational characteristics. Six hundred twenty-two 9-14 year old students participating in the Human Sciences…
Zhou, Y; Murata, T; Defanti, T A
2000-01-01
Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time and networking features in these systems. Net-VEs demand high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacks formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE-NICE (narrative immersive constructionist/collaborative environment), predict the net-VE performance based on simulation, and improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called the CAVE-(CAVE automatic virtual environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and transport layer protocol used in NICE: transmission control protocol (TCP). We show the possibility analysis based on the EFTN model for the CAVE. Then, by using these models and design/CPN as the simulation tool, we conducted various simulations to study real-time behavior, network effects and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
ERIC Educational Resources Information Center
de Lange, Thomas
2011-01-01
This article examines how a classroom procedure known as PGE (Plan/Go-through/Evaluate) group work aims at integrating formal and non-formal media experiences and practices into classroom-based media learning. The study displays, on the one hand, how PGE group work emerged and was institutionally embedded in a media course. On the other hand, the…
Croll, Peter R
2011-02-01
To ensure that patient confidentiality is securely maintained, health ICT applications that contain sensitive personal information demand comprehensive privacy policies. Determining the adequacy of these policies to meet legal conformity together with clinical users and patient expectation is demanding in practice. Organisations and agencies looking to analyse their Privacy and Security policies can benefit from guidance provided by outside entities such as the Privacy Office of their State or Government together with law firms and ICT specialists. The advice given is not uniform and often open to different interpretations. Of greater concern is the possibility of overlooking any important aspects that later result in a data breach. Based on three case studies, this paper considers whether a more formal approach to privacy analysis could be taken that would help identify the full coverage of a Privacy Impact Analysis and determine the deficiencies with an organisation's current policies and approach. A diagrammatic model showing the relationships between Confidentiality, Privacy, Trust, Security and Safety is introduced. First the validity of this model is determined by mapping it against the real-world case studies taken from three healthcare services that depend on ICT. Then, by using software engineering methods, a formal mapping of the relationships is undertaken to identify a full set of policies needed to satisfy the model. How effective this approach may prove as a generic method for deriving a comprehensive set of policies in health ICT applications is finally discussed. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Werner, K.; Liu, F. M.; Ostapchenko, S.; Pierog, T.
2004-11-01
After discussing conceptual problems with the conventional string model, we present a new approach, based on a theoretically consistent multiple scattering formalism. First results for proton-proton scattering at 158 GeV are discussed.
A Game Theoretic Model of Thermonuclear Cyberwar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soper, Braden C.
In this paper we propose a formal game theoretic model of thermonuclear cyberwar based on ideas found in [1] and [2]. Our intention is that such a game will act as a first step toward building more complete formal models of Cross-Domain Deterrence (CDD). We believe the proposed thermonuclear cyberwar game is an ideal place to start on such an endeavor because the game can be fashioned in a way that is closely related to the classical models of nuclear deterrence [4–6], but with obvious modifications that will help to elucidate the complexities introduced by a second domain. We startmore » with the classical bimatrix nuclear deterrence game based on the game of chicken, but introduce uncertainty via a left-of-launch cyber capability that one or both players may possess.« less
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
Decker, Johannes H.; Otto, A. Ross; Daw, Nathaniel D.; Hartley, Catherine A.
2016-01-01
Theoretical models distinguish two decision-making strategies that have been formalized in reinforcement-learning theory. A model-based strategy leverages a cognitive model of potential actions and their consequences to make goal-directed choices, whereas a model-free strategy evaluates actions based solely on their reward history. Research in adults has begun to elucidate the psychological mechanisms and neural substrates underlying these learning processes and factors that influence their relative recruitment. However, the developmental trajectory of these evaluative strategies has not been well characterized. In this study, children, adolescents, and adults, performed a sequential reinforcement-learning task that enables estimation of model-based and model-free contributions to choice. Whereas a model-free strategy was evident in choice behavior across all age groups, evidence of a model-based strategy only emerged during adolescence and continued to increase into adulthood. These results suggest that recruitment of model-based valuation systems represents a critical cognitive component underlying the gradual maturation of goal-directed behavior. PMID:27084852
Colloquium: Modeling the dynamics of multicellular systems: Application to tissue engineering
NASA Astrophysics Data System (ADS)
Kosztin, Ioan; Vunjak-Novakovic, Gordana; Forgacs, Gabor
2012-10-01
Tissue engineering is a rapidly evolving discipline that aims at building functional tissues to improve or replace damaged ones. To be successful in such an endeavor, ideally, the engineering of tissues should be based on the principles of developmental biology. Recent progress in developmental biology suggests that the formation of tissues from the composing cells is often guided by physical laws. Here a comprehensive computational-theoretical formalism is presented that is based on experimental input and incorporates biomechanical principles of developmental biology. The formalism is described and it is shown that it correctly reproduces and predicts the quantitative characteristics of the fundamental early developmental process of tissue fusion. Based on this finding, the formalism is then used toward the optimization of the fabrication of tubular multicellular constructs, such as a vascular graft, by bioprinting, a novel tissue engineering technology.
An Ontology for State Analysis: Formalizing the Mapping to SysML
NASA Technical Reports Server (NTRS)
Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel
2012-01-01
State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.
Reinventing the High School Government Course: Rigor, Simulations, and Learning from Text
ERIC Educational Resources Information Center
Parker, Walter C.; Lo, Jane C.
2016-01-01
The high school government course is arguably the main site of formal civic education in the country today. This article presents the curriculum that resulted from a multiyear study aimed at improving the course. The pedagogic model, called "Knowledge in Action," centers on a rigorous form of project-based learning where the projects are…
ERIC Educational Resources Information Center
Hodge, Paul; Wright, Sarah; Barraket, Jo; Scott, Marcelle; Melville, Rose; Richardson, Sarah
2011-01-01
Ideas of "how we learn" in formal academic settings have changed markedly in recent decades. The primary position that universities once held on shaping what constitutes learning has come into question from a range of experience-led and situated learning models. Drawing on findings from a study conducted across three Australian universities, the…
When Does Model-Based Control Pay Off?
Kool, Wouter; Cushman, Fiery A; Gershman, Samuel J
2016-08-01
Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to "model-free" and "model-based" strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, do not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand.
Díaz-Rodríguez, Miguel; Valera, Angel; Page, Alvaro; Besa, Antonio; Mata, Vicente
2016-05-01
Accurate knowledge of body segment inertia parameters (BSIP) improves the assessment of dynamic analysis based on biomechanical models, which is of paramount importance in fields such as sport activities or impact crash test. Early approaches for BSIP identification rely on the experiments conducted on cadavers or through imaging techniques conducted on living subjects. Recent approaches for BSIP identification rely on inverse dynamic modeling. However, most of the approaches are focused on the entire body, and verification of BSIP for dynamic analysis for distal segment or chain of segments, which has proven to be of significant importance in impact test studies, is rarely established. Previous studies have suggested that BSIP should be obtained by using subject-specific identification techniques. To this end, our paper develops a novel approach for estimating subject-specific BSIP based on static and dynamics identification models (SIM, DIM). We test the validity of SIM and DIM by comparing the results using parameters obtained from a regression model proposed by De Leva (1996, "Adjustments to Zatsiorsky-Seluyanov's Segment Inertia Parameters," J. Biomech., 29(9), pp. 1223-1230). Both SIM and DIM are developed considering robotics formalism. First, the static model allows the mass and center of gravity (COG) to be estimated. Second, the results from the static model are included in the dynamics equation allowing us to estimate the moment of inertia (MOI). As a case study, we applied the approach to evaluate the dynamics modeling of the head complex. Findings provide some insight into the validity not only of the proposed method but also of the application proposed by De Leva (1996, "Adjustments to Zatsiorsky-Seluyanov's Segment Inertia Parameters," J. Biomech., 29(9), pp. 1223-1230) for dynamic modeling of body segments.
Mechanisms of Developmental Change in Infant Categorization
ERIC Educational Resources Information Center
Westermann, Gert; Mareschal, Denis
2012-01-01
Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…
Expert judgement and uncertainty quantification for climate change
NASA Astrophysics Data System (ADS)
Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.
2016-05-01
Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Yiying, E-mail: yiyingyan@sjtu.edu.cn; Lü, Zhiguo, E-mail: zglv@sjtu.edu.cn; Zheng, Hang, E-mail: hzheng@sjtu.edu.cn
We present a theoretical formalism for resonance fluorescence radiating from a two-level system (TLS) driven by any periodic driving and coupled to multiple reservoirs. The formalism is derived analytically based on the combination of Floquet theory and Born–Markov master equation. The formalism allows us to calculate the spectrum when the Floquet states and quasienergies are analytically or numerically solved for simple or complicated driving fields. We can systematically explore the spectral features by implementing the present formalism. To exemplify this theory, we apply the unified formalism to comprehensively study a generic model that a harmonically driven TLS is simultaneously coupledmore » to a radiative reservoir and a dephasing reservoir. We demonstrate that the significant features of the fluorescence spectra, the driving-induced asymmetry and the dephasing-induced asymmetry, can be attributed to the violation of detailed balance condition, and explained in terms of the driving-related transition quantities between Floquet-states and their steady populations. In addition, we find the distinguished features of the fluorescence spectra under the biharmonic and multiharmonic driving fields in contrast with that of the harmonic driving case. In the case of the biharmonic driving, we find that the spectra are significantly different from the result of the RWA under the multiple resonance conditions. By the three concrete applications, we illustrate that the present formalism provides a routine tool for comprehensively exploring the fluorescence spectrum of periodically strongly driven TLSs.« less
Formally verifying Ada programs which use real number types
NASA Technical Reports Server (NTRS)
Sutherland, David
1986-01-01
Formal verification is applied to programs which use real number arithmetic operations (mathematical programs). Formal verification of a program P consists of creating a mathematical model of F, stating the desired properties of P in a formal logical language, and proving that the mathematical model has the desired properties using a formal proof calculus. The development and verification of the mathematical model are discussed.
Systems engineering principles for the design of biomedical signal processing systems.
Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo
2011-06-01
Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formal as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖ B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight in the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Situationally Embodied Curriculum: Relating Formalisms and Contexts
ERIC Educational Resources Information Center
Barab, Sasha; Zuiker, Steve; Warren, Scott; Hickey, Dan; Ingram-Goble, Adam; Kwon, Eun-Ju; Kouper, Inna; Herring, Susan C.
2007-01-01
This study describes an example of design-based research in which we make theoretical improvements in our understanding, in part based on empirical work, and use these to revise our curriculum and, simultaneously, our evolving theory of the relations between contexts and disciplinary formalisms. Prior to this study, we completed a first cycle of…
Does Online Game-Based Learning Work in Formal Education at School? A Case Study of VISOLE
ERIC Educational Resources Information Center
Jong, Morris S. Y.
2015-01-01
VISOLE (Virtual Interactive Student-Oriented Environment) is a teacher-facilitated pedagogical approach to integrating constructivist online game-based learning into formal curriculum teaching in school education. This paper reports a case study on the implementation of VISOLE in secondary Geography education. We compared the pedagogical…
Can Non-Formal Education Keep Working Children in School? A Case Study from Punjab, India
ERIC Educational Resources Information Center
Sud, Pamela
2010-01-01
This paper analyses the effectiveness of non-formal schools for working children in Jalandhar, Punjab, India, in mainstreaming child labourers into the formal education system through incentivised, informal schooling. Using a family fixed effects model and sibling data as an equivalent population comparison group, I find that the non-formal…
ERIC Educational Resources Information Center
Smith, Rebekah E.; Bayen, Ute J.
2006-01-01
Event-based prospective memory involves remembering to perform an action in response to a particular future event. Normal younger and older adults performed event-based prospective memory tasks in 2 experiments. The authors applied a formal multinomial processing tree model of prospective memory (Smith & Bayen, 2004) to disentangle age differences…
Model Driven Engineering with Ontology Technologies
NASA Astrophysics Data System (ADS)
Staab, Steffen; Walter, Tobias; Gröner, Gerd; Parreiras, Fernando Silva
Ontologies constitute formal models of some aspect of the world that may be used for drawing interesting logical conclusions even for large models. Software models capture relevant characteristics of a software artifact to be developed, yet, most often these software models have limited formal semantics, or the underlying (often graphical) software language varies from case to case in a way that makes it hard if not impossible to fix its semantics. In this contribution, we survey the use of ontology technologies for software modeling in order to carry over advantages from ontology technologies to the software modeling domain. It will turn out that ontology-based metamodels constitute a core means for exploiting expressive ontology reasoning in the software modeling domain while remaining flexible enough to accommodate varying needs of software modelers.
Terminal Dynamics Approach to Discrete Event Systems
NASA Technical Reports Server (NTRS)
Zak, Michail; Meyers, Ronald
1995-01-01
This paper presents and discusses a mathematical formalism for simulation of discrete event dynamic (DED)-a special type of 'man-made' systems to serve specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.!.
Two Formal Gas Models For Multi-Agent Sweeping and Obstacle Avoidance
NASA Technical Reports Server (NTRS)
Kerr, Wesley; Spears, Diana; Spears, William; Thayer, David
2004-01-01
The task addressed here is a dynamic search through a bounded region, while avoiding multiple large obstacles, such as buildings. In the case of limited sensors and communication, maintaining spatial coverage - especially after passing the obstacles - is a challenging problem. Here, we investigate two physics-based approaches to solving this task with multiple simulated mobile robots, one based on artificial forces and the other based on the kinetic theory of gases. The desired behavior is achieved with both methods, and a comparison is made between them. Because both approaches are physics-based, formal assurances about the multi-robot behavior are straightforward, and are included in the paper.
NASA Technical Reports Server (NTRS)
Boulet, C.; Ma, Qiancheng; Tipping, R. H.
2015-01-01
Starting from the refined Robert-Bonamy formalism [Q. Ma, C. Boulet, and R. H. Tipping, J. Chem. Phys. 139, 034305 (2013)], we propose here an extension of line mixing studies to infrared absorptions of linear polyatomic molecules having stretching and bending modes. The present formalism does not neglect the internal degrees of freedom of the perturbing molecules, contrary to the energy corrected sudden (ECS) modeling, and enables one to calculate the whole relaxation matrix starting from the potential energy surface. Meanwhile, similar to the ECS modeling, the present formalism properly accounts for roles played by all the internal angular momenta in the coupling process, including the vibrational angular momentum. The formalism has been applied to the important case of CO2 broadened by N2. Applications to two kinds of vibrational bands (sigma yields sigma and sigma yields pi) have shown that the present results are in good agreement with both experimental data and results derived from the ECS model.
Partition-based discrete-time quantum walks
NASA Astrophysics Data System (ADS)
Konno, Norio; Portugal, Renato; Sato, Iwao; Segawa, Etsuo
2018-04-01
We introduce a family of discrete-time quantum walks, called two-partition model, based on two equivalence-class partitions of the computational basis, which establish the notion of local dynamics. This family encompasses most versions of unitary discrete-time quantum walks driven by two local operators studied in literature, such as the coined model, Szegedy's model, and the 2-tessellable staggered model. We also analyze the connection of those models with the two-step coined model, which is driven by the square of the evolution operator of the standard discrete-time coined walk. We prove formally that the two-step coined model, an extension of Szegedy model for multigraphs, and the two-tessellable staggered model are unitarily equivalent. Then, selecting one specific model among those families is a matter of taste not generality.
Modeling of clover detector in addback mode
NASA Astrophysics Data System (ADS)
Kshetri, R.
2012-07-01
Based on absorption and scattering of gamma-rays, a formalism has been presented for modeling the clover germanium detector in addback mode and to predict its response for high energy γ-rays. In the present formalism, the operation of a bare clover detector could be described in terms of three quantities only. Considering an additional parameter, the formalism could be extended for suppressed clover. Using experimental data on relative single crystal efficiency and addback factor as input, the peak-to-total ratio has been calculated for three energies (Eγ = 3.401, 5.324 and 10.430 MeV) where direct measurement of peak-to-total ratio is impossible due to absence of a radioactive source having single monoenergetic gamma-ray of that energy. The experimental validation and consistency of the formalism have been shown considering data for TIGRESS clover detector. In a recent work (R. Kshetri, JINST 2012 7 P04008), we showed that for a given γ-ray energy, the formalism could be used to predict the peak-to-total ratio as a function of number of detector modules. In the present paper, we have shown that for a given composite detector (clover detector is considered here), the formalism could be used to predict the peak-to-total ratio as a function of γ-ray energy.
Bridging In-school and Out-of-school Learning: Formal, Non-Formal, and Informal Education
NASA Astrophysics Data System (ADS)
Eshach, Haim
2007-04-01
The present paper thoroughly examines how one can effectively bridge in-school and out-of-school learning. The first part discusses the difficulty in defining out-of-school learning. It proposes to distinguish three types of learning: formal, informal, and non-formal. The second part raises the question of whether out-of-school learning should be dealt with in the in-school system, in view of the fact that we experience informal learning anyway as well as considering the disadvantages and difficulties teachers are confronted with when planning and carrying out scientific fieldtrips. The voices of the teachers, the students, and the non-formal institution staff are heard to provide insights into the problem. The third part discusses the cognitive and affective aspects of non-formal learning. The fourth part presents some models explaining scientific fieldtrip learning and based on those models, suggests a novel explanation. The fifth part offers some recommendations of how to bridge in and out-of-school learning. The paper closes with some practical ideas as to how one can bring the theory described in the paper into practice. It is hoped that this paper will provide educators with an insight so that they will be able to fully exploit the great potential that scientific field trips may offer.
Application of growing nested Petri nets for modeling robotic systems operating under risk
NASA Astrophysics Data System (ADS)
Sorokin, E. V.; Senkov, A. V.
2017-10-01
The paper studies the peculiarities of modeling robotic systems engaged in mining. Existing modeling mechanisms are considered, which are based on nested Petri nets, and a new formalism of growing Petri nets is presented that allows modeling robotic systems operating under risk. Modeling is provided both for the regular operation mode and for non-standard modes in which individual elements of the system can perform uncharacteristic functions. The example shows growing Petri nets that are used for modeling extraction of flat coal seams by a robotic system consisting of several different-type autonomous robots.
NASA Astrophysics Data System (ADS)
Cooke-Nieves, Natasha Anika
Science education research has consistently shown that elementary teachers have a low self-efficacy and background knowledge to teach science. When they teach science, there is a lack of field experiences and inquiry-based instruction at the elementary level due to limited resources, both material and pedagogical. This study focused on an analysis of a professional development (PD) model designed by the author known as the Collaborative Diagonal Learning Network (CDLN). The purpose of this study was to examine elementary school teacher participants pedagogical content knowledge related to their experiences in a CDLN model. The CDLN model taught formal and informal instruction using a science coach and an informal educational institution. Another purpose for this research included a theoretical analysis of the CDLN model to see if its design enabled teachers to expand their resource knowledge of available science education materials. The four-month-long study used qualitative data obtained during an in-service professional development program facilitated by a science coach and educators from a large natural history museum. Using case study as the research design, four elementary school teachers were asked to evaluate the effectiveness of their science coach and museum educator workshop sessions. During the duration of this study, semi-structured individual/group interviews and open-ended pre/post PD questionnaires were used. Other data sources included researcher field notes from lesson observations, museum field trips, audio-recorded workshop sessions, email correspondence, and teacher-created artifacts. The data were analyzed using a constructivist grounded theory approach. Themes that emerged included increased self-efficacy; increased pedagogical content knowledge; increased knowledge of museum education resources and access; creation of a professional learning community; and increased knowledge of science notebooking. Implications for formal and informal professional development in elementary science reform are offered. It is suggested that researchers investigate collaborative coaching through the lenses of organizational learning network theory, and develop professional learning communities with formal and informal educators; and that professional developers in city school systems and informal science institutions work in concert to produce more effective elementary teachers who not only love science but love teaching it.
VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data
Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel
2014-01-01
This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to peer with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198
Quantization of Spontaneously Broken Gauge Theory Based on the Bft-Bfv Formalism
NASA Astrophysics Data System (ADS)
Kim, Yong-Wan; Park, Young-Jai
We quantize the spontaneously broken Abelian U(1) Higgs model by using the improved BFT and BFV formalisms. We construct the BFT physical fields and obtain the firstclass observables including the Hamiltonian in terms of these fields. We also explicitly show that there are exact form invariances between the second-class and first-class quantities. Then, according to the BFV formalism, we derive the corresponding Lagrangian having U(1) gauge symmetry. We also discuss at the classical level how one easily gets the first-class Lagrangian from the symmetry-broken second-class Lagrangian.
Memory sparing, fast scattering formalism for rigorous diffraction modeling
NASA Astrophysics Data System (ADS)
Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.
2017-07-01
The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
When Does Model-Based Control Pay Off?
2016-01-01
Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to “model-free” and “model-based” strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, do not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094
A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration
Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.
2014-01-01
Abstract Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite a rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large‐scale morphogenesis that match published data in the limb regeneration field. Major barriers to preventing an algorithmic approach are the lack of formal descriptions for experimental regenerative information and a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments together with a user‐friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration. PMID:25729585
Adolescent thinking ála Piaget: The formal stage.
Dulit, E
1972-12-01
Two of the formal-stage experiments of Piaget and Inhelder, selected largely for their closeness to the concepts defining the stage, were replicated with groups of average and gifted adolescents. This report describes the relevant Piagetian concepts (formal stage, concrete stage) in context, gives the methods and findings of this study, and concludes with a section discussing implications and making some reformulations which generally support but significantly qualify some of the central themes of the Piaget-Inhelder work. Fully developed formal-stage thinking emerges as far from commonplace among normal or average adolescents (by marked contrast with the impression created by the Piaget-Inhelder text, which chooses to report no middle or older adolescents who function at less than fully formal levels). In this respect, the formal stage differs appreciably from the earlier Piagetian stages, and early adolescence emerges as the age for which a "single path" model of cognitive development becomes seriously inadequate and a more complex model becomes essential. Formal-stage thinking seems best conceptualized, like most other aspects of psychological maturity, as a potentiality only partially attained by most and fully attained only by some.
NASA Astrophysics Data System (ADS)
Henstridge, Martin C.; Batchelor-McAuley, Christopher; Gusmão, Rui; Compton, Richard G.
2011-11-01
Two simple models of electrode surface inhomogeneity based on Marcus-Hush theory are considered; a distribution in formal potentials and a distribution in electron tunnelling distances. Cyclic voltammetry simulated using these models is compared with that simulated using Marcus-Hush theory for a flat, uniform and homogeneous electrode surface, with the two models of surface inhomogeneity yielding broadened peaks with decreased peak-currents. An edge-plane pyrolytic graphite electrode is covalently modified with ferrocene via 'click' chemistry and the resulting voltammetry compared with each of the three previously considered models. The distribution of formal potentials is seen to fit the experimental data most closely.
Data accuracy assessment using enterprise architecture
NASA Astrophysics Data System (ADS)
Närman, Per; Holm, Hannes; Johnson, Pontus; König, Johan; Chenine, Moustafa; Ekstedt, Mathias
2011-02-01
Errors in business processes result in poor data accuracy. This article proposes an architecture analysis method which utilises ArchiMate and the Probabilistic Relational Model formalism to model and analyse data accuracy. Since the resources available for architecture analysis are usually quite scarce, the method advocates interviews as the primary data collection technique. A case study demonstrates that the method yields correct data accuracy estimates and is more resource-efficient than a competing sampling-based data accuracy estimation method.
Two Paradoxes in Linear Regression Analysis
FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong
2016-01-01
Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214
Formal verification of software-based medical devices considering medical guidelines.
Daw, Zamira; Cleaveland, Rance; Vetter, Marcus
2014-01-01
Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.
ERIC Educational Resources Information Center
Ward, Ted W.; Herzog, William A., Jr.
This document is part of a series dealing with nonformal education. Introductory information is included in document SO 008 058. The focus of this report is on the learning effectiveness of nonformal education. Chapter 1 compares effective learning in a formal and nonformal environment. Chapter 2 develops a systems model for designers of learning…
Information visualisation based on graph models
NASA Astrophysics Data System (ADS)
Kasyanov, V. N.; Kasyanova, E. V.
2013-05-01
Information visualisation is a key component of support tools for many applications in science and engineering. A graph is an abstract structure that is widely used to model information for its visualisation. In this paper, we consider practical and general graph formalism called hierarchical graphs and present the Higres and Visual Graph systems aimed at supporting information visualisation on the base of hierarchical graph models.
NASA Astrophysics Data System (ADS)
Oblow, E. M.
1982-10-01
An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.
Reasoning with Vectors: A Continuous Model for Fast Robust Inference.
Widdows, Dominic; Cohen, Trevor
2015-10-01
This paper describes the use of continuous vector space models for reasoning with a formal knowledge base. The practical significance of these models is that they support fast, approximate but robust inference and hypothesis generation, which is complementary to the slow, exact, but sometimes brittle behavior of more traditional deduction engines such as theorem provers. The paper explains the way logical connectives can be used in semantic vector models, and summarizes the development of Predication-based Semantic Indexing, which involves the use of Vector Symbolic Architectures to represent the concepts and relationships from a knowledge base of subject-predicate-object triples. Experiments show that the use of continuous models for formal reasoning is not only possible, but already demonstrably effective for some recognized informatics tasks, and showing promise in other traditional problem areas. Examples described in this paper include: predicting new uses for existing drugs in biomedical informatics; removing unwanted meanings from search results in information retrieval and concept navigation; type-inference from attributes; comparing words based on their orthography; and representing tabular data, including modelling numerical values. The algorithms and techniques described in this paper are all publicly released and freely available in the Semantic Vectors open-source software package.
Reasoning with Vectors: A Continuous Model for Fast Robust Inference
Widdows, Dominic; Cohen, Trevor
2015-01-01
This paper describes the use of continuous vector space models for reasoning with a formal knowledge base. The practical significance of these models is that they support fast, approximate but robust inference and hypothesis generation, which is complementary to the slow, exact, but sometimes brittle behavior of more traditional deduction engines such as theorem provers. The paper explains the way logical connectives can be used in semantic vector models, and summarizes the development of Predication-based Semantic Indexing, which involves the use of Vector Symbolic Architectures to represent the concepts and relationships from a knowledge base of subject-predicate-object triples. Experiments show that the use of continuous models for formal reasoning is not only possible, but already demonstrably effective for some recognized informatics tasks, and showing promise in other traditional problem areas. Examples described in this paper include: predicting new uses for existing drugs in biomedical informatics; removing unwanted meanings from search results in information retrieval and concept navigation; type-inference from attributes; comparing words based on their orthography; and representing tabular data, including modelling numerical values. The algorithms and techniques described in this paper are all publicly released and freely available in the Semantic Vectors open-source software package.1 PMID:26582967
A discriminatory function for prediction of protein-DNA interactions based on alpha shape modeling.
Zhou, Weiqiang; Yan, Hong
2010-10-15
Protein-DNA interaction has significant importance in many biological processes. However, the underlying principle of the molecular recognition process is still largely unknown. As more high-resolution 3D structures of protein-DNA complex are becoming available, the surface characteristics of the complex become an important research topic. In our work, we apply an alpha shape model to represent the surface structure of the protein-DNA complex and developed an interface-atom curvature-dependent conditional probability discriminatory function for the prediction of protein-DNA interaction. The interface-atom curvature-dependent formalism captures atomic interaction details better than the atomic distance-based method. The proposed method provides good performance in discriminating the native structures from the docking decoy sets, and outperforms the distance-dependent formalism in terms of the z-score. Computer experiment results show that the curvature-dependent formalism with the optimal parameters can achieve a native z-score of -8.17 in discriminating the native structure from the highest surface-complementarity scored decoy set and a native z-score of -7.38 in discriminating the native structure from the lowest RMSD decoy set. The interface-atom curvature-dependent formalism can also be used to predict apo version of DNA-binding proteins. These results suggest that the interface-atom curvature-dependent formalism has a good prediction capability for protein-DNA interactions. The code and data sets are available for download on http://www.hy8.com/bioinformatics.htm kenandzhou@hotmail.com.
Advani, Aneel; Goldstein, Mary; Shahar, Yuval; Musen, Mark A
2003-01-01
Automated quality assessment of clinician actions and patient outcomes is a central problem in guideline- or standards-based medical care. In this paper we describe a model representation and algorithm for deriving structured quality indicators and auditing protocols from formalized specifications of guidelines used in decision support systems. We apply the model and algorithm to the assessment of physician concordance with a guideline knowledge model for hypertension used in a decision-support system. The properties of our solution include the ability to derive automatically context-specific and case-mix-adjusted quality indicators that can model global or local levels of detail about the guideline parameterized by defining the reliability of each indicator or element of the guideline.
Two Project-Based Strategies in an Interdisciplinary Mathematical Modeling in Biology Course
ERIC Educational Resources Information Center
Ludwig, Patrice; Tongen, Anthony; Walton, Brian
2018-01-01
James Madison University faculty team-teach an interdisciplinary mathematical modeling course for mathematics and biology students. We have used two different project-based approaches to emphasize the mathematical concepts taught in class, while also exposing students to new areas of mathematics not formally covered in class. The first method…
Petri net-based modelling of human-automation conflicts in aviation.
Pizziol, Sergio; Tessier, Catherine; Dehais, Frédéric
2014-01-01
Analyses of aviation safety reports reveal that human-machine conflicts induced by poor automation design are remarkable precursors of accidents. A review of different crew-automation conflicting scenarios shows that they have a common denominator: the autopilot behaviour interferes with the pilot's goal regarding the flight guidance via 'hidden' mode transitions. Considering both the human operator and the machine (i.e. the autopilot or the decision functions) as agents, we propose a Petri net model of those conflicting interactions, which allows them to be detected as deadlocks in the Petri net. In order to test our Petri net model, we designed an autoflight system that was formally analysed to detect conflicting situations. We identified three conflicting situations that were integrated in an experimental scenario in a flight simulator with 10 general aviation pilots. The results showed that the conflicts that we had a-priori identified as critical had impacted the pilots' performance. Indeed, the first conflict remained unnoticed by eight participants and led to a potential collision with another aircraft. The second conflict was detected by all the participants but three of them did not manage the situation correctly. The last conflict was also detected by all the participants but provoked typical automation surprise situation as only one declared that he had understood the autopilot behaviour. These behavioural results are discussed in terms of workload and number of fired 'hidden' transitions. Eventually, this study reveals that both formal and experimental approaches are complementary to identify and assess the criticality of human-automation conflicts. Practitioner Summary: We propose a Petri net model of human-automation conflicts. An experiment was conducted with general aviation pilots performing a scenario involving three conflicting situations to test the soundness of our formal approach. This study reveals that both formal and experimental approaches are complementary to identify and assess the criticality conflicts.
Object-oriented biomedical system modelling--the language.
Hakman, M; Groth, T
1999-11-01
The paper describes a new object-oriented biomedical continuous system modelling language (OOBSML). It is fully object-oriented and supports model inheritance, encapsulation, and model component instantiation and behaviour polymorphism. Besides the traditional differential and algebraic equation expressions the language includes also formal expressions for documenting models and defining model quantity types and quantity units. It supports explicit definition of model input-, output- and state quantities, model components and component connections. The OOBSML model compiler produces self-contained, independent, executable model components that can be instantiated and used within other OOBSML models and/or stored within model and model component libraries. In this way complex models can be structured as multilevel, multi-component model hierarchies. Technically the model components produced by the OOBSML compiler are executable computer code objects based on distributed object and object request broker technology. This paper includes both the language tutorial and the formal language syntax and semantic description.
State-and-transition model archetypes: a global taxonomy of rangeland change
USDA-ARS?s Scientific Manuscript database
State and transition models (STMs) synthesize science-based and local knowledge to formally represent the dynamics of rangeland and other ecosystems. Mental models or concepts of ecosystem dynamics implicitly underlie all management decisions in rangelands and thus how people influence rangeland sus...
ERIC Educational Resources Information Center
Savi Çakar, Firdevs; Tagay, Özlem
2017-01-01
This research is a descriptive study based on the testing of a structural model developed by considering the effects of perceived social support and subjective well-being on adolescents' risky behaviors, and the possible mediating role of self-esteem. Participants consisted of 676 high school students attending formal education institutions,…
A Formal Methods Approach to the Analysis of Mode Confusion
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.
2004-01-01
The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness , mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).
Modeling Cyber Conflicts Using an Extended Petri Net Formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakrzewska, Anita N; Ferragut, Erik M
2011-01-01
When threatened by automated attacks, critical systems that require human-controlled responses have difficulty making optimal responses and adapting protections in real- time and may therefore be overwhelmed. Consequently, experts have called for the development of automatic real-time reaction capabilities. However, a technical gap exists in the modeling and analysis of cyber conflicts to automatically understand the repercussions of responses. There is a need for modeling cyber assets that accounts for concurrent behavior, incomplete information, and payoff functions. Furthermore, we address this need by extending the Petri net formalism to allow real-time cyber conflicts to be modeled in a way thatmore » is expressive and concise. This formalism includes transitions controlled by players as well as firing rates attached to transitions. This allows us to model both player actions and factors that are beyond the control of players in real-time. We show that our formalism is able to represent situational aware- ness, concurrent actions, incomplete information and objective functions. These factors make it well-suited to modeling cyber conflicts in a way that allows for useful analysis. MITRE has compiled the Common Attack Pattern Enumera- tion and Classification (CAPEC), an extensive list of cyber attacks at various levels of abstraction. CAPEC includes factors such as attack prerequisites, possible countermeasures, and attack goals. These elements are vital to understanding cyber attacks and to generating the corresponding real-time responses. We demonstrate that the formalism can be used to extract precise models of cyber attacks from CAPEC. Several case studies show that our Petri net formalism is more expressive than other models, such as attack graphs, for modeling cyber conflicts and that it is amenable to exploring cyber strategies.« less
ERIC Educational Resources Information Center
Parkinson, Eric F.
2004-01-01
Construction kits have played a significant part in nurturing the growth and development of the minds and manipulation-based skills of children (and adults) in formal and non-formal education settings. These kits have origins rooted in the representation of the built world and now have a diversity of form and function, including technical versions…
Stochastic foundations in nonlinear density-regulation growth
NASA Astrophysics Data System (ADS)
Méndez, Vicenç; Assaf, Michael; Horsthemke, Werner; Campos, Daniel
2017-08-01
In this work we construct individual-based models that give rise to the generalized logistic model at the mean-field deterministic level and that allow us to interpret the parameters of these models in terms of individual interactions. We also study the effect of internal fluctuations on the long-time dynamics for the different models that have been widely used in the literature, such as the theta-logistic and Savageau models. In particular, we determine the conditions for population extinction and calculate the mean time to extinction. If the population does not become extinct, we obtain analytical expressions for the population abundance distribution. Our theoretical results are based on WKB theory and the probability generating function formalism and are verified by numerical simulations.
Social and Environmental Factors Related to Boys' and Girls' Park-Based Physical Activity.
Bocarro, Jason N; Floyd, Myron F; Smith, William R; Edwards, Michael B; Schultz, Courtney L; Baran, Perver; Moore, Robin A; Cosco, Nilda; Suau, Luis J
2015-06-18
Parks provide opportunities for physical activity for children. This study examined sex differences in correlates of park-based physical activity because differences may indicate that a standard environmental intervention to increase activity among children may not equally benefit boys and girls. The System for Observation Play and Recreation in Communities was used to measure physical activity among 2,712 children and adolescents in 20 neighborhood parks in Durham, North Carolina, in 2007. Sedentary activity, walking, vigorous park activity, and energy expenditure were the primary outcome variables. Hierarchical logit regression models of physical activity were estimated separately for boys and girls. Type of activity area and presence of other active children were positively associated with boys' and girls' physical activity, and presence of a parent was negatively associated. A significant interaction involving number of recreation facilities in combination with formal activities was positively associated with girls' activity. A significant interaction involving formal park activity and young boys (aged 0-5 y) was negatively associated with park-based physical activity. Activity area and social correlates of park-based physical activity were similar for boys and girls; findings for formal park programming, age, and number of facilities were mixed. Results show that girls' physical activity was more strongly affected by social effects (eg, presence of other active children) whereas boys' physical activity was more strongly influenced by the availability of park facilities. These results can inform park planning and design. Additional studies are necessary to clarify sex differences in correlates of park-based physical activity.
Social and Environmental Factors Related to Boys’ and Girls’ Park-Based Physical Activity
Floyd, Myron F.; Smith, William R.; Edwards, Michael B.; Schultz, Courtney L.; Baran, Perver; Moore, Robin A.; Cosco, Nilda; Suau, Luis J.
2015-01-01
Introduction Parks provide opportunities for physical activity for children. This study examined sex differences in correlates of park-based physical activity because differences may indicate that a standard environmental intervention to increase activity among children may not equally benefit boys and girls. Methods The System for Observation Play and Recreation in Communities was used to measure physical activity among 2,712 children and adolescents in 20 neighborhood parks in Durham, North Carolina, in 2007. Sedentary activity, walking, vigorous park activity, and energy expenditure were the primary outcome variables. Hierarchical logit regression models of physical activity were estimated separately for boys and girls. Results Type of activity area and presence of other active children were positively associated with boys’ and girls’ physical activity, and presence of a parent was negatively associated. A significant interaction involving number of recreation facilities in combination with formal activities was positively associated with girls’ activity. A significant interaction involving formal park activity and young boys (aged 0–5 y) was negatively associated with park-based physical activity. Conclusion Activity area and social correlates of park-based physical activity were similar for boys and girls; findings for formal park programming, age, and number of facilities were mixed. Results show that girls’ physical activity was more strongly affected by social effects (eg, presence of other active children) whereas boys’ physical activity was more strongly influenced by the availability of park facilities. These results can inform park planning and design. Additional studies are necessary to clarify sex differences in correlates of park-based physical activity. PMID:26086610
Unpacking buyer-seller differences in valuation from experience: A cognitive modeling approach.
Pachur, Thorsten; Scheibehenne, Benjamin
2017-12-01
People often indicate a higher price for an object when they own it (i.e., as sellers) than when they do not (i.e., as buyers)-a phenomenon known as the endowment effect. We develop a cognitive modeling approach to formalize, disentangle, and compare alternative psychological accounts (e.g., loss aversion, loss attention, strategic misrepresentation) of such buyer-seller differences in pricing decisions of monetary lotteries. To also be able to test possible buyer-seller differences in memory and learning, we study pricing decisions from experience, obtained with the sampling paradigm, where people learn about a lottery's payoff distribution from sequential sampling. We first formalize different accounts as models within three computational frameworks (reinforcement learning, instance-based learning theory, and cumulative prospect theory), and then fit the models to empirical selling and buying prices. In Study 1 (a reanalysis of published data with hypothetical decisions), models assuming buyer-seller differences in response bias (implementing a strategic-misrepresentation account) performed best; models assuming buyer-seller differences in choice sensitivity or memory (implementing a loss-attention account) generally fared worst. In a new experiment involving incentivized decisions (Study 2), models assuming buyer-seller differences in both outcome sensitivity (as proposed by a loss-aversion account) and response bias performed best. In both Study 1 and 2, the models implemented in cumulative prospect theory performed best. Model recovery studies validated our cognitive modeling approach, showing that the models can be distinguished rather well. In summary, our analysis supports a loss-aversion account of the endowment effect, but also reveals a substantial contribution of simple response bias.
Why formal learning theory matters for cognitive science.
Fulop, Sean; Chater, Nick
2013-01-01
This article reviews a number of different areas in the foundations of formal learning theory. After outlining the general framework for formal models of learning, the Bayesian approach to learning is summarized. This leads to a discussion of Solomonoff's Universal Prior Distribution for Bayesian learning. Gold's model of identification in the limit is also outlined. We next discuss a number of aspects of learning theory raised in contributed papers, related to both computational and representational complexity. The article concludes with a description of how semi-supervised learning can be applied to the study of cognitive learning models. Throughout this overview, the specific points raised by our contributing authors are connected to the models and methods under review. Copyright © 2013 Cognitive Science Society, Inc.
Zerlaut, Yann; Chemla, Sandrine; Chavane, Frederic; Destexhe, Alain
2018-02-01
Voltage-sensitive dye imaging (VSDi) has revealed fundamental properties of neocortical processing at macroscopic scales. Since for each pixel VSDi signals report the average membrane potential over hundreds of neurons, it seems natural to use a mean-field formalism to model such signals. Here, we present a mean-field model of networks of Adaptive Exponential (AdEx) integrate-and-fire neurons, with conductance-based synaptic interactions. We study a network of regular-spiking (RS) excitatory neurons and fast-spiking (FS) inhibitory neurons. We use a Master Equation formalism, together with a semi-analytic approach to the transfer function of AdEx neurons to describe the average dynamics of the coupled populations. We compare the predictions of this mean-field model to simulated networks of RS-FS cells, first at the level of the spontaneous activity of the network, which is well predicted by the analytical description. Second, we investigate the response of the network to time-varying external input, and show that the mean-field model predicts the response time course of the population. Finally, to model VSDi signals, we consider a one-dimensional ring model made of interconnected RS-FS mean-field units. We found that this model can reproduce the spatio-temporal patterns seen in VSDi of awake monkey visual cortex as a response to local and transient visual stimuli. Conversely, we show that the model allows one to infer physiological parameters from the experimentally-recorded spatio-temporal patterns.
ERIC Educational Resources Information Center
Richter, Dirk; Kunter, Mareike; Klusmann, Uta; Ludtke, Oliver; Baumert, Jurgen
2011-01-01
This study examined teachers' uptake of formal and informal learning opportunities across the career cycle. Analyses were based on data from 1939 German secondary teachers in 198 schools. Results showed that formal learning opportunities (in-service training) were used most frequently by mid-career teachers, whereas informal learning opportunities…
Archetypal dynamics, emergent situations, and the reality game.
Sulis, William
2010-07-01
The classical approach to the modeling of reality is founded upon its objectification. Although successful dealing with inanimate matter, objectification has proven to be much less successful elsewhere, sometimes to the point of paradox. This paper discusses an approach to the modeling of reality based upon the concept of process as formulated within the framework of archetypal dynamics. Reality is conceptualized as an intermingling of information-transducing systems, together with the semantic frames that effectively describe and ascribe meaning to each system, along with particular formal representations of same which constitute the archetypes. Archetypal dynamics is the study of the relationships between systems, frames and their representations and the flow of information among these different entities. In this paper a specific formal representation of archetypal dynamics using tapestries is given, and a dynamics is founded upon this representation in the form of a combinatorial game called a reality game. Some simple examples are presented.
NASA Technical Reports Server (NTRS)
Prescod-Weinstein, Chanda; Afshordi, Niayesh
2011-01-01
Structure formation provides a strong test of any cosmic acceleration model because a successful dark energy model must not inhibit or overpredict the development of observed large-scale structures. Traditional approaches to studies of structure formation in the presence of dark energy or a modified gravity implement a modified Press-Schechter formalism, which relates the linear overdensities to the abundance of dark matter haloes at the same time. We critically examine the universality of the Press-Schechter formalism for different cosmologies, and show that the halo abundance is best correlated with spherical linear overdensity at 94% of collapse (or observation) time. We then extend this argument to ellipsoidal collapse (which decreases the fractional time of best correlation for small haloes), and show that our results agree with deviations from modified Press-Schechter formalism seen in simulated mass functions. This provides a novel universal prescription to measure linear density evolution, based on current and future observations of cluster (or dark matter) halo mass function. In particular, even observations of cluster abundance in a single epoch will constrain the entire history of linear growth of cosmological of perturbations.
Rigorous simulations of a helical core fiber by the use of transformation optics formalism.
Napiorkowski, Maciej; Urbanczyk, Waclaw
2014-09-22
We report for the first time on rigorous numerical simulations of a helical-core fiber by using a full vectorial method based on the transformation optics formalism. We modeled the dependence of circular birefringence of the fundamental mode on the helix pitch and analyzed the effect of a birefringence increase caused by the mode displacement induced by a core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first order modes, including polarization and intensity distribution. Finally, we show that the use of the rigorous vectorial method allows to better predict the confinement loss of the guided modes compared to approximate methods based on equivalent in-plane bending models.
Anatomy and histology as socially networked learning environments: some preliminary findings.
Hafferty, Frederic W; Castellani, Brian; Hafferty, Philip K; Pawlina, Wojciech
2013-09-01
An exploratory study to better understand the "networked" life of the medical school as a learning environment. In a recent academic year, the authors gathered data during two six-week blocks of a sequential histology and anatomy course at a U.S. medical college. An eight-item questionnaire captured different dimensions of student interactions. The student cohort/network was 48 first-year medical students. Using social network analysis (SNA), the authors focused on (1) the initial structure and the evolution of informal class networks over time, (2) how informal class networks compare to formal in-class small-group assignments in influencing student information gathering, and (3) how peer assignment of professionalism role model status is shaped more by informal than formal ties. In examining these latter two issues, the authors explored not only how formal group assignment persisted over time but also how it functioned to prevent the tendency for groupings based on gender or ethnicity. The study revealed an evolving dynamic between the formal small-group learning structure of the course blocks and the emergence of informal student networks. For example, whereas formal group membership did influence in-class questions and did prevent formation of groups of like gender and ethnicity, outside-class questions and professionalism were influenced more by informal group ties where gender and, to a much lesser extent, ethnicity influence student information gathering. The richness of these preliminary findings suggests that SNA may be a useful tool in examining an array of medical student learning encounters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F
In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented usingmore » the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.« less
Dynamic Gate Product and Artifact Generation from System Models
NASA Technical Reports Server (NTRS)
Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris
2011-01-01
Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("Power-Point Engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "Power-Point Engineering" from model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one-another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and production of paper-based documents or their "office-productivity" file equivalents.
Formal analysis and evaluation of the back-off procedure in IEEE802.11P VANET
NASA Astrophysics Data System (ADS)
Jin, Li; Zhang, Guoan; Zhu, Xiaojun
2017-07-01
The back-off procedure is one of the media access control technologies in 802.11P communication protocol. It plays an important role in avoiding message collisions and allocating channel resources. Formal methods are effective approaches for studying the performances of communication systems. In this paper, we establish a discrete time model for the back-off procedure. We use Markov Decision Processes (MDPs) to model the non-deterministic and probabilistic behaviors of the procedure, and use the probabilistic computation tree logic (PCTL) language to express different properties, which ensure that the discrete time model performs their basic functionality. Based on the model and PCTL specifications, we study the effect of contention window length on the number of senders in the neighborhood of given receivers, and that on the station’s expected cost required by the back-off procedure to successfully send packets. The variation of the window length may increase or decrease the maximum probability of correct transmissions within a time contention unit. We propose to use PRISM model checker to describe our proposed back-off procedure for IEEE802.11P protocol in vehicle network, and define different probability properties formulas to automatically verify the model and derive numerical results. The obtained results are helpful for justifying the values of the time contention unit.
NASA Astrophysics Data System (ADS)
Rutherford, Lori B.
The purpose of this study was to (1) identify alternative conceptions concerning astronomy in groups of formal and informal educators, (2) discover the origins of some of these conceptions and (3) explore how practicing teachers planned to address the need for conceptual change in their students. In response to the first question, a number of alternative conceptions were identified in formal educators, with more for teachers of prekindergarten through third grade than fourth through twelfth grade teachers, and very few alternative conceptions in the informal educators group. In regards to the second research question, a number of origins were indicated: logic, books, elementary school, high school, astronomy classes, self-study and observation. In response to the third question, various practicing teachers used computer programs and modeling in order to address some of the alternative conceptions they noticed in their students. These findings were supported by the literature and theoretical frameworks on which the study was based. The study addressed gaps in the literature concerning alternative conceptions and how they related to Ohio's Academic Content Standards along with nineteen other states. This study also addressed the need for a closer examination of informal educators and how they compare to formal educators in terms of having alternative conceptions. And finally, implications and recommendations were made for practicing educators, materials for practicing educators, teacher education, informal and formal education partnerships, standards modification, research methodology and areas of future research.
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant Input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead the modeling based management decisions, if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
Verifying Hybrid Systems Modeled as Timed Automata: A Case Study
1997-03-01
Introduction Researchers have proposed many innovative formal methods for developing real - time systems [9]. Such methods can give system developers and...customers greater con dence that real - time systems satisfy their requirements, especially their crit- ical requirements. However, applying formal methods...specifying and reasoning about real - time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools
Formal analysis of imprecise system requirements with Event-B.
Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan
2016-01-01
Formal analysis of functional properties of system requirements needs precise descriptions. However, the stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. The Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for specification and verification of such requirements. First, we introduce a representation of imprecise requirements in the set theory. Then we make use of Event-B refinement providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of Crane Controller.
Determining informative priors for cognitive models.
Lee, Michael D; Vanpaemel, Wolf
2018-02-01
The development of cognitive models involves the creative scientific formalization of assumptions, based on theory, observation, and other relevant information. In the Bayesian approach to implementing, testing, and using cognitive models, assumptions can influence both the likelihood function of the model, usually corresponding to assumptions about psychological processes, and the prior distribution over model parameters, usually corresponding to assumptions about the psychological variables that influence those processes. The specification of the prior is unique to the Bayesian context, but often raises concerns that lead to the use of vague or non-informative priors in cognitive modeling. Sometimes the concerns stem from philosophical objections, but more often practical difficulties with how priors should be determined are the stumbling block. We survey several sources of information that can help to specify priors for cognitive models, discuss some of the methods by which this information can be formalized in a prior distribution, and identify a number of benefits of including informative priors in cognitive modeling. Our discussion is based on three illustrative cognitive models, involving memory retention, categorization, and decision making.
MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system
NASA Astrophysics Data System (ADS)
Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg
2005-01-01
We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and access tool for using it. Based on an analysis of requirements, we have selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow using it in broad range of applications, but increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently only described in textual form. Conformance in terms of semantics can thus not be evaluated automatically and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented a MPEG-7 library and a client/server document access infrastructure.
NASA Astrophysics Data System (ADS)
Nearing, G. S.
2014-12-01
Statistical models consistently out-perform conceptual models in the short term, however to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue (Solomonoff's theorem was digital) of Solomonoff's idea that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any given physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.
2015-11-01
28 2.3.4 Input/Output Automata ...various other modeling frameworks such as I/O Automata , Kahn Process Networks, Petri-nets, Multi-dimensional SDF, etc. are also used for designing...Formal Ideally suited to model DSP applications 3 Petri Nets Graphical Formal Used for modeling distributed systems 4 I/O Automata Both Formal
Ma, Q; Boulet, C
2016-06-14
The Robert-Bonamy formalism has been commonly used to calculate half-widths and shifts of spectral lines for decades. This formalism is based on several approximations. Among them, two have not been fully addressed: the isolated line approximation and the neglect of coupling between the translational and internal motions. Recently, we have shown that the isolated line approximation is not necessary in developing semi-classical line shape theories. Based on this progress, we have been able to develop a new formalism that enables not only to reduce uncertainties on calculated half-widths and shifts, but also to model line mixing effects on spectra starting from the knowledge of the intermolecular potential. In our previous studies, the new formalism had been applied to linear and asymmetric-top molecules. In the present study, the method has been extended to symmetric-top molecules with inversion symmetry. As expected, the inversion splitting induces a complete failure of the isolated line approximation. We have calculated the complex relaxation matrices of self-broadened NH3. The half-widths and shifts in the ν1 and the pure rotational bands are reported in the present paper. When compared with measurements, the calculated half-widths match the experimental data very well, since the inapplicable isolated line approximation has been removed. With respect to the shifts, only qualitative results are obtained and discussed. Calculated off-diagonal elements of the relaxation matrix and a comparison with the observed line mixing effects are reported in the companion paper (Paper II).
NASA Technical Reports Server (NTRS)
Ma, Q.; Boulet, C.
2016-01-01
The Robert-Bonamy formalism has been commonly used to calculate half-widths and shifts of spectral lines for decades. This formalism is based on several approximations. Among them, two have not been fully addressed: the isolated line approximation and the neglect of coupling between the translational and internal motions. Recently, we have shown that the isolated line approximation is not necessary in developing semi-classical line shape theories. Based on this progress, we have been able to develop a new formalism that enables not only to reduce uncertainties on calculated half-widths and shifts, but also to model line mixing effects on spectra starting from the knowledge of the intermolecular potential. In our previous studies, the new formalism had been applied to linear and asymmetric-top molecules. In the present study, the method has been extended to symmetric-top molecules with inversion symmetry. As expected, the inversion splitting induces a complete failure of the isolated line approximation. We have calculated the complex relaxation matrices of selfbroadened NH3. The half-widths and shifts in the ?1 and the pure rotational bands are reported in the present paper. When compared with measurements, the calculated half-widths match the experimental data very well, since the inapplicable isolated line approximation has been removed. With respect to the shifts, only qualitative results are obtained and discussed. Calculated off-diagonal elements of the relaxation matrix and a comparison with the observed line mixing effects are reported in the companion paper (Paper II).
2003-09-01
resolution M&S concept for integrating heterogeneous M&S into the hierarchy has existed since the early 1980s [DH92a, DH92b]. 25...groups [PAD78]. The need for credible M&S grew in the Nation’s private and public sectors. By 1980 , information from computer-based simulations...formal) identified in the [GMS+96 and RPG00]. We noted that systemic issues identi- fied in by reports, studies, and assessments the early 1980s
Petri Nets as Modeling Tool for Emergent Agents
NASA Technical Reports Server (NTRS)
Bergman, Marto
2004-01-01
Emergent agents, those agents whose local interactions can cause unexpected global results, require a method of modeling that is both dynamic and structured Petri Nets, a modeling tool developed for dynamic discrete event system of mainly functional agents, provide this, and have the benefit of being an established tool. We present here the details of the modeling method here and discuss how to implement its use for modeling agent-based systems. Petri Nets have been used extensively in the modeling of functional agents, those agents who have defined purposes and whose actions should result in a know outcome. However, emergent agents, those agents who have a defined structure but whose interaction causes outcomes that are unpredictable, have not yet found a modeling style that suits them. A problem with formally modeling emergent agents that any formal modeling style usually expects to show the results of a problem and the results of problems studied using emergent agents are not apparent from the initial construction. However, the study of emergent agents still requires a method to analyze the agents themselves, and have sensible conversation about the differences and similarities between types of emergent agents. We attempt to correct this problem by applying Petri Nets to the characterization of emergent agents. In doing so, the emergent properties of these agents can be highlighted, and conversation about the nature and compatibility of the differing methods of agent creation can begin.
A provably-secure ECC-based authentication scheme for wireless sensor networks.
Nam, Junghyun; Kim, Moonseong; Paik, Juryon; Lee, Youngsook; Won, Dongho
2014-11-06
A smart-card-based user authentication scheme for wireless sensor networks (in short, a SUA-WSN scheme) is designed to restrict access to the sensor data only to users who are in possession of both a smart card and the corresponding password. While a significant number of SUA-WSN schemes have been suggested in recent years, their intended security properties lack formal definitions and proofs in a widely-accepted model. One consequence is that SUA-WSN schemes insecure against various attacks have proliferated. In this paper, we devise a security model for the analysis of SUA-WSN schemes by extending the widely-accepted model of Bellare, Pointcheval and Rogaway (2000). Our model provides formal definitions of authenticated key exchange and user anonymity while capturing side-channel attacks, as well as other common attacks. We also propose a new SUA-WSN scheme based on elliptic curve cryptography (ECC), and prove its security properties in our extended model. To the best of our knowledge, our proposed scheme is the first SUA-WSN scheme that provably achieves both authenticated key exchange and user anonymity. Our scheme is also computationally competitive with other ECC-based (non-provably secure) schemes.
A Provably-Secure ECC-Based Authentication Scheme for Wireless Sensor Networks
Nam, Junghyun; Kim, Moonseong; Paik, Juryon; Lee, Youngsook; Won, Dongho
2014-01-01
A smart-card-based user authentication scheme for wireless sensor networks (in short, a SUA-WSN scheme) is designed to restrict access to the sensor data only to users who are in possession of both a smart card and the corresponding password. While a significant number of SUA-WSN schemes have been suggested in recent years, their intended security properties lack formal definitions and proofs in a widely-accepted model. One consequence is that SUA-WSN schemes insecure against various attacks have proliferated. In this paper, we devise a security model for the analysis of SUA-WSN schemes by extending the widely-accepted model of Bellare, Pointcheval and Rogaway (2000). Our model provides formal definitions of authenticated key exchange and user anonymity while capturing side-channel attacks, as well as other common attacks. We also propose a new SUA-WSN scheme based on elliptic curve cryptography (ECC), and prove its security properties in our extended model. To the best of our knowledge, our proposed scheme is the first SUA-WSN scheme that provably achieves both authenticated key exchange and user anonymity. Our scheme is also computationally competitive with other ECC-based (non-provably secure) schemes. PMID:25384009
Towards Symbolic Model Checking for Multi-Agent Systems via OBDDs
NASA Technical Reports Server (NTRS)
Raimondi, Franco; Lomunscio, Alessio
2004-01-01
We present an algorithm for model checking temporal-epistemic properties of multi-agent systems, expressed in the formalism of interpreted systems. We first introduce a technique for the translation of interpreted systems into boolean formulae, and then present a model-checking algorithm based on this translation. The algorithm is based on OBDD's, as they offer a compact and efficient representation for boolean formulae.
Formal Methods Case Studies for DO-333
NASA Technical Reports Server (NTRS)
Cofer, Darren; Miller, Steven P.
2014-01-01
RTCA DO-333, Formal Methods Supplement to DO-178C and DO-278A provides guidance for software developers wishing to use formal methods in the certification of airborne systems and air traffic management systems. The supplement identifies the modifications and additions to DO-178C and DO-278A objectives, activities, and software life cycle data that should be addressed when formal methods are used as part of the software development process. This report presents three case studies describing the use of different classes of formal methods to satisfy certification objectives for a common avionics example - a dual-channel Flight Guidance System. The three case studies illustrate the use of theorem proving, model checking, and abstract interpretation. The material presented is not intended to represent a complete certification effort. Rather, the purpose is to illustrate how formal methods can be used in a realistic avionics software development project, with a focus on the evidence produced that could be used to satisfy the verification objectives found in Section 6 of DO-178C.
Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd
2014-01-01
Healthcare information is distributed through multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources are a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to Web Ontology Language. The proposed service makes use of an algorithm that allows to transform several data models of different domains by deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
Decker, Johannes H; Otto, A Ross; Daw, Nathaniel D; Hartley, Catherine A
2016-06-01
Theoretical models distinguish two decision-making strategies that have been formalized in reinforcement-learning theory. A model-based strategy leverages a cognitive model of potential actions and their consequences to make goal-directed choices, whereas a model-free strategy evaluates actions based solely on their reward history. Research in adults has begun to elucidate the psychological mechanisms and neural substrates underlying these learning processes and factors that influence their relative recruitment. However, the developmental trajectory of these evaluative strategies has not been well characterized. In this study, children, adolescents, and adults performed a sequential reinforcement-learning task that enabled estimation of model-based and model-free contributions to choice. Whereas a model-free strategy was apparent in choice behavior across all age groups, a model-based strategy was absent in children, became evident in adolescents, and strengthened in adults. These results suggest that recruitment of model-based valuation systems represents a critical cognitive component underlying the gradual maturation of goal-directed behavior. © The Author(s) 2016.
2011-01-01
Background The few studies that have attempted to estimate the future cost of caring for people with dementia in Australia are typically based on total prevalence and the cost per patient over the average duration of illness. However, costs associated with dementia care also vary according to the length of the disease, severity of symptoms and type of care provided. This study aimed to determine more accurately the future costs of dementia management by taking these factors into consideration. Methods The current study estimated the prevalence of dementia in Australia (2010-2040). Data from a variety of sources was recalculated to distribute this prevalence according to the location (home/institution), care requirements (informal/formal), and dementia severity. The cost of care was attributed to redistributed prevalences and used in prediction of future costs of dementia. Results Our computer modeling indicates that the ratio between the prevalence of people with mild/moderate/severe dementia will change over the three decades from 2010 to 2040 from 50/30/20 to 44/32/24. Taking into account the severity of symptoms, location of care and cost of care per hour, the current study estimates that the informal cost of care in 2010 is AU$3.2 billion and formal care at AU$5.0 billion per annum. By 2040 informal care is estimated to cost AU$11.6 billion and formal care $AU16.7 billion per annum. Interventions to slow disease progression will result in relative savings of 5% (AU$1.5 billion) per annum and interventions to delay disease onset will result in relative savings of 14% (AU$4 billion) of the cost per annum. With no intervention, the projected combined annual cost of formal and informal care for a person with dementia in 2040 will be around AU$38,000 (in 2010 dollars). An intervention to delay progression by 2 years will see this reduced to AU$35,000. Conclusions These findings highlight the need to account for more than total prevalence when estimating the costs of dementia care. While the absolute values of cost of care estimates are subject to the validity and reliability of currently available data, dynamic systems modeling allows for future trends to be estimated. PMID:21988908
Scoping Planning Agents With Shared Models
NASA Technical Reports Server (NTRS)
Bedrax-Weiss, Tania; Frank, Jeremy D.; Jonsson, Ari K.; McGann, Conor
2003-01-01
In this paper we provide a formal framework to define the scope of planning agents based on a single declarative model. Having multiple agents sharing a single model provides numerous advantages that lead to reduced development costs and increase reliability of the system. We formally define planning in terms of extensions of an initial partial plan, and a set of flaws that make the plan unacceptable. A Flaw Filter (FF) allows us to identify those flaws relevant to an agent. Flaw filters motivate the Plan Identification Function (PIF), which specifies when an agent is is ready hand control to another agent for further work. PIFs define a set of plan extensions that can be generated from a model and a plan request. FFs and PIFs can be used to define the scope of agents without changing the model. We describe an implementation of PIFsand FFswithin the context of EUROPA, a constraint-based planning architecture, and show how it can be used to easily design many different agents.
Peeters, José M; Pot, Anne Margriet; de Lange, Jacomine; Spreeuwenberg, Peter M; Francke, Anneke L
2016-03-09
In the Netherlands, various organisational models of dementia case management exist. In this study the following four models are distinguished, based on differences in the availability of the service and in the case management function: Model 1: the case management service is available from first dementia symptoms + is always a separate specialist function; Model 2: the case management service is only available after a formal dementia diagnosis + is always a separate specialist function; Model 3: the case management service is available from first dementia symptoms + is often a combined function; Model 4: the case management service is only available after a formal dementia diagnosis + is often a combined function. The objectives of this study are to give insight into whether satisfaction with dementia case management and the development of caregiver burden depend on the organisational model. A survey was carried out in regional dementia care networks in the Netherlands among 554 informal carers for people with dementia at the start of case management (response of 85 %), and one year later. Descriptive statistics and multilevel models were used to analyse the data. The satisfaction with the case manager was high in general (an average of 8.0 within a possible range of 1 to 10), although the caregiver burden did not decrease in the first year after starting with case management. No differences were found between the four organisational models regarding the development of caregiver burden. However, statistically significant differences (p < 0.05) were found regarding satisfaction: informal carers in the organisational model where case management is only available after formal diagnosis of dementia and is often a combined function had on average the lowest satisfaction scores. Nevertheless, the satisfaction of informal carers within all organisational models was high (ranging from 7.51 to 8.40 within a range of 1 to 10). Organisational features of case management seem to make little or no difference to the development in caregiver burden and the satisfaction of informal carers. Future research is needed to explore whether the individual characteristics of the case managers themselves are associated with case management outcomes.
Towards a formal semantics for Ada 9X
NASA Technical Reports Server (NTRS)
Guaspari, David; Mchugh, John; Wolfgang, Polak; Saaltink, Mark
1995-01-01
The Ada 9X language precision team was formed during the revisions of Ada 83, with the goal of analyzing the proposed design, identifying problems, and suggesting improvements, through the use of mathematical models. This report defines a framework for formally describing Ada 9X, based on Kahn's 'natural semantics', and applies the framework to portions of the language. The proposals for exceptions and optimization freedoms are also analyzed, using a different technique.
Non-Formal Vocational Education in Uganda: Practical Empowerment through a Workable Alternative
ERIC Educational Resources Information Center
Blaak, Marit; Openjuru, George L.; Zeelen, Jacques
2013-01-01
This article reflects on the potential of non-formal vocational education in Uganda to improve the quality of life of those excluded from formal education. Based on an exploration of humanizing development theorists Sen, Freire and Nyerere, together with two case studies, practical empowerment is described as a desirable outcome of education for…
The kinematic dynamo problem, part I: analytical treatment with the Bullard-Gellman formalism
NASA Astrophysics Data System (ADS)
Glane, Sebastian; Reich, Felix A.; Müller, Wolfgang H.
2018-03-01
This paper is dedicated to the description of kinematic dynamo action in a sphere and its analytical treatment with the uc(Bullard)-uc(Gellman) formalism. One goal of dynamo theory is to answer the question: Can magnetic fields of stellar objects be generated or sustained due to (fluid) motion in the interior? uc(Bullard) and uc(Gellman) were among the first to study this question, leading the way for many subsequent studies, cf. Bullard (Philos Trans R Soc A 247(928):213-278, 1954). In their publication the differential equations resulting from a toroidal-poloidal decomposition of the velocity and magnetic field are stated without an in-depth discussion of the employed methods and computation steps. This study derives the necessary formalism in a compact and concise manner by using an operator-based approach. The focus lies on the mathematical steps and necessary properties of the considered formalism. Prior to that a derivation of the induction equation is presented based on rational continuum electrodynamics. As an example of the formalism the decay of two magnetic fields is analyzed.
Inductive reasoning about causally transmitted properties.
Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B
2008-11-01
Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.
NASA Technical Reports Server (NTRS)
Bolton, Matthew L.; Bass, Ellen J.
2009-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.
A 3D generic inverse dynamic method using wrench notation and quaternion algebra.
Dumas, R; Aissaoui, R; de Guise, J A
2004-06-01
In the literature, conventional 3D inverse dynamic models are limited in three aspects related to inverse dynamic notation, body segment parameters and kinematic formalism. First, conventional notation yields separate computations of the forces and moments with successive coordinate system transformations. Secondly, the way conventional body segment parameters are defined is based on the assumption that the inertia tensor is principal and the centre of mass is located between the proximal and distal ends. Thirdly, the conventional kinematic formalism uses Euler or Cardanic angles that are sequence-dependent and suffer from singularities. In order to overcome these limitations, this paper presents a new generic method for inverse dynamics. This generic method is based on wrench notation for inverse dynamics, a general definition of body segment parameters and quaternion algebra for the kinematic formalism.
Patch models and their applications to multivehicle command and control.
Rao, Venkatesh G; D'Andrea, Raffaello
2007-06-01
We introduce patch models, a computational modeling formalism for multivehicle combat domains, based on spatiotemporal abstraction methods developed in the computer science community. The framework yields models that are expressive enough to accommodate nontrivial controlled vehicle dynamics while being within the representational capabilities of common artificial intelligence techniques used in the construction of autonomous systems. The framework allows several key design requirements of next-generation network-centric command and control systems, such as maintenance of shared situation awareness, to be achieved. Major features include support for multiple situation models at each decision node and rapid mission plan adaptation. We describe the formal specification of patch models and our prototype implementation, i.e., Patchworks. The capabilities of patch models are validated through a combat mission simulation in Patchworks, which involves two defending teams protecting a camp from an enemy attacking team.
A Model-Driven Approach to e-Course Management
ERIC Educational Resources Information Center
Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana
2018-01-01
This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…
Extension of specification language for soundness and completeness of service workflow
NASA Astrophysics Data System (ADS)
Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn
2018-05-01
A Service Workflow is an aggregation of distributed services to fulfill specific functionalities. With ever increasing available services, the methodologies for the selections of the services against the given requirements become main research subjects in multiple disciplines. A few of researchers have contributed to the formal specification languages and the methods for model checking; however, existing methods have the difficulties to tackle with the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of the workflow composition. To this end, we extend a specification language with the consideration of formal logic, so that some effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec has been demonstrated by the examples with the addressed soundness, completeness, and consistency.
An object-oriented approach for harmonization of multimedia markup languages
NASA Astrophysics Data System (ADS)
Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay
2003-12-01
An object-oriented methodology is proposed to harmonize several different markup languages in this research. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and the process of the harmonization process between the eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology. This methodology can be generalized to various application domains.
Estimating neural response functions from fMRI
Kumar, Sukhbinder; Penny, William
2014-01-01
This paper proposes a methodology for estimating Neural Response Functions (NRFs) from fMRI data. These NRFs describe non-linear relationships between experimental stimuli and neuronal population responses. The method is based on a two-stage model comprising an NRF and a Hemodynamic Response Function (HRF) that are simultaneously fitted to fMRI data using a Bayesian optimization algorithm. This algorithm also produces a model evidence score, providing a formal model comparison method for evaluating alternative NRFs. The HRF is characterized using previously established “Balloon” and BOLD signal models. We illustrate the method with two example applications based on fMRI studies of the auditory system. In the first, we estimate the time constants of repetition suppression and facilitation, and in the second we estimate the parameters of population receptive fields in a tonotopic mapping study. PMID:24847246
Modeling and Verification of Dependable Electronic Power System Architecture
NASA Astrophysics Data System (ADS)
Yuan, Ling; Fan, Ping; Zhang, Xiao-fang
The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems to generate, transmit, and distribute electric power. The complex interaction among sub-systems makes the design of electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electronic power, the fault tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, the incorporation makes the design of such system more complicated. We propose a dependable electronic power system architecture, which can provide a generic framework to guide the development of electronic power system to ease the development complexity. In order to provide common idioms and patterns to the system *designers, we formally model the electronic power system architecture by using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault tolerant properties of the system architecture by using the PVS theorem prover, which can guarantee that the system architecture can satisfy high reliability requirements.
NASA Astrophysics Data System (ADS)
Piao, Chunhui; Han, Xufang; Wu, Harris
2010-08-01
We provide a formal definition of an e-commerce transaction network. Agent-based modelling is used to simulate e-commerce transaction networks. For real-world analysis, we studied the open application programming interfaces (APIs) from eBay and Taobao e-commerce websites and captured real transaction data. Pajek is used to visualise the agent relationships in the transaction network. We derived one-mode networks from the transaction network and analysed them using degree and betweenness centrality. Integrating multi-agent modelling, open APIs and social network analysis, we propose a new way to study large-scale e-commerce systems.
Advani, Aneel; Goldstein, Mary; Shahar, Yuval; Musen, Mark A.
2003-01-01
Automated quality assessment of clinician actions and patient outcomes is a central problem in guideline- or standards-based medical care. In this paper we describe a model representation and algorithm for deriving structured quality indicators and auditing protocols from formalized specifications of guidelines used in decision support systems. We apply the model and algorithm to the assessment of physician concordance with a guideline knowledge model for hypertension used in a decision-support system. The properties of our solution include the ability to derive automatically (1) context-specific and (2) case-mix-adjusted quality indicators that (3) can model global or local levels of detail about the guideline (4) parameterized by defining the reliability of each indicator or element of the guideline. PMID:14728124
Merging first principle structure studies and few-body reaction formalism
NASA Astrophysics Data System (ADS)
Crespo, R.; Cravo, E.; Arriaga, A.; Wiringa, R.; Deltuva, A.; Diego, R.
2018-02-01
Calculations for nucleon knockout from a 7Li beam due to the collision with a proton target at 400 MeV/u are shown based on ab initio Quantum Monte Carlo (QMC) and conventional shell-model nuclear structure approaches to describe the relative motion between the knockout particle and the heavy fragment of the projectile. Structure effects on the total cross section are shown.
Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems
NASA Technical Reports Server (NTRS)
Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)
2000-01-01
We have recently completed a pilot study on the Space shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center, As the space shuttle ages, it is experiencing wiring degradation problems including arcing, chaffing insulation breakdown and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model based analysis for risk-assessment and fault coverage analysis. However. wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems. Inc. (QSI) to explore means of automatically extracting high fidelity multi-signal models from wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
Pereira, José N; Silva, Porfírio; Lima, Pedro U; Martinoli, Alcherio
2014-01-01
The work described is part of a long term program of introducing institutional robotics, a novel framework for the coordination of robot teams that stems from institutional economics concepts. Under the framework, institutions are cumulative sets of persistent artificial modifications made to the environment or to the internal mechanisms of a subset of agents, thought to be functional for the collective order. In this article we introduce a formal model of institutional controllers based on Petri nets. We define executable Petri nets-an extension of Petri nets that takes into account robot actions and sensing-to design, program, and execute institutional controllers. We use a generalized stochastic Petri net view of the robot team controlled by the institutional controllers to model and analyze the stochastic performance of the resulting distributed robotic system. The ability of our formalism to replicate results obtained using other approaches is assessed through realistic simulations of up to 40 e-puck robots. In particular, we model a robot swarm and its institutional controller with the goal of maintaining wireless connectivity, and successfully compare our model predictions and simulation results with previously reported results, obtained by using finite state automaton models and controllers.
2013-01-01
Background To formulate sustainable long-term care policies, it is critical first to understand the relationship between informal care and formal care expenditure. The aim of this paper is to examine to what extent informal care reduces public expenditure on elderly care. Methods Data from a geriatric rehabilitation program conducted in Finland (Age Study, n = 732) were used to estimate the annual public care expenditure on elderly care. We first constructed hierarchical multilevel regression models to determine the factors associated with elderly care expenditure. Second, we calculated the adjusted mean costs of care in four care patterns: 1) informal care only for elderly living alone; 2) informal care only from a co-resident family member; 3) a combination of formal and informal care; and 4) formal care only. We included functional independence and health-related quality of life (15D score) measures into our models. This method standardizes the care needs of a heterogeneous subject group and enabled us to compare expenditure among various care categories even when differences were observed in the subjects’ physical health. Results Elder care that consisted of formal care only had the highest expenditure at 25,300 Euros annually. The combination of formal and informal care had an annual expenditure of 22,300 Euros. If a person received mainly informal care from a co-resident family member, then the annual expenditure was only 4,900 Euros and just 6,000 Euros for a person living alone and receiving informal care. Conclusions Our analysis of a frail elderly Finnish population shows that the availability of informal care considerably reduces public care expenditure. Therefore, informal care should be taken into account when formulating policies for long-term care. The process whereby families choose to provide care for their elderly relatives has a significant impact on long-term care expenditure. PMID:23947622
A systemic approach for modeling biological evolution using Parallel DEVS.
Heredia, Daniel; Sanz, Victorino; Urquia, Alfonso; Sandín, Máximo
2015-08-01
A new model for studying the evolution of living organisms is proposed in this manuscript. The proposed model is based on a non-neodarwinian systemic approach. The model is focused on considering several controversies and open discussions about modern evolutionary biology. Additionally, a simplification of the proposed model, named EvoDEVS, has been mathematically described using the Parallel DEVS formalism and implemented as a computer program using the DEVSLib Modelica library. EvoDEVS serves as an experimental platform to study different conditions and scenarios by means of computer simulations. Two preliminary case studies are presented to illustrate the behavior of the model and validate its results. EvoDEVS is freely available at http://www.euclides.dia.uned.es. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Formal Methods of V&V of Partial Specifications: An Experience Report
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe an experiment in the application of the method SCR. to testing for consistency properties of a partial model of requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification is valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
Revisiting the PLUMBER Experiments from a Process-Diagnostics Perspective
NASA Astrophysics Data System (ADS)
Nearing, G. S.; Ruddell, B. L.; Clark, M. P.; Nijssen, B.; Peters-Lidard, C. D.
2017-12-01
The PLUMBER benchmarking experiments [1] showed that some of the most sophisticated land models (CABLE, CH-TESSEL, COLA-SSiB, ISBA-SURFEX, JULES, Mosaic, Noah, ORCHIDEE) were outperformed - in simulations of half-hourly surface energy fluxes - by instantaneous, out-of-sample, and globally-stationary regressions with no state memory. One criticism of PLUMBER is that the benchmarking methodology was not derived formally, so that applying a similar methodology with different performance metrics can result in qualitatively different results. Another common criticism of model intercomparison projects in general is that they offer little insight into process-level deficiencies in the models, and therefore are of marginal value for helping to improve the models. We address both of these issues by proposing a formal benchmarking methodology that also yields a formal and quantitative method for process-level diagnostics. We apply this to the PLUMBER experiments to show that (1) the PLUMBER conclusions were generally correct - the models use only a fraction of the information available to them from met forcing data (<50% by our analysis), and (2) all of the land models investigated by PLUMBER have similar process-level error structures, and therefore together do not represent a meaningful sample of structural or epistemic uncertainty. We conclude by suggesting two ways to improve the experimental design of model intercomparison and/or model benchmarking studies like PLUMBER. First, PLUMBER did not report model parameter values, and it is necessary to know these values to separate parameter uncertainty from structural uncertainty. This is a first order requirement if we want to use intercomparison studies to provide feedback to model development. Second, technical documentation of land models is inadequate. Future model intercomparison projects should begin with a collaborative effort by model developers to document specific differences between model structures. This could be done in a reproducible way using a unified, process-flexible system like SUMMA [2]. [1] Best, M.J. et al. (2015) 'The plumbing of land surface models: benchmarking model performance', J. Hydrometeor. [2] Clark, M.P. et al. (2015) 'A unified approach for process-based hydrologic modeling: 1. Modeling concept', Water Resour. Res.
Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web
NASA Astrophysics Data System (ADS)
Huang, Hong; Gong, Jianya
2008-12-01
GML can only achieve geospatial interoperation at syntactic level. However, it is necessary to resolve difference of spatial cognition in the first place in most occasions, so ontology was introduced to describe geospatial information and services. But it is obviously difficult and improper to let users to find, match and compose services, especially in some occasions there are complicated business logics. Currently, with the gradual introduction of Semantic Web technology (e.g., OWL, SWRL), the focus of the interoperation of geospatial information has shifted from syntactic level to Semantic and even automatic, intelligent level. In this way, Geospatial Semantic Web (GSM) can be put forward as an augmentation to the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of GSM, we first attempt to construct the mechanism of modeling and formal representation of geospatial knowledge, which are also two mostly foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatical: we argue that geospatial context is a formal model of the discriminate environment characters of geospatial knowledge, and the derivation, understanding and using of geospatial knowledge are located in geospatial context. Therefore, first, we put forward a primitive hierarchy of geospatial knowledge referencing first order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Thirdly, we extend Web Process Service (WPS) to be compatible with local DLL for geoprocessing and possess inference capability based on OWL.
Ontology driven modeling for the knowledge of genetic susceptibility to disease.
Lin, Yu; Sakamoto, Norihiro
2009-05-12
For the machine helped exploring the relationships between genetic factors and complex diseases, a well-structured conceptual framework of the background knowledge is needed. However, because of the complexity of determining a genetic susceptibility factor, there is no formalization for the knowledge of genetic susceptibility to disease, which makes the interoperability between systems impossible. Thus, the ontology modeling language OWL was used for formalization in this paper. After introducing the Semantic Web and OWL language propagated by W3C, we applied text mining technology combined with competency questions to specify the classes of the ontology. Then, an N-ary pattern was adopted to describe the relationships among these defined classes. Based on the former work of OGSF-DM (Ontology of Genetic Susceptibility Factors to Diabetes Mellitus), we formalized the definition of "Genetic Susceptibility", "Genetic Susceptibility Factor" and other classes by using OWL-DL modeling language; and a reasoner automatically performed the classification of the class "Genetic Susceptibility Factor". The ontology driven modeling is used for formalization the knowledge of genetic susceptibility to complex diseases. More importantly, when a class has been completely formalized in an ontology, the OWL reasoning can automatically compute the classification of the class, in our case, the class of "Genetic Susceptibility Factors". With more types of genetic susceptibility factors obtained from the laboratory research, our ontologies always needs to be refined, and many new classes must be taken into account to harmonize with the ontologies. Using the ontologies to develop the semantic web needs to be applied in the future.
Formalized landscape models for surveying and modelling tasks
NASA Astrophysics Data System (ADS)
Löwner, Marc-Oliver
2010-05-01
We present a formalization of main geomorphic landscape models, mainly the concept of slopes, to clarify the needs and potentials of surveying technologies and modelling approaches. Using the Unified Modelling Language (UML) it is implemented as a exchangeable Geography Markup Language (GML3) -based application schema and therefore supports shared measurement campaigns. Today, knowledge in Geomorphology is given synoptically in textbooks in a more or less lyrical way. This knowledge is hard to implement for the use of modelling algorithms or data storage and sharing questions. On the other hand physical based numerical modelling and high resolution surveying technologies enable us to investigate case scenarios within small scales. Bringing together such approaches and organizing our data in an appropriate way will need the formalization of the concepts and knowledge that is archived in the science of geomorphology. The main problem of comparing research results in geomorphology but is that the objects under investigation are composed of 3-dimensional geometries that change in time due to processes of material fluxes, e. g. soil erosion or mass movements. They have internal properties, e. g. soil texture or bulk density, that determine the effectiveness of these processes but are under change as well. The presented application schema is available on the Internet and therefore a first step to enable researchers to share information using an OGC's Web feature service. In this vein comparing modelling results of landscape evolution with results of other scientist's observations is possible. Compared to prevalent data concepts the model presented makes it possible to store information about landforms, their geometry and the characteristics in more detail. It allows to represent the 3D-geometry, the set of material properties and the genesis of a landform by associating processes to a geoobject. Thus, time slices of a geomorphic system can be represented as well as scenarios of landscape modelling. Commercial GI-software is not adapted to the needs of the science of geomorphology. Therefore the development of an application model i. e. a formal description of semantics is imperative to partake in technologies like Web Feature Services supporting interoperable data transfer.
Bramness, Jørgen G; Walby, Fredrik A; Morken, Gunnar; Røislien, Jo
2015-08-01
Seasonal variation in the number of suicides has long been acknowledged. It has been suggested that this seasonality has declined in recent years, but studies have generally used statistical methods incapable of confirming this. We examined all suicides occurring in Norway during 1969-2007 (more than 20,000 suicides in total) to establish whether seasonality decreased over time. Fitting of additive Fourier Poisson time-series regression models allowed for formal testing of a possible linear decrease in seasonality, or a reduction at a specific point in time, while adjusting for a possible smooth nonlinear long-term change without having to categorize time into discrete yearly units. The models were compared using Akaike's Information Criterion and analysis of variance. A model with a seasonal pattern was significantly superior to a model without one. There was a reduction in seasonality during the period. Both the model assuming a linear decrease in seasonality and the model assuming a change at a specific point in time were both superior to a model assuming constant seasonality, thus confirming by formal statistical testing that the magnitude of the seasonality in suicides has diminished. The additive Fourier Poisson time-series regression model would also be useful for studying other temporal phenomena with seasonal components. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Abstracting event-based control models for high autonomy systems
NASA Technical Reports Server (NTRS)
Luh, Cheng-Jye; Zeigler, Bernard P.
1993-01-01
A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines Math- Works and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
Using software security analysis to verify the secure socket layer (SSL) protocol
NASA Technical Reports Server (NTRS)
Powell, John D.
2004-01-01
nal Aeronautics and Space Administration (NASA) have tens of thousands of networked computer systems and applications. Software Security vulnerabilities present risks such as lost or corrupted data, information the3, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Trough an Integrated Approach '' offers, among its capabilities, formal verification of software security properties, through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provide automation of the mechanical portions of the analysis process. This paper will discuss: The need for formal analysis to assure software systems with respect to software and why testing alone cannot provide it. The means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task. An example of FMF style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.
Beware the tail that wags the dog: informal and formal models in biology
Gunawardena, Jeremy
2014-01-01
Informal models have always been used in biology to guide thinking and devise experiments. In recent years, formal mathematical models have also been widely introduced. It is sometimes suggested that formal models are inherently superior to informal ones and that biology should develop along the lines of physics or economics by replacing the latter with the former. Here I suggest to the contrary that progress in biology requires a better integration of the formal with the informal. PMID:25368417
NASA Astrophysics Data System (ADS)
Cook, B.; Anchukaitis, K. J.
2017-12-01
Comparative analyses of paleoclimate reconstructions and climate model simulations can provide valuable insights into past and future climate events. Conducting meaningful and quantitative comparisons, however, can be difficult for a variety of reasons. Here, we use tree-ring based hydroclimate reconstructions to discuss some best practices for paleoclimate-model comparisons, highlighting recent studies that have successfully used this approach. These analyses have improved our understanding of the Medieval-era megadroughts, ocean forcing of large scale drought patterns, and even climate change contributions to future drought risk. Additional work is needed, however, to better reconcile and formalize uncertainties across observed, modeled, and reconstructed variables. In this regard, process based forward models of proxy-systems will likely be a critical tool moving forward.
TOPICS IN THEORY OF GENERALIZED PARTON DISTRIBUTIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radyushkin, Anatoly V.
Several topics in the theory of generalized parton distributions (GPDs) are reviewed. First, we give a brief overview of the basics of the theory of generalized parton distributions and their relationship with simpler phenomenological functions, viz. form factors, parton densities and distribution amplitudes. Then, we discuss recent developments in building models for GPDs that are based on the formalism of double distributions (DDs). A special attention is given to a careful analysis of the singularity structure of DDs. The DD formalism is applied to construction of a model GPDs with a singular Regge behavior. Within the developed DD-based approach, wemore » discuss the structure of GPD sum rules. It is shown that separation of DDs into the so-called ``plus'' part and the $D$-term part may be treated as a renormalization procedure for the GPD sum rules. This approach is compared with an alternative prescription based on analytic regularization.« less
ERIC Educational Resources Information Center
van Compernolle, Rémi A.
2017-01-01
Drawing on data collected via a web-based survey, the study investigates the relationship between preferences for (in)formal language and attitudes toward linguistic variation among a large group of monolingual and multilingual adults (n = 379). Also explored are the links between preferences for (in)formal language and several secondary…
Ren, Yue; Li, Jinhai; Aswani Kumar, Cherukuri; Liu, Wenqi
2014-01-01
Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: "if conditions 1,2,…, and m hold, then decisions hold." In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency.
Ren, Yue; Aswani Kumar, Cherukuri; Liu, Wenqi
2014-01-01
Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: “if conditions 1,2,…, and m hold, then decisions hold.” In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency. PMID:25165744
ERIC Educational Resources Information Center
Towne, Forrest S.
2009-01-01
Current domestic and international comparative studies of student achievement in science are demonstrating that the U.S. needs to improve science education if it wants to remain competitive in the global economy. One of the causes of the poor performance of U.S. science education is the lack of students who have developed the formal thinking…
ERIC Educational Resources Information Center
Brown, Eleanor J.
2015-01-01
This paper presents comparative case studies of non-formal development education by non-profit organisations in two European countries. The study aimed to explore the extent to which such activities provide opportunities for transformative learning. The research was qualitative and began with interviews with educators across 14 organisations in…
A complex fermionic tensor model in d dimensions
NASA Astrophysics Data System (ADS)
Prakash, Shiroman; Sinha, Ritam
2018-02-01
In this note, we study a melonic tensor model in d dimensions based on three-index Dirac fermions with a four-fermion interaction. Summing the melonic diagrams at strong coupling allows one to define a formal large- N saddle point in arbitrary d and calculate the spectrum of scalar bilinear singlet operators. For d = 2 - ɛ the theory is an infrared fixed point, which we find has a purely real spectrum that we determine numerically for arbitrary d < 2, and analytically as a power series in ɛ. The theory appears to be weakly interacting when ɛ is small, suggesting that fermionic tensor models in 1-dimension can be studied in an ɛ expansion. For d > 2, the spectrum can still be calculated using the saddle point equations, which may define a formal large- N ultraviolet fixed point analogous to the Gross-Neveu model in d > 2. For 2 < d < 6, we find that the spectrum contains at least one complex scalar eigenvalue (similar to the complex eigenvalue present in the bosonic tensor model recently studied by Giombi, Klebanov and Tarnopolsky) which indicates that the theory is unstable. We also find that the fixed point is weakly-interacting when d = 6 (or more generally d = 4 n + 2) and has a real spectrum for 6 < d < 6 .14 which we present as a power series in ɛ in 6 + ɛ dimensions.
ERIC Educational Resources Information Center
Grey, Simon; Grey, David; Gordon, Neil; Purdy, Jon
2017-01-01
This paper offers an approach to designing game-based learning experiences inspired by the Mechanics-Dynamics-Aesthetics (MDA) model (Hunicke et al., 2004) and the elemental tetrad model (Schell, 2008) for game design. A case for game based learning as an active and social learning experience is presented including arguments from both teachers and…
Learning the Norm of Internality: NetNorm, a Connectionist Model
ERIC Educational Resources Information Center
Thierry, Bollon; Adeline, Paignon; Pascal, Pansu
2011-01-01
The objective of the present article is to show that connectionist simulations can be used to model some of the socio-cognitive processes underlying the learning of the norm of internality. For our simulations, we developed a connectionist model which we called NetNorm (based on Dual-Network formalism). This model is capable of simulating the…
Large deviations in the presence of cooperativity and slow dynamics
NASA Astrophysics Data System (ADS)
Whitelam, Stephen
2018-06-01
We study simple models of intermittency, involving switching between two states, within the dynamical large-deviation formalism. Singularities appear in the formalism when switching is cooperative or when its basic time scale diverges. In the first case the unbiased trajectory distribution undergoes a symmetry breaking, leading to a change in shape of the large-deviation rate function for a particular dynamical observable. In the second case the symmetry of the unbiased trajectory distribution remains unbroken. Comparison of these models suggests that singularities of the dynamical large-deviation formalism can signal the dynamical equivalent of an equilibrium phase transition but do not necessarily do so.
ADM Analysis of gravity models within the framework of bimetric variational formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golovnev, Alexey; Karčiauskas, Mindaugas; Nyrhinen, Hannu J., E-mail: agolovnev@yandex.ru, E-mail: mindaugas.karciauskas@helsinki.fi, E-mail: hannu.nyrhinen@helsinki.fi
2015-05-01
Bimetric variational formalism was recently employed to construct novel bimetric gravity models. In these models an affine connection is generated by an additional tensor field which is independent of the physical metric. In this work we demonstrate how the ADM decomposition can be applied to study such models and provide some technical intermediate details. Using ADM decomposition we are able to prove that a linear model is unstable as has previously been indicated by perturbative analysis. Moreover, we show that it is also very difficult if not impossible to construct a non-linear model which is ghost-free within the framework ofmore » bimetric variational formalism. However, we demonstrate that viable models are possible along similar lines of thought. To this end, we consider a set up in which the affine connection is a variation of the Levi-Civita one. As a proof of principle we construct a gravity model with a massless scalar field obtained this way.« less
Knowledge Acquisition of Generic Queries for Information Retrieval
Seol, Yoon-Ho; Johnson, Stephen B.; Cimino, James J.
2002-01-01
Several studies have identified clinical questions posed by health care professionals to understand the nature of information needs during clinical practice. To support access to digital information sources, it is necessary to integrate the information needs with a computer system. We have developed a conceptual guidance approach in information retrieval, based on a knowledge base that contains the patterns of information needs. The knowledge base uses a formal representation of clinical questions based on the UMLS knowledge sources, called the Generic Query model. To improve the coverage of the knowledge base, we investigated a method for extracting plausible clinical questions from the medical literature. This poster presents the Generic Query model, shows how it is used to represent the patterns of clinical questions, and describes the framework used to extract knowledge from the medical literature.
Safety Verification of the Small Aircraft Transportation System Concept of Operations
NASA Technical Reports Server (NTRS)
Carreno, Victor; Munoz, Cesar
2005-01-01
A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describe an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operation complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristic such as liveness and absence of dead-lock. The analysis and verification is performed in the Prototype Verification System (PVS), which is a computer based specification language and a theorem proving assistant.
Analytic Modeling of Insurgencies
2014-08-01
Counterinsurgency, Situational Awareness, Civilians, Lanchester 1. Introduction Combat modeling is one of the oldest areas of operations research, dating...Army. The ground-breaking work of Lanchester in 1916 [1] marks the beginning of formal models of conflicts, where mathematical formulas and, later...Warfare model [3], which is a Lanchester - based mathematical model (see more details about this model later on), and McCormick’s Magic Diamond model [4
Development of dynamic Bayesian models for web application test management
NASA Astrophysics Data System (ADS)
Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.
2018-03-01
The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide a high coverage of stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil
2016-03-15
Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof of concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although examples are given, using specific implementations the proposed techniques can be applied to rule-based models in general. The annotation ontology for rule-based models can be found at http://purl.org/rbm/rbmo The krdf tool and associated executable examples are available at http://purl.org/rbm/rbmo/krdf anil.wipat@newcastle.ac.uk or vdanos@inf.ed.ac.uk. © The Author 2015. Published by Oxford University Press.
On the Connection Between One-and Two-Equation Models of Turbulence
NASA Technical Reports Server (NTRS)
Menter, F. R.; Rai, Man Mohan (Technical Monitor)
1994-01-01
A formalism will be presented that allows the transformation of two-equation eddy viscosity turbulence models into one-equation models. The transformation is based on an assumption that is widely accepted over a large range of boundary layer flows and that has been shown to actually improve predictions when incorporated into two-equation models of turbulence. Based on that assumption, a new one-equation turbulence model will be derived. The new model will be tested in great detail against a previously introduced one-equation model and against its parent two-equation model.
Consumer experience of formal crisis-response services and preferred methods of crisis intervention.
Boscarato, Kara; Lee, Stuart; Kroschel, Jon; Hollander, Yitzchak; Brennan, Alice; Warren, Narelle
2014-08-01
The manner in which people with mental illness are supported in a crisis is crucial to their recovery. The current study explored mental health consumers' experiences with formal crisis services (i.e. police and crisis assessment and treatment (CAT) teams), preferred crisis supports, and opinions of four collaborative interagency response models. Eleven consumers completed one-on-one, semistructured interviews. The results revealed that the perceived quality of previous formal crisis interventions varied greatly. Most participants preferred family members or friends to intervene. However, where a formal response was required, general practitioners and mental health case managers were preferred; no participant wanted a police response, and only one indicated a preference for CAT team assistance. Most participants welcomed collaborative crisis interventions. Of four collaborative interagency response models currently being trialled internationally, participants most strongly supported the Ride-Along Model, which enables a police officer and a mental health clinician to jointly respond to distressed consumers in the community. The findings highlight the potential for an interagency response model to deliver a crisis response aligned with consumers' preferences. © 2014 Australian College of Mental Health Nurses Inc.
NASA Astrophysics Data System (ADS)
Hug, J. William
1998-09-01
This research presents a teaching model designed to enable learners to construct a highly developed ecological perspective and sense of place. The contextually-based research process draws upon scientific and indigenous knowledge from multiple data sources including: autobiographical experiences, environmental literature, science and environmental education research, historical approaches to environmental education, and phenomenological accounts from research participants. Data were analyzed using the theoretical frameworks of qualitative research, hermeneutic phenomenology, heuristics, and constructivism. The resulting model synthesizes and incorporates key educational philosophies and practices from: nature study, resident outdoor education, organized camping, conservation education, environmental education, earth education, outdoor recreation, sustainability, bio-regionalism, deep ecology, ecological and environmental literacy, science and technology in society, and adventure/challenge/experiential education. The model's four components--environmental knowledge, practicing responsible environmental behaviors, community-focused involvement, and direct experience in outdoor settings--contribute in a synergistic way to the development of ecological perspective and a sense of place. The model was honed through experiential use in an environmental science methods course for elementary and secondary prospective science teachers. The instructor/researcher employed individualized instruction, community-based learning, service learning, and the modeling of reflective teaching principles in pursuit of the model's goals. The resulting pedagogical knowledge extends the model's usefulness to such formal and non-formal educational contexts as: elementary/secondary classrooms, nature centers, museums, youth groups, and community organizations. This research has implications for the fields of education, geography, recreation/leisure studies, science teaching, and environmental education. Several aspects of this work make it novel. First, autobiographical and literature-based stories anchor the representations of ecological perspective and sense of place. Second, the dissertation text visually differentiates between story narrative, researcher narrative, and meta-narrative in order to convey the positionality of the researcher's distinct voices. Finally, icons are used throughout the text to visually link the model's multi-dimensional intersections. Oh, and by the way, I hope you read it.
NASA Astrophysics Data System (ADS)
Pyt'ev, Yu. P.
2018-01-01
mathematical formalism for subjective modeling, based on modelling of uncertainty, reflecting unreliability of subjective information and fuzziness that is common for its content. The model of subjective judgments on values of an unknown parameter x ∈ X of the model M( x) of a research object is defined by the researcher-modeler as a space1 ( X, p( X), P{I^{\\bar x}}, Be{l^{\\bar x}}) with plausibility P{I^{\\bar x}} and believability Be{l^{\\bar x}} measures, where x is an uncertain element taking values in X that models researcher—modeler's uncertain propositions about an unknown x ∈ X, measures P{I^{\\bar x}}, Be{l^{\\bar x}} model modalities of a researcher-modeler's subjective judgments on the validity of each x ∈ X: the value of P{I^{\\bar x}}(\\tilde x = x) determines how relatively plausible, in his opinion, the equality (\\tilde x = x) is, while the value of Be{l^{\\bar x}}(\\tilde x = x) determines how the inequality (\\tilde x = x) should be relatively believed in. Versions of plausibility Pl and believability Bel measures and pl- and bel-integrals that inherit some traits of probabilities, psychophysics and take into account interests of researcher-modeler groups are considered. It is shown that the mathematical formalism of subjective modeling, unlike "standard" mathematical modeling, •enables a researcher-modeler to model both precise formalized knowledge and non-formalized unreliable knowledge, from complete ignorance to precise knowledge of the model of a research object, to calculate relative plausibilities and believabilities of any features of a research object that are specified by its subjective model M(\\tilde x), and if the data on observations of a research object is available, then it: •enables him to estimate the adequacy of subjective model to the research objective, to correct it by combining subjective ideas and the observation data after testing their consistency, and, finally, to empirically recover the model of a research object.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de; Schreiber, Falk; Martin-Luther-University Halle-Wittenberg, Halle
The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at systems level modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the contextmore » of the metabolic model. Interaction techniques allow the exploration and visual analysis thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM{sup 2} – Framework) implements the developed method and, as an example, is applied for the integrative analysis of the crop plant potato.« less
A formal language for the specification and verification of synchronous and asynchronous circuits
NASA Technical Reports Server (NTRS)
Russinoff, David M.
1993-01-01
A formal hardware description language for the intended application of verifiable asynchronous communication is described. The language is developed within the logical framework of the Nqthm system of Boyer and Moore and is based on the event-driven behavioral model of VHDL, including the basic VHDL signal propagation mechanisms, the notion of simulation deltas, and the VHDL simulation cycle. A core subset of the language corresponds closely with a subset of VHDL and is adequate for the realistic gate-level modeling of both combinational and sequential circuits. Various extensions to this subset provide means for convenient expression of behavioral circuit specifications.
Primate cognition and the bared-teeth display: a reevaluation of the concept of formal dominance.
Maestripieri, D
1996-12-01
The cognitive implications of the phenomenon of dominance in nonhuman primates and other animals have been poorly addressed. This article shows that the concept of formal dominance currently used in the primatological literature is inherently dependent on the assumption that primates are capable of attribution of knowledge to other individuals. Although this assumption has never been made explicitly, without it the concept of formal dominance is virtually indistinguishable from other, more traditional views of dominance. Recent studies have failed to demonstrate attribution of knowledge in nonhuman primates, thus questioning the validity of formal dominance. The concept of formal dominance was originally proposed to account for a discrepancy between dominance hierarchies based on aggressive behavior and dominance hierarchies based on submissive signals. This article shows that such a discrepancy can be accounted for without invoking complex cognitive processes such as attribution of knowledge.
Formal verification of medical monitoring software using Z language: a representative sample.
Babamir, Seyed Morteza; Borhani, Mehdi
2012-08-01
Medical monitoring systems are useful aids assisting physicians in keeping patients under constant surveillance; however, taking sound decision by the systems is a physician concern. As a result, verification of the systems behavior in monitoring patients is a matter of significant. The patient monitoring is undertaken by software in modern medical systems; so, software verification of modern medial systems have been noticed. Such verification can be achieved by the Formal Languages having mathematical foundations. Among others, the Z language is a suitable formal language has been used to formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system by which the system is visually specified and formally verified against patient constraints stated in Z Language. Exploiting our past experience in formal modeling Continuous Infusion Insulin Pump (CIIP), we think of the CIIP system as a representative sample of medical systems in proposing our present study. The system is responsible for monitoring diabetic's blood sugar.
Theoretical Studies of Spectroscopic Line Mixing in Remote Sensing Applications
NASA Astrophysics Data System (ADS)
Ma, Q.
2015-12-01
The phenomenon of collisional transfer of intensity due to line mixing has an increasing importance for atmospheric monitoring. From a theoretical point of view, all relevant information about the collisional processes is contained in the relaxation matrix where the diagonal elements give half-widths and shifts, and the off-diagonal elements correspond to line interferences. For simple systems such as those consisting of diatom-atom or diatom-diatom, accurate fully quantum calculations based on interaction potentials are feasible. However, fully quantum calculations become unrealistic for more complex systems. On the other hand, the semi-classical Robert-Bonamy (RB) formalism, which has been widely used to calculate half-widths and shifts for decades, fails in calculating the off-diagonal matrix elements. As a result, in order to simulate atmospheric spectra where the effects from line mixing are important, semi-empirical fitting or scaling laws such as the ECS and IOS models are commonly used. Recently, while scrutinizing the development of the RB formalism, we have found that these authors applied the isolated line approximation in their evaluating matrix elements of the Liouville scattering operator given in exponential form. Since the criterion of this assumption is so stringent, it is not valid for many systems of interest in atmospheric applications. Furthermore, it is this assumption that blocks the possibility to calculate the whole relaxation matrix at all. By eliminating this unjustified application, and accurately evaluating matrix elements of the exponential operators, we have developed a more capable formalism. With this new formalism, we are now able not only to reduce uncertainties for calculated half-widths and shifts, but also to remove a once insurmountable obstacle to calculate the whole relaxation matrix. This implies that we can address the line mixing with the semi-classical theory based on interaction potentials between molecular absorber and molecular perturber. We have applied this formalism to address the line mixing for Raman and infrared spectra of molecules such as N2, C2H2, CO2, NH3, and H2O. By carrying out rigorous calculations, our calculated relaxation matrices are in good agreement with both experimental data and results derived from the ECS model.
NASA Astrophysics Data System (ADS)
Yoshimoto, Yuta; Li, Zhen; Kinefuchi, Ikuya; Karniadakis, George Em
2017-12-01
We propose a new coarse-grained (CG) molecular simulation technique based on the Mori-Zwanzig (MZ) formalism along with the iterative Boltzmann inversion (IBI). Non-Markovian dissipative particle dynamics (NMDPD) taking into account memory effects is derived in a pairwise interaction form from the MZ-guided generalized Langevin equation. It is based on the introduction of auxiliary variables that allow for the replacement of a non-Markovian equation with a Markovian one in a higher dimensional space. We demonstrate that the NMDPD model exploiting MZ-guided memory kernels can successfully reproduce the dynamic properties such as the mean square displacement and velocity autocorrelation function of a Lennard-Jones system, as long as the memory kernels are appropriately evaluated based on the Volterra integral equation using the force-velocity and velocity-velocity correlations. Furthermore, we find that the IBI correction of a pair CG potential significantly improves the representation of static properties characterized by a radial distribution function and pressure, while it has little influence on the dynamic processes. Our findings suggest that combining the advantages of both the MZ formalism and IBI leads to an accurate representation of both the static and dynamic properties of microscopic systems that exhibit non-Markovian behavior.
Process Definition and Modeling Guidebook. Version 01.00.02
1992-12-01
material (and throughout the guidebook)process defnition is considered to be the act of representing the important characteristics of a process in a...characterized by software standards and guidelines, software inspections and reviews, and more formalized testing (including test plans, test sup- port tools...paper-based approach works well for training, examples, and possibly even small pilot projects and case studies. However, large projects will benefit from
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Laurence; Yurkovich, James T.; Lloyd, Colton J.
Integrating omics data to refine or make context-specific models is an active field of constraint-based modeling. Proteomics now cover over 95% of the Escherichia coli proteome by mass. Genome-scale models of Metabolism and macromolecular Expression (ME) compute proteome allocation linked to metabolism and fitness. Using proteomics data, we formulated allocation constraints for key proteome sectors in the ME model. The resulting calibrated model effectively computed the “generalist” (wild-type) E. coli proteome and phenotype across diverse growth environments. Across 15 growth conditions, prediction errors for growth rate and metabolic fluxes were 69% and 14% lower, respectively. The sector-constrained ME model thusmore » represents a generalist ME model reflecting both growth rate maximization and “hedging” against uncertain environments and stresses, as indicated by significant enrichment of these sectors for the general stress response sigma factor σS. Finally, the sector constraints represent a general formalism for integrating omics data from any experimental condition into constraint-based ME models. The constraints can be fine-grained (individual proteins) or coarse-grained (functionally-related protein groups) as demonstrated here. Furthermore, this flexible formalism provides an accessible approach for narrowing the gap between the complexity captured by omics data and governing principles of proteome allocation described by systems-level models.« less
Haplotype-Based Genome-Wide Prediction Models Exploit Local Epistatic Interactions Among Markers
Jiang, Yong; Schmidt, Renate H.; Reif, Jochen C.
2018-01-01
Genome-wide prediction approaches represent versatile tools for the analysis and prediction of complex traits. Mostly they rely on marker-based information, but scenarios have been reported in which models capitalizing on closely-linked markers that were combined into haplotypes outperformed marker-based models. Detailed comparisons were undertaken to reveal under which circumstances haplotype-based genome-wide prediction models are superior to marker-based models. Specifically, it was of interest to analyze whether and how haplotype-based models may take local epistatic effects between markers into account. Assuming that populations consisted of fully homozygous individuals, a marker-based model in which local epistatic effects inside haplotype blocks were exploited (LEGBLUP) was linearly transformable into a haplotype-based model (HGBLUP). This theoretical derivation formally revealed that haplotype-based genome-wide prediction models capitalize on local epistatic effects among markers. Simulation studies corroborated this finding. Due to its computational efficiency the HGBLUP model promises to be an interesting tool for studies in which ultra-high-density SNP data sets are studied. Applying the HGBLUP model to empirical data sets revealed higher prediction accuracies than for marker-based models for both traits studied using a mouse panel. In contrast, only a small subset of the traits analyzed in crop populations showed such a benefit. Cases in which higher prediction accuracies are observed for HGBLUP than for marker-based models are expected to be of immediate relevance for breeders, due to the tight linkage a beneficial haplotype will be preserved for many generations. In this respect the inheritance of local epistatic effects very much resembles the one of additive effects. PMID:29549092
Haplotype-Based Genome-Wide Prediction Models Exploit Local Epistatic Interactions Among Markers.
Jiang, Yong; Schmidt, Renate H; Reif, Jochen C
2018-05-04
Genome-wide prediction approaches represent versatile tools for the analysis and prediction of complex traits. Mostly they rely on marker-based information, but scenarios have been reported in which models capitalizing on closely-linked markers that were combined into haplotypes outperformed marker-based models. Detailed comparisons were undertaken to reveal under which circumstances haplotype-based genome-wide prediction models are superior to marker-based models. Specifically, it was of interest to analyze whether and how haplotype-based models may take local epistatic effects between markers into account. Assuming that populations consisted of fully homozygous individuals, a marker-based model in which local epistatic effects inside haplotype blocks were exploited (LEGBLUP) was linearly transformable into a haplotype-based model (HGBLUP). This theoretical derivation formally revealed that haplotype-based genome-wide prediction models capitalize on local epistatic effects among markers. Simulation studies corroborated this finding. Due to its computational efficiency the HGBLUP model promises to be an interesting tool for studies in which ultra-high-density SNP data sets are studied. Applying the HGBLUP model to empirical data sets revealed higher prediction accuracies than for marker-based models for both traits studied using a mouse panel. In contrast, only a small subset of the traits analyzed in crop populations showed such a benefit. Cases in which higher prediction accuracies are observed for HGBLUP than for marker-based models are expected to be of immediate relevance for breeders, due to the tight linkage a beneficial haplotype will be preserved for many generations. In this respect the inheritance of local epistatic effects very much resembles the one of additive effects. Copyright © 2018 Jiang et al.
Learning in non-formal education: Is it "youthful" for youth in action?
NASA Astrophysics Data System (ADS)
Norqvist, Lars; Leffler, Eva
2017-04-01
This article offers insights into the practices of a non-formal education programme for youth provided by the European Union (EU). It takes a qualitative approach and is based on a case study of the European Voluntary Service (EVS). Data were collected during individual and focus group interviews with learners (the EVS volunteers), decision takers and trainers, with the aim of deriving an understanding of learning in non-formal education. The research questions concerned learning, the recognition of learning and perspectives of usefulness. The study also examined the Youthpass documentation tool as a key to understanding the recognition of learning and to determine whether the learning was useful for learners (the volunteers). The findings and analysis offer several interpretations of learning, and the recognition of learning, which take place in non-formal education. The findings also revealed that it is complicated to divide learning into formal and non- formal categories; instead, non-formal education is useful for individual learners when both formal and non-formal educational contexts are integrated. As a consequence, the division of formal and non-formal (and possibly even informal) learning creates a gap which works against the development of flexible and interconnected education with ubiquitous learning and mobility within and across formal and non-formal education. This development is not in the best interests of learners, especially when seeking useful learning and education for youth (what the authors term "youthful" for youth in action).
Satellite image-based maps: Scientific inference or pretty pictures?
Ronald E. McRoberts
2011-01-01
The scientific method has been characterized as having two distinct components, Discovery and Justification. Discovery emphasizes ideas and creativity, focuses on conceiving hypotheses and constructing models, and is generally regarded as lacking a formal logic. Justification begins with the hypotheses and models and ends with a...
1992-12-01
describing how. 5. EDDA . EDDA is an attempt to add mathematical formalism to SADT. Because it is based on SADT, it cannot easily represent any other...design methodology. EDDA has two forms: G- EDDA , the standard graphical version of SADT, and S- EDDA , a textual language that partially represents the...used. "* EDDA only supports the SADT methodology and is too limited in scope to be useful in our research. "* SAMM lacks the semantic richness of
Studying the precision of ray tracing techniques with Szekeres models
NASA Astrophysics Data System (ADS)
Koksbang, S. M.; Hannestad, S.
2015-07-01
The simplest standard ray tracing scheme employing the Born and Limber approximations and neglecting lens-lens coupling is used for computing the convergence along individual rays in mock N-body data based on Szekeres swiss cheese and onion models. The results are compared with the exact convergence computed using the exact Szekeres metric combined with the Sachs formalism. A comparison is also made with an extension of the simple ray tracing scheme which includes the Doppler convergence. The exact convergence is reproduced very precisely as the sum of the gravitational and Doppler convergences along rays in Lemaitre-Tolman-Bondi swiss cheese and single void models. This is not the case when the swiss cheese models are based on nonsymmetric Szekeres models. For such models, there is a significant deviation between the exact and ray traced paths and hence also the corresponding convergences. There is also a clear deviation between the exact and ray tracing results obtained when studying both nonsymmetric and spherically symmetric Szekeres onion models.
Guckenberger, Matthias; Klement, Rainer Johannes; Allgäuer, Michael; Appold, Steffen; Dieckmann, Karin; Ernst, Iris; Ganswindt, Ute; Holy, Richard; Nestle, Ursula; Nevinny-Stickel, Meinhard; Semrau, Sabine; Sterzing, Florian; Wittig, Andrea; Andratschke, Nicolaus; Flentje, Michael
2013-10-01
To compare the linear-quadratic (LQ) and the LQ-L formalism (linear cell survival curve beyond a threshold dose dT) for modeling local tumor control probability (TCP) in stereotactic body radiotherapy (SBRT) for stage I non-small cell lung cancer (NSCLC). This study is based on 395 patients from 13 German and Austrian centers treated with SBRT for stage I NSCLC. The median number of SBRT fractions was 3 (range 1-8) and median single fraction dose was 12.5 Gy (2.9-33 Gy); dose was prescribed to the median 65% PTV encompassing isodose (60-100%). Assuming an α/β-value of 10 Gy, we modeled TCP as a sigmoid-shaped function of the biologically effective dose (BED). Models were compared using maximum likelihood ratio tests as well as Bayes factors (BFs). There was strong evidence for a dose-response relationship in the total patient cohort (BFs>20), which was lacking in single-fraction SBRT (BFs<3). Using the PTV encompassing dose or maximum (isocentric) dose, our data indicated a LQ-L transition dose (dT) at 11 Gy (68% CI 8-14 Gy) or 22 Gy (14-42 Gy), respectively. However, the fit of the LQ-L models was not significantly better than a fit without the dT parameter (p=0.07, BF=2.1 and p=0.86, BF=0.8, respectively). Generally, isocentric doses resulted in much better dose-response relationships than PTV encompassing doses (BFs>20). Our data suggest accurate modeling of local tumor control in fractionated SBRT for stage I NSCLC with the traditional LQ formalism. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Models of Behavior Disorder: A Formal Analysis Based on Woods' Taxonomy of Instrumental Conditioning
ERIC Educational Resources Information Center
Tryon, Warren W.
1976-01-01
Among the phenomena covered are superstitious behavior, learned helplessness, experimental neurosis, anaclitic depression as a result of maternal separation, and physiological disturbances such as ulceration. (Author/AM)
The Principal's Role in Site-Based Management.
ERIC Educational Resources Information Center
Drury, William R.
1993-01-01
In existing school-based management models, the principal's role ranges from chairing the local council to being a coach/facilitator. With teachers and parents assuming greater control over governance, curriculum, and budgeting, paranoid principals may establish more formal bargaining relationships with district boards. Caution is advised, because…
NASA Astrophysics Data System (ADS)
Devetak, Iztok; Aleksij Glažar, Saša
2010-08-01
Submicrorepresentations (SMRs) are a powerful tool for identifying misconceptions of chemical concepts and for generating proper mental models of chemical phenomena in students' long-term memory during chemical education. The main purpose of the study was to determine which independent variables (gender, formal reasoning abilities, visualization abilities, and intrinsic motivation for learning chemistry) have the maximum influence on students' reading and drawing SMRs. A total of 386 secondary school students (aged 16.3 years) participated in the study. The instruments used in the study were: test of Chemical Knowledge, Test of Logical Thinking, two tests of visualization abilities Patterns and Rotations, and questionnaire on Intrinsic Motivation for Learning Science. The results show moderate, but statistically significant correlations between students' intrinsic motivation, formal reasoning abilities and chemical knowledge at submicroscopic level based on reading and drawing SMRs. Visualization abilities are not statistically significantly correlated with students' success on items that comprise reading or drawing SMRs. It can be also concluded that there is a statistically significant difference between male and female students in solving problems that include reading or drawing SMRs. Based on these statistical results and content analysis of the sample problems, several educational strategies can be implemented for students to develop adequate mental models of chemical concepts on all three levels of representations.
2012-01-01
Background Globally, extending financial protection and equitable access to health services to those outside the formal sector employment is a major challenge for achieving universal coverage. While some favour contributory schemes, others have embraced tax-funded health service cover for those outside the formal sector. This paper critically examines the issue of how to cover those outside the formal sector through the lens of stakeholder views on the proposed one-time premium payment (OTPP) policy in Ghana. Discussion Ghana in 2004 implemented a National Health Insurance Scheme, based on a contributory model where service benefits are restricted to those who contribute (with some groups exempted from contributing), as the policy direction for moving towards universal coverage. In 2008, the OTPP system was proposed as an alternative way of ensuring coverage for those outside formal sector employment. There are divergent stakeholder views with regard to the meaning of the one-time premium and how it will be financed and sustained. Our stakeholder interviews indicate that the underlying issue being debated is whether the current contributory NHIS model for those outside the formal employment sector should be maintained or whether services for this group should be tax funded. However, the advantages and disadvantages of these alternatives are not being explored in an explicit or systematic way and are obscured by the considerable confusion about the likely design of the OTPP policy. We attempt to contribute to the broader debate about how best to fund coverage for those outside the formal sector by unpacking some of these issues and pointing to the empirical evidence needed to shed even further light on appropriate funding mechanisms for universal health systems. Summary The Ghanaian debate on OTPP is related to one of the most important challenges facing low- and middle-income countries seeking to achieve a universal health care system. It is critical that there is more extensive debate on the advantages and disadvantages of alternative funding mechanisms, supported by a solid evidence base, and with the policy objective of universal coverage providing the guiding light. PMID:23102454
Generalized large-scale semigeostrophic approximations for the f-plane primitive equations
NASA Astrophysics Data System (ADS)
Oliver, Marcel; Vasylkevych, Sergiy
2016-05-01
We derive a family of balance models for rotating stratified flow in the primitive equation (PE) setting. By construction, the models possess conservation laws for energy and potential vorticity and are formally of the same order of accuracy as Hoskins’ semigeostrophic equations. Our construction is based on choosing a new coordinate frame for the PE variational principle in such a way that the consistently truncated Lagrangian degenerates. We show that the balance relations so obtained are elliptic when the fluid is stably stratified and certain smallness assumptions are satisfied. Moreover, the potential temperature can be recovered from the potential vorticity via inversion of a non-standard Monge-Ampère problem which is subject to the same ellipticity condition. While the present work is entirely formal, we conjecture, based on a careful rewriting of the equations of motion and a straightforward derivative count, that the Cauchy problem for the balance models is well posed subject to conditions on the initial data. Our family of models includes, in particular, the stratified analog of the L 1 balance model of Salmon.
NASA Astrophysics Data System (ADS)
Smets, Quentin; Verreck, Devin; Verhulst, Anne S.; Rooyackers, Rita; Merckling, Clément; Van De Put, Maarten; Simoen, Eddy; Vandervorst, Wilfried; Collaert, Nadine; Thean, Voon Y.; Sorée, Bart; Groeseneken, Guido; Heyns, Marc M.
2014-05-01
Promising predictions are made for III-V tunnel-field-effect transistor (FET), but there is still uncertainty on the parameters used in the band-to-band tunneling models. Therefore, two simulators are calibrated in this paper; the first one uses a semi-classical tunneling model based on Kane's formalism, and the second one is a quantum mechanical simulator implemented with an envelope function formalism. The calibration is done for In0.53Ga0.47As using several p+/intrinsic/n+ diodes with different intrinsic region thicknesses. The dopant profile is determined by SIMS and capacitance-voltage measurements. Error bars are used based on statistical and systematic uncertainties in the measurement techniques. The obtained parameters are in close agreement with theoretically predicted values and validate the semi-classical and quantum mechanical models. Finally, the models are applied to predict the input characteristics of In0.53Ga0.47As n- and p-lineTFET, with the n-lineTFET showing competitive performance compared to MOSFET.
Formal Models of the Network Co-occurrence Underlying Mental Operations.
Bzdok, Danilo; Varoquaux, Gaël; Grisel, Olivier; Eickenberg, Michael; Poupon, Cyril; Thirion, Bertrand
2016-06-01
Systems neuroscience has identified a set of canonical large-scale networks in humans. These have predominantly been characterized by resting-state analyses of the task-unconstrained, mind-wandering brain. Their explicit relationship to defined task performance is largely unknown and remains challenging. The present work contributes a multivariate statistical learning approach that can extract the major brain networks and quantify their configuration during various psychological tasks. The method is validated in two extensive datasets (n = 500 and n = 81) by model-based generation of synthetic activity maps from recombination of shared network topographies. To study a use case, we formally revisited the poorly understood difference between neural activity underlying idling versus goal-directed behavior. We demonstrate that task-specific neural activity patterns can be explained by plausible combinations of resting-state networks. The possibility of decomposing a mental task into the relative contributions of major brain networks, the "network co-occurrence architecture" of a given task, opens an alternative access to the neural substrates of human cognition.
Formal Models of the Network Co-occurrence Underlying Mental Operations
Bzdok, Danilo; Varoquaux, Gaël; Grisel, Olivier; Eickenberg, Michael; Poupon, Cyril; Thirion, Bertrand
2016-01-01
Systems neuroscience has identified a set of canonical large-scale networks in humans. These have predominantly been characterized by resting-state analyses of the task-unconstrained, mind-wandering brain. Their explicit relationship to defined task performance is largely unknown and remains challenging. The present work contributes a multivariate statistical learning approach that can extract the major brain networks and quantify their configuration during various psychological tasks. The method is validated in two extensive datasets (n = 500 and n = 81) by model-based generation of synthetic activity maps from recombination of shared network topographies. To study a use case, we formally revisited the poorly understood difference between neural activity underlying idling versus goal-directed behavior. We demonstrate that task-specific neural activity patterns can be explained by plausible combinations of resting-state networks. The possibility of decomposing a mental task into the relative contributions of major brain networks, the "network co-occurrence architecture" of a given task, opens an alternative access to the neural substrates of human cognition. PMID:27310288
Bruening, Rebecca A; Strazza, Karen; Nocera, Maryalice; Peek-Asa, Corinne; Casteel, Carri
2015-06-01
Small retail businesses experience high robbery and violent crime rates leading to injury and death. Workplace violence prevention programs (WVPP) based on Crime Prevention Through Environmental Design reduce this risk, but low small business participation limits their effectiveness. Recent dissemination models of occupational safety and health information recommend collaborating with an intermediary organization to engage small businesses. Qualitative interviews with 70 small business operators and 32 representatives of organizations with small business influence were conducted to identify factors and recommendations for improving dissemination of a WVPP. Both study groups recommended promoting WVPPs through personal contacts but differed on other promotion methods and the type of influential groups to target. Small business operators indicated few connections to formal business networks. Dissemination of WVPPs to small businesses may require models inclusive of influential individuals (e.g., respected business owners) as intermediaries to reach small businesses with few formal connections. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Baba-Moussa, Abdel Rahamane
2017-10-01
Matching training to employment in educational reform in Benin: contribution to the development of a new educational model - For a long time, the world of education and training and that of employment have operated in parallel, the former training young people while hoping that the latter would find a way to employ them. Today, in light of increasing levels of youth unemployment, educational reform efforts, in Benin as elsewhere, have come to strongly emphasise the question of employability. It is within this framework that this paper proposes to study the case of Benin, drawing on a body of existing data, in order to develop a "clean-break" educational model based on a holistic vision articulating formal and non-formal education, and to identify successful training-professional integration pathways, among which young people can choose after completing their basic education, according to their skills and aspirations.
A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.
Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe
2011-05-30
Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Acar, Ömer; Patton, Bruce R.
2016-01-01
This study had two research purposes. First, we examined the scientific reasoning gains of prospective science teachers who are concrete, formal, and postformal reasoners in an argumentation-based physics inquiry instruction. Second, we sought conceptual knowledge and achievement gaps between these student groups before and after the instruction.…
Probabilities for time-dependent properties in classical and quantum mechanics
NASA Astrophysics Data System (ADS)
Losada, Marcelo; Vanni, Leonardo; Laura, Roberto
2013-05-01
We present a formalism which allows one to define probabilities for expressions that involve properties at different times for classical and quantum systems and we study its lattice structure. The formalism is based on the notion of time translation of properties. In the quantum case, the properties involved should satisfy compatibility conditions in order to obtain well-defined probabilities. The formalism is applied to describe the double-slit experiment.
ERIC Educational Resources Information Center
Randler, Christoph; Kummer, Barbara; Wilhelm, Christian
2012-01-01
The aim of this study was to assess the outcome of a zoo visit in terms of learning and retention of knowledge concerning the adaptations and behavior of vertebrate species. Basis of the work was the concept of implementing zoo visits as an out-of-school setting for formal, curriculum based learning. Our theoretical framework centers on the…
A Synergistic Approach to Faculty Mentoring
ERIC Educational Resources Information Center
Goodwin, Laura D.
2004-01-01
Following a comparison of two approaches to mentoring--the traditional model and a relatively new "synergistic" or co-mentoring model--a new formal mentoring program for faculty in the School of Education at the University of Colorado at Denver, based on the synergistic approach, is described. First-year program evaluation data revealed…
Explaining the effect of education on health: a field study in Ghana.
Peters, Ellen; Baker, David P; Dieckmann, Nathan F; Leon, Juan; Collins, John
2010-10-01
Higher education (or more years of formal schooling) is widely associated with better health, but the underlying causes of this association are unclear. In this study, we tested our schooling-decision-making model, which posits that formal education fosters intellectual ability, which in turn provides individuals with enduring competencies to support better health-related behaviors. Using data from a field study on formal education in 181 adults in rural Ghana, we examined health-protective behaviors related to HIV/AIDS infection, a critical health issue in Ghana. As expected, individuals with more education practiced more protective health behaviors. Our structural equation modeling analysis showed that cognitive abilities, numeracy, and decision-making abilities increased with exposure to schooling, and that these enhanced abilities (and not HIV/AIDS knowledge) mediated the effects of education on health-protective behavior. Research and policy implications for HIV prevention efforts in sub-Saharan Africa are discussed.
Chen, Wen; Yu, Chao; Dong, Danan; Cai, Miaomiao; Zhou, Feng; Wang, Zhiren; Zhang, Lei; Zheng, Zhengqi
2017-02-20
With multi-antenna synchronized global navigation satellite system (GNSS) receivers, the single difference (SD) between two antennas is able to eliminate both satellite and receiver clock error, thus it becomes necessary to reconsider the equivalency problem between the SD and double difference (DD) models. In this paper, we quantitatively compared the formal uncertainties and dispersions between multiple SD models and the DD model, and also carried out static and kinematic short baseline experiments. The theoretical and experimental results show that under a non-common clock scheme the SD and DD model are equivalent. Under a common clock scheme, if we estimate stochastic uncalibrated phase delay (UPD) parameters every epoch, this SD model is still equivalent to the DD model, but if we estimate only one UPD parameter for all epochs or take it as a known constant, the SD (here called SD2) and DD models are no longer equivalent. For the vertical component of baseline solutions, the formal uncertainties of the SD2 model are two times smaller than those of the DD model, and the dispersions of the SD2 model are even more than twice smaller than those of the DD model. In addition, to obtain baseline solutions, the SD2 model requires a minimum of three satellites, while the DD model requires a minimum of four satellites, which makes the SD2 more advantageous in attitude determination under sheltered environments.
Chen, Wen; Yu, Chao; Dong, Danan; Cai, Miaomiao; Zhou, Feng; Wang, Zhiren; Zhang, Lei; Zheng, Zhengqi
2017-01-01
With multi-antenna synchronized global navigation satellite system (GNSS) receivers, the single difference (SD) between two antennas is able to eliminate both satellite and receiver clock error, thus it becomes necessary to reconsider the equivalency problem between the SD and double difference (DD) models. In this paper, we quantitatively compared the formal uncertainties and dispersions between multiple SD models and the DD model, and also carried out static and kinematic short baseline experiments. The theoretical and experimental results show that under a non-common clock scheme the SD and DD model are equivalent. Under a common clock scheme, if we estimate stochastic uncalibrated phase delay (UPD) parameters every epoch, this SD model is still equivalent to the DD model, but if we estimate only one UPD parameter for all epochs or take it as a known constant, the SD (here called SD2) and DD models are no longer equivalent. For the vertical component of baseline solutions, the formal uncertainties of the SD2 model are two times smaller than those of the DD model, and the dispersions of the SD2 model are even more than twice smaller than those of the DD model. In addition, to obtain baseline solutions, the SD2 model requires a minimum of three satellites, while the DD model requires a minimum of four satellites, which makes the SD2 more advantageous in attitude determination under sheltered environments. PMID:28230753
The equivalence of a human observer and an ideal observer in binary diagnostic tasks
NASA Astrophysics Data System (ADS)
He, Xin; Samuelson, Frank; Gallas, Brandon D.; Sahiner, Berkman; Myers, Kyle
2013-03-01
The Ideal Observer (IO) is "ideal" for given data populations. In the image perception process, as the raw images are degraded by factors such as display and eye optics, there is an equivalent IO (EIO). The EIO uses the statistical information that exits the perception/cognitive degradations as the data. We assume a human observer who received sufficient training, e.g., radiologists, and hypothesize that such a human observer can be modeled as if he is an EIO. To measure the likelihood ratio (LR) distributions of an EIO, we formalize experimental design principles that encourage rationality based on von Neumann and Morgenstern's (vNM) axioms. We present examples to show that many observer study design refinements, although motivated by empirical principles explicitly, implicitly encourage rationality. Our hypothesis is supported by a recent review paper on ROC curve convexity by Pesce, Metz, and Berbaum. We also provide additional evidence based on a collection of observer studies in medical imaging. EIO theory shows that the "sub-optimal" performance of a human observer can be mathematically formalized in the form of an IO, and measured through rationality encouragement.
van Ments, Laila; Roelofsma, Peter; Treur, Jan
2018-01-01
Religion is a central aspect of many individuals' lives around the world, and its influence on human behaviour has been extensively studied from many different perspectives. The current study integrates a number of these perspectives into one adaptive temporal-causal network model describing the mental states involved, their mutual relations, and the adaptation of some of these relations over time due to learning. By first developing a conceptual representation of a network model based on the literature, and then formalizing this model into a numerical representation, simulations can be done for almost any kind of religion and person, showing different behaviours for persons with different religious backgrounds and characters. The focus was mainly on the influence of religion on human empathy and dis-empathy, a topic very relevant today. The developed model could be valuable for many uses, involving support for a better understanding, and even prediction, of the behaviour of religious individuals. It is illustrated for a number of different scenarios based on different characteristics of the persons and of the religion.
Exact analysis of the spectral properties of the anisotropic two-bosons Rabi model
NASA Astrophysics Data System (ADS)
Cui, Shuai; Cao, Jun-Peng; Fan, Heng; Amico, Luigi
2017-05-01
We introduce the anisotropic two-photon Rabi model in which the rotating and counter rotating terms enters the Hamiltonian with two different coupling constants. Eigenvalues and eigenvectors are studied with exact means. We employ a variation of the Braak method based on Bogolubov rotation of the underlying su(1, 1) Lie algebra. Accordingly, the spectrum is provided by the analytical properties of a suitable meromorphic function. Our formalism applies to the two-modes Rabi model as well, sharing the same algebraic structure of the two-photon model. Through the analysis of the spectrum, we discover that the model displays close analogies to many-body systems undergoing quantum phase transitions.
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Calvo, Gilbert
Various educators from Latin and Central America and the Caribbean met to design and produce materials for teaching family life, human sexuality, community life, and environmental studies. They concluded that the materials should meet community standards; help prepare for future change; develop working models for designing effective teaching…
Formal Verification at System Level
NASA Astrophysics Data System (ADS)
Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.
2009-05-01
System Level Analysis calls for a language comprehensible to experts with different background and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some the results obtained as for SysML based system level functional formal verification by an ESA/ESTEC study, with a collaboration among INTECS and La Sapienza University of Roma. The study focuses on SysML based system level functional requirements techniques.
MDA-based EHR application security services.
Blobel, Bernd; Pharow, Peter
2004-01-01
Component-oriented, distributed, virtual EHR systems have to meet enhanced security and privacy requirements. In the context of advanced architectural paradigms such as component-orientation, model-driven, and knowledge-based, standardised security services needed have to be specified and implemented in an integrated way following the same paradigm. This concerns the deployment of formal models, meta-languages, reference models such as the ISO RM-ODP, and development as well as implementation tools. International projects' results presented proceed on that streamline.
Multi-level and hybrid modelling approaches for systems biology.
Bardini, R; Politano, G; Benso, A; Di Carlo, S
2017-01-01
During the last decades, high-throughput techniques allowed for the extraction of a huge amount of data from biological systems, unveiling more of their underling complexity. Biological systems encompass a wide range of space and time scales, functioning according to flexible hierarchies of mechanisms making an intertwined and dynamic interplay of regulations. This becomes particularly evident in processes such as ontogenesis, where regulative assets change according to process context and timing, making structural phenotype and architectural complexities emerge from a single cell, through local interactions. The information collected from biological systems are naturally organized according to the functional levels composing the system itself. In systems biology, biological information often comes from overlapping but different scientific domains, each one having its own way of representing phenomena under study. That is, the different parts of the system to be modelled may be described with different formalisms. For a model to have improved accuracy and capability for making a good knowledge base, it is good to comprise different system levels, suitably handling the relative formalisms. Models which are both multi-level and hybrid satisfy both these requirements, making a very useful tool in computational systems biology. This paper reviews some of the main contributions in this field.
Morris, Melody K.; Saez-Rodriguez, Julio; Lauffenburger, Douglas A.; Alexopoulos, Leonidas G.
2012-01-01
Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cells response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell specific data to result in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms. PMID:23226239
The stochastic system approach for estimating dynamic treatments effect.
Commenges, Daniel; Gégout-Petit, Anne
2015-10-01
The problem of assessing the effect of a treatment on a marker in observational studies raises the difficulty that attribution of the treatment may depend on the observed marker values. As an example, we focus on the analysis of the effect of a HAART on CD4 counts, where attribution of the treatment may depend on the observed marker values. This problem has been treated using marginal structural models relying on the counterfactual/potential response formalism. Another approach to causality is based on dynamical models, and causal influence has been formalized in the framework of the Doob-Meyer decomposition of stochastic processes. Causal inference however needs assumptions that we detail in this paper and we call this approach to causality the "stochastic system" approach. First we treat this problem in discrete time, then in continuous time. This approach allows incorporating biological knowledge naturally. When working in continuous time, the mechanistic approach involves distinguishing the model for the system and the model for the observations. Indeed, biological systems live in continuous time, and mechanisms can be expressed in the form of a system of differential equations, while observations are taken at discrete times. Inference in mechanistic models is challenging, particularly from a numerical point of view, but these models can yield much richer and reliable results.
Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G
2012-01-01
Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cells response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell specific data to result in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.
Grandparenting Roles and Volunteer Activity
Jendrek, Margaret Platt
2016-01-01
Objectives. We examine whether grandparenting roles are related to formal volunteering among older adults. Method. Logistic regression is used to examine the likelihood of volunteering based on grandchild care using data from the 2004 wave of the Health and Retirement Study (n = 13,785). Longitudinal analyses utilize treatment effects models to examine changes in volunteering for grandparents who begin nonresidential grandchild care between the 2004 and 2008 waves (n = 10,811). Results. Results show that grandparents raising coresidential grandchildren have lower odds of volunteering than grandparents providing no regular grandchild care. However, grandparents who provide nonresidential grandchild care are more likely to volunteer than grandparents not providing grandchild care and those raising a coresidential grandchild. Grandparents who provide nonresidential care for grandchildren engage in more volunteering before assuming grandchild care, and their volunteerism increases after becoming a caregiver for a grandchild. Discussion. Consistent with resource theory and the accumulation of roles, providing nonresidential grandchild care may draw grandparents into formal volunteer activity. The lower human capital resources evidenced by grandparents raising coresidential grandchildren may play a role in their lower likelihood of formal volunteering. PMID:24721748
Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.
Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin
2013-08-01
The old Treatment Planning Systems (TPSs) used for intracavitary brachytherapy with Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. As of 1995, TG-43 is used as the main dose calculation formalism in treatment TPSs. The purpose of this study is to design and establish a treatment planning software for Cs-137 Solectron brachytherapy source, based on TG-43U1 formalism by applying the effects of the applicator and dummy spacers. Two softwares used for treatment planning of Cs-137 sources in Iran (STPS and PLATO), are based on old formalisms. The purpose of this work is to establish and develop a TPS for Selectron source based on TG-43 formalism. In this planning system, the dosimetry parameters of each pellet in different places inside applicators were obtained by MCNP4c code. Then the dose distribution around every combination of active and inactive pellets was obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for special combination of active and inactive pellets with MC simulations. Finally, the uncertainty of old dose calculation formalism was investigated by comparing the results of STPS and PLATO softwares with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1cm distance from the tip of the applicator and those obtained by old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms, overestimate the dose especially towards the applicator's tip. While the TG-43U1 based software perform the calculations more accurately.
Statechart Analysis with Symbolic PathFinder
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2012-01-01
We report here on our on-going work that addresses the automated analysis and test case generation for software systems modeled using multiple Statechart formalisms. The work is motivated by large programs such as NASA Exploration, that involve multiple systems that interact via safety-critical protocols and are designed with different Statechart variants. To verify these safety-critical systems, we have developed Polyglot, a framework for modeling and analysis of model-based software written using different Statechart formalisms. Polyglot uses a common intermediate representation with customizable Statechart semantics and leverages the analysis and test generation capabilities of the Symbolic PathFinder tool. Polyglot is used as follows: First, the structure of the Statechart model (expressed in Matlab Stateflow or Rational Rhapsody) is translated into a common intermediate representation (IR). The IR is then translated into Java code that represents the structure of the model. The semantics are provided as "pluggable" modules.
NASA Astrophysics Data System (ADS)
Sloan, H.; Drantch, K.; Steenhuis, J.
2006-12-01
We present an NSF-funded collaborative formal-informal partnership for urban Earth science teacher preparation and professional development. This model brings together The American Museum of Natural History (AMNH) and Brooklyn and Lehman College of the City University of New York (CUNY) to address science-impoverished classrooms that lack highly qualified teachers by focusing on Earth science teacher certification. Project design was based on identified needs in the local communities and schools, careful analysis of content knowledge mastery required for Earth science teacher certification, and existing impediments to certification. The problem-based approach required partners to push policy envelopes and to invent new ways of articulating content and pedagogy at both intra- and inter-institutional levels. One key element of the project is involvement of the local board of education, teachers, and administrators in initial design and ongoing assessment. Project components include formal Earth systems science courses, a summer institute primarily led and delivered by AMNH scientists through an informal series of lectures coupled to workshops led by AMNH educators, a mechanism for assigning course credit for informal experiences, development of new teaching approaches that include teacher action plans and an external program of evaluation. The principal research strand of this project focuses on the resulting model for formal-informal teacher education partnership, the project's impact on participating teachers, policy issues surrounding the model and the changes required for its development and implementation, and its potential for Earth science education reform. As the grant funded portion of the project draws to a close we begin to analyze data collected over the past 3 years. Third-year findings of the project's external evaluation indicate that the problem-based approach has been highly successful, particularly its impact on participating teachers. In addition to presenting these results, participating teachers from the 2004 and 2006 cohorts discuss their TRUST experiences and the subsequent impact the program has had on their respective Earth science teaching practices and professional lives.
ERIC Educational Resources Information Center
Teelken, Christine; Driessen, Geert; Smit, Frederik
2005-01-01
This contribution is based on comparative case studies of secondary schools in England, the Netherlands and Scotland. The authors conclude that although opportunities for school choice are offered in a formal sense in each of the locations studied, in certain cases choice is not particularly encouraged. In order to explain this disparity between…
Initiating Formal Requirements Specifications with Object-Oriented Models
NASA Technical Reports Server (NTRS)
Ampo, Yoko; Lutz, Robyn R.
1994-01-01
This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.
A formal model of interpersonal inference
Moutoussis, Michael; Trujillo-Barreto, Nelson J.; El-Deredy, Wael; Dolan, Raymond J.; Friston, Karl J.
2014-01-01
Introduction: We propose that active Bayesian inference—a general framework for decision-making—can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regards to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: (1) Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to “mentalizing” in the psychological literature, is based upon the outcomes of interpersonal exchanges. (2) We show how some well-known social-psychological phenomena (e.g., self-serving biases) can be explained in terms of active interpersonal inference. (3) Mentalizing naturally entails Bayesian updating of how people value social outcomes. Crucially this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes optimal framework for modeling intersubject variability in mentalizing during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalizing is distorted. PMID:24723872
Canonical formalism for modelling and control of rigid body dynamics.
Gurfil, P
2005-12-01
This paper develops a new paradigm for stabilization of rigid-body dynamics. The state-space model is formulated using canonical elements, known as the Serret-Andoyer (SA) variables, thus far scarcely used for engineering applications. The main feature of the SA formalism is the reduction of the dynamics via the underlying symmetry stemming from conservation of angular momentum and rotational kinetic energy. The controllability of the system model is examined using the notion of accessibility, and is shown to be accessible from all points. Based on the accessibility proof, two nonlinear asymptotic feedback stabilizers are developed: a damping feedback is designed based on the Jurdjevic-Quinn method, and a Hamiltonian controller is derived by using the Hamiltonian as a natural Lyapunov function for the closed-loop dynamics. It is shown that the Hamiltonian control is both passive and inverse optimal with respect to a meaningful performance index. The performance of the new controllers is examined and compared using simulations of realistic scenarios from the satellite attitude dynamics field.
NASA Astrophysics Data System (ADS)
Collart, T. G.; Stacey, W. M.
2015-11-01
Several methods are presented for extending the traditional analytic ``circular'' representation of flux-surface aligned curvilinear coordinate systems to more accurately describe equilibrium plasma geometry and magnetic fields in DIII-D. The formalism originally presented by Miller is extended to include different poloidal variations in the upper and lower hemispheres. A coordinate system based on separate Fourier expansions of major radius and vertical position greatly improves accuracy in edge plasma structure representation. Scale factors and basis vectors for a system formed by expanding the circular model minor radius can be represented using linear combinations of Fourier basis functions. A general method for coordinate system orthogonalization is presented and applied to all curvilinear models. A formalism for the magnetic field structure in these curvilinear models is presented, and the resulting magnetic field predictions are compared against calculations performed in a Cartesian system using an experimentally based EFIT prediction for the Grad-Shafranov equilibrium. Supported by: US DOE under DE-FG02-00ER54538.
Optimized Temporal Monitors for SystemC
NASA Technical Reports Server (NTRS)
Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.
2012-01-01
SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
Teacher Leaders' Work with Peers in a Quasi-Formal Teacher Leadership Model
ERIC Educational Resources Information Center
Supovitz, Jonathan A.
2018-01-01
Building on evolving conceptions of teacher leadership in the literature, this article argues that an integration of both positional and empowering elements of teacher leadership are the seeds of an evolved approach to teacher leadership for instructional improvement. Using data from a study of quasi-formal teacher leadership, the research…
Formalization of the Access Control on ARM-Android Platform with the B Method
NASA Astrophysics Data System (ADS)
Ren, Lu; Wang, Wei; Zhu, Xiaodong; Man, Yujia; Yin, Qing
2018-01-01
ARM-Android is a widespread mobile platform with multi-layer access control mechanisms, security-critical in the system. Many access control vulnerabilities still exist due to the course-grained policy and numerous engineering defects, which have been widely studied. However, few researches focus on the mechanism formalization, including the Android permission framework, kernel process management and hardware isolation. This paper first develops a comprehensive formal access control model on the ARM-Android platform using the B method, from the Android middleware to hardware layer. All the model specifications are type checked and proved to be well-defined, with 75%of proof obligations demonstrated automatically. The results show that the proposed B model is feasible to specify and verify access control schemes in the ARM-Android system, and capable of implementing a practical control module.
Stochastic Game Analysis and Latency Awareness for Self-Adaptation
2014-01-01
this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the...Additional Key Words and Phrases: Proactive adaptation, Stochastic multiplayer games , Latency 1. INTRODUCTION When planning how to adapt, self-adaptive...contribution of this paper is twofold: (1) A novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to
ERIC Educational Resources Information Center
Haas, Lory E.
2011-01-01
Three main purposes provided the foundation for this study. The first purpose was to investigate academic achievement through analyses of data obtained through formal and informal assessments among kindergarten through eighth grade students who participated in a Head Start program, center-based care program, or home-based care prior to school…
Systems, methods and apparatus for pattern matching in procedure development and verification
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.
Essers, Geurt; Van Weel-Baumgarten, Evelyn; Bolhuis, Sanneke
2012-01-01
Medical students learn professional communication through formal training and in clinical practice. Physicians working in clinical practice have a powerful influence on student learning. However, they may demonstrate communication behaviours not aligning with recommendations in training programs. This study aims to identify more precisely what differences students perceive between role model communication behaviour during clerkships and formal training. In a cross-sectional study, data were collected about physicians' communication performance as perceived by students. Students filled out a questionnaire in four different clerkships in their fourth and fifth year. Just over half of the students reported communication similar to formal training. This was especially true for students in the later clerkships (paediatrics and primary care). Good examples were seen in providing information corresponding to patients' needs and in shared decision making, although students often noted that in fact the doctor made the decision. Bad examples were observed in exploring cognitions and emotions, and in providing information meeting patient's pace. Further study is needed on actual physician behaviour in clinical practice. From our results, we conclude that students need help in reflecting on and learning from the gap in communication patterns they observe in training versus clinical practice.
A graph grammar approach to artificial life.
Kniemeyer, Ole; Buck-Sorlin, Gerhard H; Kurth, Winfried
2004-01-01
We present the high-level language of relational growth grammars (RGGs) as a formalism designed for the specification of ALife models. RGGs can be seen as an extension of the well-known parametric Lindenmayer systems and contain rule-based, procedural, and object-oriented features. They are defined as rewriting systems operating on graphs with the edges coming from a set of user-defined relations, whereas the nodes can be associated with objects. We demonstrate their ability to represent genes, regulatory networks of metabolites, and morphologically structured organisms, as well as developmental aspects of these entities, in a common formal framework. Mutation, crossing over, selection, and the dynamics of a network of gene regulation can all be represented with simple graph rewriting rules. This is demonstrated in some detail on the classical example of Dawkins' biomorphs and the ABC model of flower morphogenesis: other applications are briefly sketched. An interactive program was implemented, enabling the execution of the formalism and the visualization of the results.
Principles of proteome allocation are revealed using proteomic data and genome-scale models
Yang, Laurence; Yurkovich, James T.; Lloyd, Colton J.; Ebrahim, Ali; Saunders, Michael A.; Palsson, Bernhard O.
2016-01-01
Integrating omics data to refine or make context-specific models is an active field of constraint-based modeling. Proteomics now cover over 95% of the Escherichia coli proteome by mass. Genome-scale models of Metabolism and macromolecular Expression (ME) compute proteome allocation linked to metabolism and fitness. Using proteomics data, we formulated allocation constraints for key proteome sectors in the ME model. The resulting calibrated model effectively computed the “generalist” (wild-type) E. coli proteome and phenotype across diverse growth environments. Across 15 growth conditions, prediction errors for growth rate and metabolic fluxes were 69% and 14% lower, respectively. The sector-constrained ME model thus represents a generalist ME model reflecting both growth rate maximization and “hedging” against uncertain environments and stresses, as indicated by significant enrichment of these sectors for the general stress response sigma factor σS. Finally, the sector constraints represent a general formalism for integrating omics data from any experimental condition into constraint-based ME models. The constraints can be fine-grained (individual proteins) or coarse-grained (functionally-related protein groups) as demonstrated here. This flexible formalism provides an accessible approach for narrowing the gap between the complexity captured by omics data and governing principles of proteome allocation described by systems-level models. PMID:27857205
Principles of proteome allocation are revealed using proteomic data and genome-scale models
Yang, Laurence; Yurkovich, James T.; Lloyd, Colton J.; ...
2016-11-18
Integrating omics data to refine or make context-specific models is an active field of constraint-based modeling. Proteomics now cover over 95% of the Escherichia coli proteome by mass. Genome-scale models of Metabolism and macromolecular Expression (ME) compute proteome allocation linked to metabolism and fitness. Using proteomics data, we formulated allocation constraints for key proteome sectors in the ME model. The resulting calibrated model effectively computed the “generalist” (wild-type) E. coli proteome and phenotype across diverse growth environments. Across 15 growth conditions, prediction errors for growth rate and metabolic fluxes were 69% and 14% lower, respectively. The sector-constrained ME model thusmore » represents a generalist ME model reflecting both growth rate maximization and “hedging” against uncertain environments and stresses, as indicated by significant enrichment of these sectors for the general stress response sigma factor σS. Finally, the sector constraints represent a general formalism for integrating omics data from any experimental condition into constraint-based ME models. The constraints can be fine-grained (individual proteins) or coarse-grained (functionally-related protein groups) as demonstrated here. Furthermore, this flexible formalism provides an accessible approach for narrowing the gap between the complexity captured by omics data and governing principles of proteome allocation described by systems-level models.« less
A knowledge based software engineering environment testbed
NASA Technical Reports Server (NTRS)
Gill, C.; Reedy, A.; Baker, L.
1985-01-01
The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artifical Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.
1989-01-01
It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based contral paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
Can Critical Management Education Be Critical in a Formal Higher Educational Setting?
ERIC Educational Resources Information Center
Choo, Kok Leong
2007-01-01
This article attempts to establish the scope of critical management education in a formal educational setting. It is based on an empirical study of 24 academic staffs' experience of engaging critical management education in four UK University Business Schools. The study seems to show that there are significant barriers to and potential pitfalls in…
Mapping the work-based learning of novice teachers: charting some rich terrain.
Cook, Vivien
2009-12-01
Work-based non-formal learning plays a key role in faculty development yet these processes are yet to be described in detail in medical education. This study sets out to illuminate these processes so that potential benefits for new and inexperienced medical educators and their mentors can be realised. The non-formal learning processes of 12 novice teachers were investigated across hospital, general practice and medical school settings. The research sought to describe 'what' and 'how' non-formal learning takes place, and whether these processes differ across teaching sites. Both clinical and non-clinical teachers of medical undergraduates from one inner city medical school were recruited for the study. Through semi-structured interviews and a 'concept map', participants were asked to identify the people and tasks which they considered central to helping them become more expert as educators. Results identified non-formal learning across a number of key dimensions, including personal development, task and role performance, and optimising clinical teaching. This learning takes place as an outcome of experience, observation, reflection and student feedback. Non-formal learning is a significant aspect of the development of novice teachers and as such it needs to be placed more firmly upon the agenda of faculty development.
A workforce in crisis: a case study to expand allied ophthalmic personnel.
Astle, William; Simms, Craig; Anderson, Lynn
2016-08-01
To examine how the development of allied ophthalmic personnel training programs affects human resource capacity. Using a qualitative case study method conducted at a single Ontario institution, this article describes 6 years of establishing a 2-tiered allied ophthalmic personnel training program. The Kingston Ophthalmic Training Centre participated in the study with 8 leadership and program graduate interviews. To assess regional eye health workforce needs, a case study and iterative process used triangulations of the literature, case study, and qualitative interviews with stakeholders. This research was used to develop a model for establishing allied ophthalmic personnel training programs that would result in expanding human resource capacity. Current human resource capacity development and deployment is inadequate to provide the needed eye care services in Canada. A competency-based curriculum and accreditation model as the platform to develop formal academic training programs is essential. Access to quality eye care and patient services can be met by task-shifting from ophthalmologists to appropriately trained allied ophthalmic personnel. Establishing formal training programs is one important strategy to supplying a well-skilled, trained, and qualified ophthalmic workforce. This initiative meets the criteria required for quality, relevance, equity, and cost-effectiveness to meet the future demands for ophthalmic patient care. Copyright © 2016 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
Formal Representations of Eligibility Criteria: A Literature Review
Weng, Chunhua; Tu, Samson W.; Sim, Ida; Richesson, Rachel
2010-01-01
Standards-based, computable knowledge representations for eligibility criteria are increasingly needed to provide computer-based decision support for automated research participant screening, clinical evidence application, and clinical research knowledge management. We surveyed the literature and identified five aspects of eligibility criteria knowledge representations that contribute to the various research and clinical applications: the intended use of computable eligibility criteria, the classification of eligibility criteria, the expression language for representing eligibility rules, the encoding of eligibility concepts, and the modeling of patient data. We consider three of them (expression language, codification of eligibility concepts, and patient data modeling), to be essential constructs of a formal knowledge representation for eligibility criteria. The requirements for each of the three knowledge constructs vary for different use cases, which therefore should inform the development and choice of the constructs toward cost-effective knowledge representation efforts. We discuss the implications of our findings for standardization efforts toward sharable knowledge representation of eligibility criteria. PMID:20034594
Heuristics, Interactions, and Status Hierarchies: An Agent-Based Model of Deference Exchange
ERIC Educational Resources Information Center
Manzo, Gianluca; Baldassarri, Delia
2015-01-01
Since Merton's classical analysis of cumulative advantage in science, it has been observed that status hierarchies display a sizable disconnect between actors' quality and rank and that they become increasingly asymmetric over time, without, however, turning into winner-take-all structures. In recent years, formal models of status hierarchies…
ERIC Educational Resources Information Center
DeStefano, Joseph; Moore, Audrey-Marie Schuh; Balwanz, David; Hartwell, Ash
2007-01-01
In 2004, the United States Agency for International Development (USAID)-funded Educational Quality Improvement Program 2 (EQUIP2) began investigating community-based schools as a mechanism for reaching the underserved populations. The team identified nine models that successfully organized schooling in regions least served by the formal education…
Formalism Challenges of the Cougaar Model Driven Architecture
NASA Technical Reports Server (NTRS)
Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.
2004-01-01
The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed today. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipiting future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large and transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove the approach.
Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data
NASA Astrophysics Data System (ADS)
Yu, Q.; Helmholz, P.; Belton, D.; West, G.
2014-04-01
The automatic reconstruction of 3D buildings has been an important research topic during the last years. In this paper, a novel method is proposed to automatically reconstruct the 3D building models from segmented data based on pre-defined formal grammar and rules. Such segmented data can be extracted e.g. from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format which is a CAD data file format used for data interchange between AutoCAD and other program. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the different segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows the model reconstruction directly from 3D shapes and takes the whole building into account.
NASA Astrophysics Data System (ADS)
Grytskyy, Dmytro; Diesmann, Markus; Helias, Moritz
2016-06-01
Self-organized structures in networks with spike-timing dependent synaptic plasticity (STDP) are likely to play a central role for information processing in the brain. In the present study we derive a reaction-diffusion-like formalism for plastic feed-forward networks of nonlinear rate-based model neurons with a correlation sensitive learning rule inspired by and being qualitatively similar to STDP. After obtaining equations that describe the change of the spatial shape of the signal from layer to layer, we derive a criterion for the nonlinearity necessary to obtain stable dynamics for arbitrary input. We classify the possible scenarios of signal evolution and find that close to the transition to the unstable regime metastable solutions appear. The form of these dissipative solitons is determined analytically and the evolution and interaction of several such coexistent objects is investigated.
Richter Sundberg, Linda; Garvare, Rickard; Nyström, Monica Elisabeth
2017-05-11
The judgment and decision making process during guideline development is central for producing high-quality clinical practice guidelines, but the topic is relatively underexplored in the guideline research literature. We have studied the development process of national guidelines with a disease-prevention scope produced by the National board of Health and Welfare (NBHW) in Sweden. The NBHW formal guideline development model states that guideline recommendations should be based on five decision-criteria: research evidence; curative/preventive effect size, severity of the condition; cost-effectiveness; and ethical considerations. A group of health profession representatives (i.e. a prioritization group) was assigned the task of ranking condition-intervention pairs for guideline recommendations, taking into consideration the multiple decision criteria. The aim of this study was to investigate the decision making process during the two-year development of national guidelines for methods of preventing disease. A qualitative inductive longitudinal case study approach was used to investigate the decision making process. Questionnaires, non-participant observations of nine two-day group meetings, and documents provided data for the analysis. Conventional and summative qualitative content analysis was used to analyse data. The guideline development model was modified ad-hoc as the group encountered three main types of dilemmas: high quality evidence vs. low adoptability of recommendation; insufficient evidence vs. high urgency to act; and incoherence in assessment and prioritization within and between four different lifestyle areas. The formal guideline development model guided the decision-criteria used, but three new or revised criteria were added by the group: 'clinical knowledge and experience', 'potential guideline consequences' and 'needs of vulnerable groups'. The frequency of the use of various criteria in discussions varied over time. Gender, professional status, and interpersonal skills were perceived to affect individuals' relative influence on group discussions. The study shows that guideline development groups make compromises between rigour and pragmatism. The formal guideline development model incorporated multiple aspects, but offered few details on how the different criteria should be handled. The guideline development model devoted little attention to the role of the decision-model and group-related factors. Guideline development models could benefit from clarifying the role of the group-related factors and non-research evidence, such as clinical experience and ethical considerations, in decision-processes during guideline development.
NASA Astrophysics Data System (ADS)
Nabi, Jameel-Un; Ishfaq, Mavra; Böyükata, Mahmut; Riaz, Muhammad
2017-10-01
At finite temperatures (≥ 107K), 76Se is abundant in the core of massive stars and electron capture on 76Se has a consequential role to play in the dynamics of core-collapse. The present work may be classified into two main categories. In the first phase we study the nuclear structure properties of 76Se using the interacting boson model-1 (IBM-1). The IBM-1 investigations include the energy levels, B (E 2) values and the prediction of the geometry. We performed the extended consistent-Q formalism (ECQF) calculation and later the triaxial formalism calculation (constructed by adding the cubic term to the ECQF). The geometry of 76Se can be envisioned within the formalism of the potential energy surface based on the classical limit of IBM-1 model. In the second phase, we reconfirm the unblocking of the Gamow-Teller (GT) strength in 76Se (a test case for nuclei having N > 40 and Z < 40). Using the deformed pn-QRPA model we calculate GT transitions, stellar electron capture cross section (within the limit of low momentum transfer) and stellar weak rates for 76Se. The distinguishing feature of our calculation is a state-by-state evaluation of stellar weak rates in a fully microscopic fashion. Results are compared with experimental data and previous calculations. The calculated GT distribution fulfills the Ikeda sum rule. Rates for β-delayed neutrons and emission probabilities are also calculated. Our study suggests that at high stellar temperatures and low densities, the β+-decay on 76Se should not be neglected and needs to be taken into consideration along with electron capture rates for simulation of presupernova evolution of massive stars.
Longitudinal Associations Between Formal Volunteering and Cognitive Functioning.
Proulx, Christine M; Curl, Angela L; Ermer, Ashley E
2018-03-02
The present study examines the association between formal volunteering and cognitive functioning over time. We also examine the moderating roles of race, sex, education, and time. Using 11,100 participants aged 51 years and older and nine waves of data from the Health and Retirement Survey, we simultaneously modeled the longitudinal associations between engaging in formal volunteering and changes in cognitive functioning using multilevel models. Formal volunteering was associated with higher levels of cognitive functioning over time, especially with aspects of cognitive functioning related to working memory and processing. This association was stronger for women than it was for men, and for those with below average levels of education. The positive association between formal volunteering and cognitive functioning weakened over time when cognitive functioning was conceptualized as memory, but strengthened over time when conceptualized as working memory and processing. Volunteering is a productive activity that is beneficial not just to society, but to volunteers' levels of cognitive functioning in older age. For women and those with lower levels of education, formal volunteering appears particularly beneficial to working memory and processing. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&2) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current VBCV practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration info existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
Modular analysis of biological networks.
Kaltenbach, Hans-Michael; Stelling, Jörg
2012-01-01
The analysis of complex biological networks has traditionally relied on decomposition into smaller, semi-autonomous units such as individual signaling pathways. With the increased scope of systems biology (models), rational approaches to modularization have become an important topic. With increasing acceptance of de facto modularity in biology, widely different definitions of what constitutes a module have sparked controversies. Here, we therefore review prominent classes of modular approaches based on formal network representations. Despite some promising research directions, several important theoretical challenges remain open on the way to formal, function-centered modular decompositions for dynamic biological networks.
Knowledge bases built on web languages from the point of view of predicate logics
NASA Astrophysics Data System (ADS)
Vajgl, Marek; Lukasová, Alena; Žáček, Martin
2017-06-01
The article undergoes evaluation of formal systems created on the base of web (ontology/concept) languages by simplifying the usual approach of knowledge representation within the FOPL, but sharing its expressiveness, semantic correct-ness, completeness and decidability. Evaluation of two of them - that one based on description logic and that one built on RDF model principles - identifies some of the lacks of those formal systems and presents, if possible, corrections of them. Possibilities to build an inference system capable to obtain new further knowledge over given knowledge bases including those describing domains by giant linked domain databases has been taken into account. Moreover, the directions towards simplifying FOPL language discussed here has been evaluated from the point of view of a possibility to become a web language for fulfilling an idea of semantic web.
Pérès, Sabine; Felicori, Liza; Rialle, Stéphanie; Jobard, Elodie; Molina, Franck
2010-01-01
Motivation: In the available databases, biological processes are described from molecular and cellular points of view, but these descriptions are represented with text annotations that make it difficult to handle them for computation. Consequently, there is an obvious need for formal descriptions of biological processes. Results: We present a formalism that uses the BioΨ concepts to model biological processes from molecular details to networks. This computational approach, based on elementary bricks of actions, allows us to calculate on biological functions (e.g. process comparison, mapping structure–function relationships, etc.). We illustrate its application with two examples: the functional comparison of proteases and the functional description of the glycolysis network. This computational approach is compatible with detailed biological knowledge and can be applied to different kinds of systems of simulation. Availability: www.sysdiag.cnrs.fr/publications/supplementary-materials/BioPsi_Manager/ Contact: sabine.peres@sysdiag.cnrs.fr; franck.molina@sysdiag.cnrs.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20448138
"They just know": the epistemological politics of "evidence-based" non-formal education.
Archibald, Thomas
2015-02-01
Community education and outreach programs should be evidence-based. This dictum seems at once warranted, welcome, and slightly platitudinous. However, the "evidence-based" movement's more narrow definition of evidence--privileging randomized controlled trials as the "gold standard"--has fomented much debate. Such debate, though insightful, often lacks grounding in actual practice. To address that lack, the purpose of the study presented in this paper was to examine what actually happens, in practice, when people support the implementation of evidence-based programs (EBPs) or engage in related efforts to make non-formal education more "evidence-based." Focusing on three cases--two adolescent sexual health projects (one in the United States and one in Kenya) and one more general youth development organization--I used qualitative methods to address the questions: (1) How is evidence-based program and evidence-based practice work actually practiced? (2) What perspectives and assumptions about what non-formal education is are manifested through that work? and (3) What conflicts and tensions emerge through that work related to those perspectives and assumptions? Informed by theoretical perspectives on the intersection of science, expertise, and democracy, I conclude that the current dominant approach to making non-formal education more evidence-based by way of EBPs is seriously flawed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Weaver, Sallie J; Mossburg, Sarah E; Pillari, MarieSarah; Kent, Paula S; Daugherty Biddison, Elizabeth Lee
This study explored similarities and differences in the views on team membership and leadership held by nurses in formal unit leadership positions and direct care nurses. We used a mixed-methods approach and a maximum variance sampling strategy, sampling from units with both high and low safety behaviors and safety culture scores. We identified several key differences in mental models of care team membership and leadership between formal leaders and direct care nurses that warrant further exploration.
Discounting of reward sequences: a test of competing formal models of hyperbolic discounting
Zarr, Noah; Alexander, William H.; Brown, Joshua W.
2014-01-01
Humans are known to discount future rewards hyperbolically in time. Nevertheless, a formal recursive model of hyperbolic discounting has been elusive until recently, with the introduction of the hyperbolically discounted temporal difference (HDTD) model. Prior to that, models of learning (especially reinforcement learning) have relied on exponential discounting, which generally provides poorer fits to behavioral data. Recently, it has been shown that hyperbolic discounting can also be approximated by a summed distribution of exponentially discounted values, instantiated in the μAgents model. The HDTD model and the μAgents model differ in one key respect, namely how they treat sequences of rewards. The μAgents model is a particular implementation of a Parallel discounting model, which values sequences based on the summed value of the individual rewards whereas the HDTD model contains a non-linear interaction. To discriminate among these models, we observed how subjects discounted a sequence of three rewards, and then we tested how well each candidate model fit the subject data. The results show that the Parallel model generally provides a better fit to the human data. PMID:24639662
A hierarchical model for probabilistic independent component analysis of multi-subject fMRI studies
Tang, Li
2014-01-01
Summary An important goal in fMRI studies is to decompose the observed series of brain images to identify and characterize underlying brain functional networks. Independent component analysis (ICA) has been shown to be a powerful computational tool for this purpose. Classic ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix. Existing group ICA methods generally concatenate observed fMRI data across subjects on the temporal domain and then decompose multi-subject data in a similar manner to single-subject ICA. The major limitation of existing methods is that they ignore between-subject variability in spatial distributions of brain functional networks in group ICA. In this paper, we propose a new hierarchical probabilistic group ICA method to formally model subject-specific effects in both temporal and spatial domains when decomposing multi-subject fMRI data. The proposed method provides model-based estimation of brain functional networks at both the population and subject level. An important advantage of the hierarchical model is that it provides a formal statistical framework to investigate similarities and differences in brain functional networks across subjects, e.g., subjects with mental disorders or neurodegenerative diseases such as Parkinson’s as compared to normal subjects. We develop an EM algorithm for model estimation where both the E-step and M-step have explicit forms. We compare the performance of the proposed hierarchical model with that of two popular group ICA methods via simulation studies. We illustrate our method with application to an fMRI study of Zen meditation. PMID:24033125
Addressing Pediatric Health Concerns through School-Based Consultation
ERIC Educational Resources Information Center
Truscott, Stephen D.; Albritton, Kizzy
2011-01-01
In schools, the term "consultation" has multiple meanings. Often it is used to describe a quick, informal process of advice giving between teachers and/or school specialists. As a formal discipline, School-Based Consultation (SBC) is an indirect service delivery model that involves two or more parties working together to benefit students. Most…
Crane, Catherine; Crane, Rebecca S.; Eames, Catrin; Fennell, Melanie J.V.; Silverton, Sarah; Williams, J. Mark G.; Barnhofer, Thorsten
2014-01-01
Few empirical studies have explored the associations between formal and informal mindfulness home practice and outcome in Mindfulness-based Cognitive Therapy (MBCT). In this study ninety-nine participants randomised to MBCT in a multi-centre randomised controlled trial completed self-reported ratings of home practice over 7 treatment weeks. Recurrence of Major Depression was assessed immediately after treatment, and at 3, 6, 9, and 12-months post-treatment. Results identified a significant association between mean daily duration of formal home practice and outcome and additionally indicated that participants who reported that they engaged in formal home practice on at least 3 days a week during the treatment phase were almost half as likely to relapse as those who reported fewer days of formal practice. These associations were independent of the potentially confounding variable of participant-rated treatment plausibility. The current study identified no significant association between informal home practice and outcome, although this may relate to the inherent difficulties in quantifying informal home mindfulness practice. These findings have important implications for clinicians discussing mindfulness-based interventions with their participants, in particular in relation to MBCT, where the amount of participant engagement in home practice appears to have a significant positive impact on outcome. PMID:25261599
ERIC Educational Resources Information Center
Raffo, Carlo; O'Connor, Justin; Lovatt, Andy; Banks, Mark
2000-01-01
Presents arguments supporting a social model of learning linked to situated learning and cultural capital. Critiques training methods used in cultural industries (arts, publishing, broadcasting, design, fashion, restaurants). Uses case study evidence to demonstrates inadequacies of formal training in this sector. (Contains 49 references.) (SK)
"Taking Charge of One's Life": A Model for Weight Management Success
ERIC Educational Resources Information Center
Adams, Marlene
2008-01-01
Obesity is a serious, prevalent, and refractory disorder that increases with age particularly in women who enroll in formal weight loss treatments. This study examined the processes used by obese postmenopausal women as they participated in a formal weight loss program. Using grounded theory, interviews were conducted with 14 women engaged in a…
A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories
NASA Technical Reports Server (NTRS)
Narkawicz, Anthony; Munoz, Cesar
2015-01-01
In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
Simulating Technology Processes to Foster Learning.
ERIC Educational Resources Information Center
Krumholtz, Nira
1998-01-01
Based on a spiral model of technology evolution, elementary students used LOGO computer software to become both developers and users of technology. The computerized environment enabled 87% to reach intuitive understanding of physical concepts; 24% expressed more formal scientific understanding. (SK)
Inductive Reasoning about Causally Transmitted Properties
ERIC Educational Resources Information Center
Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D.; Tenenbaum, Joshua B.
2008-01-01
Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates'…
A Comparative Study of Pre-Service Education for Preschool Teachers in China and the United States
ERIC Educational Resources Information Center
Gong, Xin; Wang, Pengcheng
2017-01-01
This study provides a comparative analysis of the pre-service education system for preschool educators in China and the United States. Based on collected data and materials (literature, policy documents, and statistical data), we compare two areas of pre-service training: (1) the formal system; (2) the informal system. In the formal system, most…
McAlearney, Ann Scheck; Hefner, Jennifer L; Sieck, Cynthia; Rizer, Milisa; Huerta, Timothy R
2014-07-01
While electronic health record (EHR) systems have potential to drive improvements in healthcare, a majority of EHR implementations fall short of expectations. Shortcomings in implementations are often due to organizational issues around the implementation process rather than technological problems. Evidence from both the information technology and healthcare management literature can be applied to improve the likelihood of implementation success, but the translation of this evidence into practice has not been widespread. Our objective was to comprehensively study and synthesize best practices for managing ambulatory EHR system implementation in healthcare organizations, highlighting applicable management theories and successful strategies. We held 45 interviews with key informants in six U.S. healthcare organizations purposively selected based on reported success with ambulatory EHR implementation. We also conducted six focus groups comprised of 37 physicians. Interview and focus group transcripts were analyzed using both deductive and inductive methods to answer research questions and explore emergent themes. We suggest that successful management of ambulatory EHR implementation can be guided by the Plan-Do-Study-Act (PDSA) quality improvement (QI) model. While participants did not acknowledge nor emphasize use of this model, we found evidence that successful implementation practices could be framed using the PDSA model. Additionally, successful sites had three strategies in common: 1) use of evidence from published health information technology (HIT) literature emphasizing implementation facilitators; 2) focusing on workflow; and 3) incorporating critical management factors that facilitate implementation. Organizations seeking to improve ambulatory EHR implementation processes can use frameworks such as the PDSA QI model to guide efforts and provide a means to formally accommodate new evidence over time. Implementing formal management strategies and incorporating new evidence through the PDSA model is a key element of evidence-based management and a crucial way for organizations to position themselves to proactively address implementation and use challenges before they are exacerbated. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Non-Markovian closure models for large eddy simulations using the Mori-Zwanzig formalism
NASA Astrophysics Data System (ADS)
Parish, Eric J.; Duraisamy, Karthik
2017-01-01
This work uses the Mori-Zwanzig (M-Z) formalism, a concept originating from nonequilibrium statistical mechanics, as a basis for the development of coarse-grained models of turbulence. The mechanics of the generalized Langevin equation (GLE) are considered, and insight gained from the orthogonal dynamics equation is used as a starting point for model development. A class of subgrid models is considered which represent nonlocal behavior via a finite memory approximation [Stinis, arXiv:1211.4285 (2012)], the length of which is determined using a heuristic that is related to the spectral radius of the Jacobian of the resolved variables. The resulting models are intimately tied to the underlying numerical resolution and are capable of approximating non-Markovian effects. Numerical experiments on the Burgers equation demonstrate that the M-Z-based models can accurately predict the temporal evolution of the total kinetic energy and the total dissipation rate at varying mesh resolutions. The trajectory of each resolved mode in phase space is accurately predicted for cases where the coarse graining is moderate. Large eddy simulations (LESs) of homogeneous isotropic turbulence and the Taylor-Green Vortex show that the M-Z-based models are able to provide excellent predictions, accurately capturing the subgrid contribution to energy transfer. Last, LESs of fully developed channel flow demonstrate the applicability of M-Z-based models to nondecaying problems. It is notable that the form of the closure is not imposed by the modeler, but is rather derived from the mathematics of the coarse graining, highlighting the potential of M-Z-based techniques to define LES closures.
Description logic-based methods for auditing frame-based medical terminological systems.
Cornet, Ronald; Abu-Hanna, Ameen
2005-07-01
Medical terminological systems (TSs) play an increasingly important role in health care by supporting recording, retrieval and analysis of patient information. As the size and complexity of TSs are growing, the need arises for means to audit them, i.e. verify and maintain (logical) consistency and (semantic) correctness of their contents. This is not only important for the management of TSs but also for providing their users with confidence about the reliability of their contents. Formal methods have the potential to play an important role in the audit of TSs, although there are few empirical studies to assess the benefits of using these methods. In this paper we propose a method based on description logics (DLs) for the audit of TSs. This method is based on the migration of the medical TS from a frame-based representation to a DL-based one. Our method is characterized by a process in which initially stringent assumptions are made about concept definitions. The assumptions allow the detection of concepts and relations that might comprise a source of logical inconsistency. If the assumptions hold then definitions are to be altered to eliminate the inconsistency, otherwise the assumptions are revised. In order to demonstrate the utility of the approach in a real-world case study we audit a TS in the intensive care domain and discuss decisions pertaining to building DL-based representations. This case study demonstrates that certain types of inconsistencies can indeed be detected by applying the method to a medical terminological system. The added value of the method described in this paper is that it provides a means to evaluate the compliance to a number of common modeling principles in a formal manner. The proposed method reveals potential modeling inconsistencies, helping to audit and (if possible) improve the medical TS. In this way, it contributes to providing confidence in the contents of the terminological system.
Viscous warm inflation: Hamilton-Jacobi formalism
NASA Astrophysics Data System (ADS)
Akhtari, L.; Mohammadi, A.; Sayar, K.; Saaidi, Kh.
2017-04-01
Using Hamilton-Jacobi formalism, the scenario of warm inflation with viscous pressure is considered. The formalism gives a way of computing the slow-rolling parameter without extra approximation, and it is well-known as a powerful method in cold inflation. The model is studied in detail for three different cases of the dissipation and bulk viscous pressure coefficients. In the first case where both coefficients are taken as constant, it is shown that the case could not portray warm inflationary scenario compatible with observational data even it is possible to restrict the model parameters. For other cases, the results shows that the model could properly predicts the perturbation parameters in which they stay in perfect agreement with Planck data. As a further argument, r -ns and αs -ns are drown that show the acquired result could stand in acceptable area expressing a compatibility with observational data.
Informal and Formal Learning of General Practitioners
ERIC Educational Resources Information Center
Spaan, Nadia Roos; Dekker, Anne R. J.; van der Velden, Alike W.; de Groot, Esther
2016-01-01
Purpose: The purpose of this study is to understand the influence of formal learning from a web-based training and informal (workplace) learning afterwards on the behaviour of general practitioners (GPs) with respect to prescription of antibiotics. Design/methodology/approach: To obtain insight in various learning processes, semi-structured…
Testing the effectiveness of family therapeutic assessment: a case study using a time-series design.
Smith, Justin D; Wolf, Nicole J; Handler, Leonard; Nash, Michael R
2009-11-01
We describe a family Therapeutic Assessment (TA) case study employing 2 assessors, 2 assessment rooms, and a video link. In the study, we employed a daily measures time-series design with a pretreatment baseline and follow-up period to examine the family TA treatment model. In addition to being an illustrative addition to a number of clinical reports suggesting the efficacy of family TA, this study is the first to apply a case-based time-series design to test whether family TA leads to clinical improvement and also illustrates when that improvement occurs. Results support the trajectory of change proposed by Finn (2007), the TA model's creator, who posits that benefits continue beyond the formal treatment itself.
Modular Knowledge Representation and Reasoning in the Semantic Web
NASA Astrophysics Data System (ADS)
Serafini, Luciano; Homola, Martin
Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of the ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising of a set of modules, it is desired to perform reasoning by combination of multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics has been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms build on formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics and Integrated Distributed Description Logics. We concentrate on expressivity and distinctive modeling features of each framework. We also discuss reasoning capabilities of each framework.
Improving Learner Outcomes in Lifelong Education: Formal Pedagogies in Non-Formal Learning Contexts?
ERIC Educational Resources Information Center
Zepke, Nick; Leach, Linda
2006-01-01
This article explores how far research findings about successful pedagogies in formal post-school education might be used in non-formal learning contexts--settings where learning may not lead to formal qualifications. It does this by examining a learner outcomes model adapted from a synthesis of research into retention. The article first…
A Jury of Their Peers: A Meta-Analysis of the Effects of Teen Court on Criminal Recidivism.
Bouchard, Jessica; Wong, Jennifer S
2017-07-01
Juvenile delinquency has been on the decline for a number of years, yet, juvenile courts continue to assess more than 1 million cases per year. Involvement with the juvenile justice system has been linked to a number of risk factors and consequences that may impact positive youth development; however, evidence-based correctional programs that divert juvenile offenders away from formal processing are limited. Teen Court is a specialized diversion intervention that offers an alternative to traditional court processing for juvenile offenders. Despite the rapid expansion of Teen Courts, there is little comprehensive and systematic evidence available to justify this expansion. This meta-analytic study examines the effects of Teen Court on the recidivism of juvenile offenders. The literature search resulted in the selection of 14 studies, which contributed 18 unique effect sizes with a total sample of 2125 treatment group and 979 comparison group youth. The findings suggest that Teen Court is no more effective at reducing recidivism than (a) formal processing or (b) other diversion programs. Implications of formal and informal court processing for low-risk, first-time young offenders are discussed. The authors draw on the Risk-Need-Responsivity model to provide recommendations for policies and practices.
Sensitivity analysis of Repast computational ecology models with R/Repast.
Prestes García, Antonio; Rodríguez-Patón, Alfonso
2016-12-01
Computational ecology is an emerging interdisciplinary discipline founded mainly on modeling and simulation methods for studying ecological systems. Among the existing modeling formalisms, the individual-based modeling is particularly well suited for capturing the complex temporal and spatial dynamics as well as the nonlinearities arising in ecosystems, communities, or populations due to individual variability. In addition, being a bottom-up approach, it is useful for providing new insights on the local mechanisms which are generating some observed global dynamics. Of course, no conclusions about model results could be taken seriously if they are based on a single model execution and they are not analyzed carefully. Therefore, a sound methodology should always be used for underpinning the interpretation of model results. The sensitivity analysis is a methodology for quantitatively assessing the effect of input uncertainty in the simulation output which should be incorporated compulsorily to every work based on in-silico experimental setup. In this article, we present R/Repast a GNU R package for running and analyzing Repast Simphony models accompanied by two worked examples on how to perform global sensitivity analysis and how to interpret the results.
NASA Astrophysics Data System (ADS)
Fiorani, D.; Acierno, M.
2017-05-01
The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both BIM environment and ontology formalisation. Although BIM has been successfully experimented within AEC (Architecture Engineering Construction) field, it has showed many drawbacks for architectural heritage. To cope with unicity and more generally complexity of ancient buildings, applications so far developed have shown to poorly adapt BIM to conservation design with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine achievements reached within AEC through BIM environment (design control and management) with an appropriate, semantically enriched and flexible The presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage an its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within architectural criticism and conservation scope. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed on decay analysis and surfaces conservation project.
An engineering approach to automatic programming
NASA Technical Reports Server (NTRS)
Rubin, Stuart H.
1990-01-01
An exploratory study of the automatic generation and optimization of symbolic programs using DECOM - a prototypical requirement specification model implemented in pure LISP was undertaken. It was concluded, on the basis of this study, that symbolic processing languages such as LISP can support a style of programming based upon formal transformation and dependent upon the expression of constraints in an object-oriented environment. Such languages can represent all aspects of the software generation process (including heuristic algorithms for effecting parallel search) as dynamic processes since data and program are represented in a uniform format.
Drug target ontology to classify and integrate drug discovery data.
Lin, Yu; Mehta, Saurabh; Küçük-McGinty, Hande; Turner, John Paul; Vidovic, Dusica; Forlin, Michele; Koleti, Amar; Nguyen, Dac-Trung; Jensen, Lars Juhl; Guha, Rajarshi; Mathias, Stephen L; Ursu, Oleg; Stathias, Vasileios; Duan, Jianbin; Nabizadeh, Nooshin; Chung, Caty; Mader, Christopher; Visser, Ubbo; Yang, Jeremy J; Bologa, Cristian G; Oprea, Tudor I; Schürer, Stephan C
2017-11-09
One of the most successful approaches to develop new small molecule therapeutics has been to start from a validated druggable protein target. However, only a small subset of potentially druggable targets has attracted significant research and development resources. The Illuminating the Druggable Genome (IDG) project develops resources to catalyze the development of likely targetable, yet currently understudied prospective drug targets. A central component of the IDG program is a comprehensive knowledge resource of the druggable genome. As part of that effort, we have developed a framework to integrate, navigate, and analyze drug discovery data based on formalized and standardized classifications and annotations of druggable protein targets, the Drug Target Ontology (DTO). DTO was constructed by extensive curation and consolidation of various resources. DTO classifies the four major drug target protein families, GPCRs, kinases, ion channels and nuclear receptors, based on phylogenecity, function, target development level, disease association, tissue expression, chemical ligand and substrate characteristics, and target-family specific characteristics. The formal ontology was built using a new software tool to auto-generate most axioms from a database while supporting manual knowledge acquisition. A modular, hierarchical implementation facilitate ontology development and maintenance and makes use of various external ontologies, thus integrating the DTO into the ecosystem of biomedical ontologies. As a formal OWL-DL ontology, DTO contains asserted and inferred axioms. Modeling data from the Library of Integrated Network-based Cellular Signatures (LINCS) program illustrates the potential of DTO for contextual data integration and nuanced definition of important drug target characteristics. DTO has been implemented in the IDG user interface Portal, Pharos and the TIN-X explorer of protein target disease relationships. DTO was built based on the need for a formal semantic model for druggable targets including various related information such as protein, gene, protein domain, protein structure, binding site, small molecule drug, mechanism of action, protein tissue localization, disease association, and many other types of information. DTO will further facilitate the otherwise challenging integration and formal linking to biological assays, phenotypes, disease models, drug poly-pharmacology, binding kinetics and many other processes, functions and qualities that are at the core of drug discovery. The first version of DTO is publically available via the website http://drugtargetontology.org/ , Github ( http://github.com/DrugTargetOntology/DTO ), and the NCBO Bioportal ( http://bioportal.bioontology.org/ontologies/DTO ). The long-term goal of DTO is to provide such an integrative framework and to populate the ontology with this information as a community resource.
Nanda, P
1999-08-01
Within the overall aim of poverty alleviation, development efforts have included credit and self-employment programmes. In Bangladesh, the major beneficiaries of such group-based credit programmes are rural women who use the loans to initiate small informal income-generating activities. This paper explores the benefits of women's participation in credit programmes on their own health seeking. Using data from a sample of 1798 households from rural Bangladesh, conducted in 1991-1992 through repeated random sampling of 87 districts covered by Grameen Bank, Bangladesh Rural Advancement Committee (BRAC) and Bangladesh Rural Development Board (BRDB), this paper addresses the question: does women's participation in credit programmes significantly affect their use of formal health care? A non-unitary household preference model is suggested to test the hypothesis that women's empowerment through participation in these programmes results in greater control of resources for their own demand for formal health care. The analysis controls for endogeneity due to self-selection and other unobserved village level factors through the use of a weighted two stage instrumental variable approach with village level fixed effects. The findings indicate a positive impact of women's participation in credit programmes on their demand for formal health care. The policy simulations on the results of this study highlight the importance of credit programmes as a health intervention in addition to being a mechanism for women's economic empowerment.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges; which include, defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) opposed to being contained in a single UAV (monolithic). The case study based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
NASA Astrophysics Data System (ADS)
Attia, Moez; Gueddana, Amor; Chatta, Rihab; Morand, Alain
2013-09-01
The work presented in this paper develops a new formalism to design microdisks and microgears structures. The main objective is to study the optics and geometrics parameters influence on the microdisks and microgears structures resonance behavior. This study is conducted to choice a resonance structure with height quality factor Q to be associated with Quantum dot to form a single photon source. This new method aims to design resonant structures that are simpler and requires less computing performances than FDTD and Floquet Block methods. This formalism is based on simplifying Fourier transformed and using toeplitz matrix writing. This new writing allows designing all kind of resonance structures with any defect and any modification. In other study we have design a quantum dot emitting a photon at 1550 nm of the fundamental mode, but the quantum dot emits other photons at other wavelengths. The focus of the resonant structure and the quantum dot association is the resonance of the photon at 1550 nm and the elimination of all other photons with others energies. The quantum dot studied in [1] is an InAs/GaAs quantum dot, we design an GaAS microdisk and microgear and we compare the quality factor Q of this two structures and we conclude that the microgear is more appropriated to be associate to the quantum dot and increase the probability P1 to obtain a single photon source at 1550 nm and promotes the obtaining of single photon. The performance improving of the resonant structure is able to increase the success of quantum applications such as quantum gates based on single photon source.
Formal Analysis of Key Integrity in PKCS#11
NASA Astrophysics Data System (ADS)
Falcone, Andrea; Focardi, Riccardo
PKCS#11 is a standard API to cryptographic devices such as smarcards, hardware security modules and usb crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason on this kind of attacks. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using attacker's own keys, while performing sensitive operations. We further enrich the model with a trusted key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above mentioned key-replacement attacks.
Information system modeling for biomedical imaging applications
NASA Astrophysics Data System (ADS)
Hoo, Kent S., Jr.; Wong, Stephen T. C.
1999-07-01
Information system modeling has historically been relegated to a low priority among the designers of information systems. Often times, there is a rush to design and implement hardware and software solutions after only the briefest assessments of the domain requirements. Although this process results in a rapid development cycle, the system usually does not satisfy the needs of the users and the developers are forced to re-program certain aspects of the system. It would be much better to create an accurate model of the system based on the domain needs so that the implementation of the solution satisfies the needs of the users immediately. It would also be advantageous to build extensibility into the model so that updates to the system could be carried out in an organized fashion. The significance of this research is the development of a new formal framework for the construction of a multimedia medical information system. This formal framework is constructed using visual modeling which provides a way of thinking about problems using models organized around real- world ideas. These models provide an abstract way to view complex problems, making them easier for one to understand. The formal framework is the result of an object-oriented analysis and design process that translates the systems requirements and functionality into software models. The usefulness of this information framework is demonstrated with two different applications in epilepsy research and care, i.e., surgical planning of epilepsy and decision threshold determination.
Nuclear physics: Macroscopic aspects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiatecki, W.J.
1993-12-01
A systematic macroscopic, leptodermous approach to nuclear statics and dynamics is described, based formally on the assumptions {h_bar} {yields} 0 and b/R << 1, where b is the surface diffuseness and R the nuclear radius. The resulting static model of shell-corrected nuclear binding energies and deformabilities is accurate to better than 1 part in a thousand and yields a firm determination of the principal properties of the nuclear fluid. As regards dynamics, the above approach suggests that nuclear shape evolutions will often be dominated by dissipation, but quantitative comparisons with experimental data are more difficult than in the case ofmore » statics. In its simplest liquid drop version the model exhibits interesting formal connections to the classic astronomical problem of rotating gravitating masses.« less
Modeling of tool path for the CNC sheet cutting machines
NASA Astrophysics Data System (ADS)
Petunin, Aleksandr A.
2015-11-01
In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. The classification of the cutting techniques is offered. We also propose a new classification of toll path problems. The tasks of cost minimization and time minimization for standard cutting technique (Continuous Cutting Problem, CCP) and for one of non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as discrete optimization problem (generalized travel salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. For the solution GTSP we offer to use mathematical model of Prof. Chentsov based on concept of a megalopolis and dynamic programming.
A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)
NASA Astrophysics Data System (ADS)
High, Wayne
1993-03-01
This thesis focuses upon a new method for verifying the correct operation of a complex, high speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction and increased dependence on sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by application of conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machine model which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machine model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard which results in the derivation of test sequences for a SAFENET profile. The results applying this new method to SAFENET's OSI and Lightweight profiles are presented.
On the formalization and reuse of scientific research.
King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N
2011-10-07
The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.
Formality of the Chinese collective leadership.
Li, Haiying; Graesser, Arthur C
2016-09-01
We investigated the linguistic patterns in the discourse of four generations of the collective leadership of the Communist Party of China (CPC) from 1921 to 2012. The texts of Mao Zedong, Deng Xiaoping, Jiang Zemin, and Hu Jintao were analyzed using computational linguistic techniques (a Chinese formality score) to explore the persuasive linguistic features of the leaders in the contexts of power phase, the nation's education level, power duration, and age. The study was guided by the elaboration likelihood model of persuasion, which includes a central route (represented by formal discourse) versus a peripheral route (represented by informal discourse) to persuasion. The results revealed that these leaders adopted the formal, central route more when they were in power than before they came into power. The nation's education level was a significant factor in the leaders' adoption of the persuasion strategy. The leaders' formality also decreased with their increasing age and in-power times. However, the predictability of these factors for formality had subtle differences among the different types of leaders. These results enhance our understanding of the Chinese collective leadership and the role of formality in politically persuasive messages.
An Introduction to the BFS Method and Its Use to Model Binary NiAl Alloys
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, J.; Amador, C.
1998-01-01
We introduce the Bozzolo-Ferrante-Smith (BFS) method for alloys as a computationally efficient tool for aiding in the process of alloy design. An intuitive description of the BFS method is provided, followed by a formal discussion of its implementation. The method is applied to the study of the defect structure of NiAl binary alloys. The groundwork is laid for a detailed progression to higher order NiAl-based alloys linking theoretical calculations and computer simulations based on the BFS method and experimental work validating each step of the alloy design process.
A Harris-Todaro Agent-Based Model to Rural-Urban Migration
NASA Astrophysics Data System (ADS)
Espíndola, Aquino L.; Silveira, Jaylson J.; Penna, T. J. P.
2006-09-01
The Harris-Todaro model of the rural-urban migration process is revisited under an agent-based approach. The migration of the workers is interpreted as a process of social learning by imitation, formalized by a computational model. By simulating this model, we observe a transitional dynamics with continuous growth of the urban fraction of overall population toward an equilibrium. Such an equilibrium is characterized by stabilization of rural-urban expected wages differential (generalized Harris-Todaro equilibrium condition), urban concentration and urban unemployment. These classic results obtained originally by Harris and Todaro are emergent properties of our model.
Theoretical Studies of Spectroscopic Line Mixing in Remote Sensing Applications
NASA Technical Reports Server (NTRS)
Ma, Q.; Boulet, C.; Tipping, R. H.
2015-01-01
The phenomenon of collisional transfer of intensity due to line mixing has an increasing importance for atmospheric monitoring. From a theoretical point of view, all relevant information about the collisional processes is contained in the relaxation matrix where the diagonal elements give half-widths and shifts, and the off-diagonal elements correspond to line interferences. For simple systems such as those consisting of diatom-atom or diatom-diatom, accurate fully quantum calculations based on interaction potentials are feasible. However, fully quantum calculations become unrealistic for more complex systems. On the other hand, the semi-classical Robert-Bonamy (RB) formalism, which has been widely used to calculate half-widths and shifts for decades, fails in calculating the off-diagonal matrix elements. As a result, in order to simulate atmospheric spectra where the effects from line mixing are important, semi-empirical fitting or scaling laws such as the ECS (Energy-Corrected Sudden) and IOS (Infinite-Order Sudden) models are commonly used. Recently, while scrutinizing the development of the RB formalism, we have found that these authors applied the isolated line approximation in their evaluating matrix elements of the Liouville scattering operator given in exponential form. Since the criterion of this assumption is so stringent, it is not valid for many systems of interest in atmospheric applications. Furthermore, it is this assumption that blocks the possibility to calculate the whole relaxation matrix at all. By eliminating this unjustified application, and accurately evaluating matrix elements of the exponential operators, we have developed a more capable formalism. With this new formalism, we are now able not only to reduce uncertainties for calculated half-widths and shifts, but also to remove a once insurmountable obstacle to calculate the whole relaxation matrix. This implies that we can address the line mixing with the semi-classical theory based on interaction potentials between molecular absorber and molecular perturber. We have applied this formalism to address the line mixing for Raman and infrared spectra of molecules such as N2, C2H2, CO2, NH3, and H2O. By carrying out rigorous calculations, our calculated relaxation matrices are in good agreement with both experimental data and results derived from the ECS model.
Le Deunff, Erwan; Malagoli, Philippe
2014-01-01
Background The top-down analysis of nitrate influx isotherms through the Enzyme-Substrate interpretation has not withstood recent molecular and histochemical analyses of nitrate transporters. Indeed, at least four families of nitrate transporters operating at both high and/or low external nitrate concentrations, and which are located in series and/or parallel in the different cellular layers of the mature root, are involved in nitrate uptake. Accordingly, the top-down analysis of the root catalytic structure for ion transport from the Enzyme-Substrate interpretation of nitrate influx isotherms is inadequate. Moreover, the use of the Enzyme-Substrate velocity equation as a single reference in agronomic models is not suitable in its formalism to account for variations in N uptake under fluctuating environmental conditions. Therefore, a conceptual paradigm shift is required to improve the mechanistic modelling of N uptake in agronomic models. Scope An alternative formalism, the Flow-Force theory, was proposed in the 1970s to describe ion isotherms based upon biophysical ‘flows and forces’ relationships of non-equilibrium thermodynamics. This interpretation describes, with macroscopic parameters, the patterns of N uptake provided by a biological system such as roots. In contrast to the Enzyme-Substrate interpretation, this approach does not claim to represent molecular characteristics. Here it is shown that it is possible to combine the Flow-Force formalism with polynomial responses of nitrate influx rate induced by climatic and in planta factors in relation to nitrate availability. Conclusions Application of the Flow-Force formalism allows nitrate uptake to be modelled in a more realistic manner, and allows scaling-up in time and space of the regulation of nitrate uptake across the plant growth cycle. PMID:25425406
Liu, Hao; Zhu, Lili; Bai, Shuming; Shi, Qiang
2014-04-07
We investigated applications of the hierarchical equation of motion (HEOM) method to perform high order perturbation calculations of reduced quantum dynamics for a harmonic bath with arbitrary spectral densities. Three different schemes are used to decompose the bath spectral density into analytical forms that are suitable to the HEOM treatment: (1) The multiple Lorentzian mode model that can be obtained by numerically fitting the model spectral density. (2) The combined Debye and oscillatory Debye modes model that can be constructed by fitting the corresponding classical bath correlation function. (3) A new method that uses undamped harmonic oscillator modes explicitly in the HEOM formalism. Methods to extract system-bath correlations were investigated for the above bath decomposition schemes. We also show that HEOM in the undamped harmonic oscillator modes can give detailed information on the partial Wigner transform of the total density operator. Theoretical analysis and numerical simulations of the spin-Boson dynamics and the absorption line shape of molecular dimers show that the HEOM formalism for high order perturbations can serve as an important tool in studying the quantum dissipative dynamics in the intermediate coupling regime.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Hao; Zhu, Lili; Bai, Shuming
2014-04-07
We investigated applications of the hierarchical equation of motion (HEOM) method to perform high order perturbation calculations of reduced quantum dynamics for a harmonic bath with arbitrary spectral densities. Three different schemes are used to decompose the bath spectral density into analytical forms that are suitable to the HEOM treatment: (1) The multiple Lorentzian mode model that can be obtained by numerically fitting the model spectral density. (2) The combined Debye and oscillatory Debye modes model that can be constructed by fitting the corresponding classical bath correlation function. (3) A new method that uses undamped harmonic oscillator modes explicitly inmore » the HEOM formalism. Methods to extract system-bath correlations were investigated for the above bath decomposition schemes. We also show that HEOM in the undamped harmonic oscillator modes can give detailed information on the partial Wigner transform of the total density operator. Theoretical analysis and numerical simulations of the spin-Boson dynamics and the absorption line shape of molecular dimers show that the HEOM formalism for high order perturbations can serve as an important tool in studying the quantum dissipative dynamics in the intermediate coupling regime.« less
Comprehensive Aspectual UML approach to support AspectJ.
Magableh, Aws; Shukur, Zarina; Ali, Noorazean Mohd
2014-01-01
Unified Modeling Language is the most popular and widely used Object-Oriented modelling language in the IT industry. This study focuses on investigating the ability to expand UML to some extent to model crosscutting concerns (Aspects) to support AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available Aspect-Oriented UML modelling approaches and find that the existing Aspect-Oriented Design Modelling approaches using UML cannot be considered to provide a framework for a comprehensive Aspectual UML modelling approach and also that there is a lack of adequate Aspect-Oriented tool support. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a "good design" criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioral diagrams, which are designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs.
Comprehensive Aspectual UML Approach to Support AspectJ
Magableh, Aws; Shukur, Zarina; Mohd. Ali, Noorazean
2014-01-01
Unified Modeling Language is the most popular and widely used Object-Oriented modelling language in the IT industry. This study focuses on investigating the ability to expand UML to some extent to model crosscutting concerns (Aspects) to support AspectJ. Through a comprehensive literature review, we identify and extensively examine all the available Aspect-Oriented UML modelling approaches and find that the existing Aspect-Oriented Design Modelling approaches using UML cannot be considered to provide a framework for a comprehensive Aspectual UML modelling approach and also that there is a lack of adequate Aspect-Oriented tool support. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a “good design” criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioral diagrams, which are designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs. PMID:25136656
A UML-based metamodel for software evolution process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing
2014-04-01
A software evolution process is a set of interrelated software processes under which the corresponding software is evolving. An object-oriented software evolution process meta-model (OO-EPMM), abstract syntax and formal OCL constraint of meta-model are presented in this paper. OO-EPMM can not only represent software development process, but also represent software evolution.
Lynx conservation in an ecosystem management context [Chapter 15
Kevin S. McKelvey; Keith B. Aubry; James K. Agee; Steven W. Buskirk; Leonard F. Ruggiero; Gary M. Koehler
2000-01-01
In an ecosystem management context, management for lynx must occur in the context of the needs of other species, watershed health, and a variety of products, outputs, and uses. This chapter presents a management model based on the restoration of historical patterns and processes. We argue that this model is sustainable in a formal sense, practical, and likely...
Integrated Formal Analysis of Timed-Triggered Ethernet
NASA Technical Reports Server (NTRS)
Dutertre, Bruno; Shankar, Nstarajan; Owre, Sam
2012-01-01
We present new results related to the verification of the Timed-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.
Predictor-Based Model Reference Adaptive Control
NASA Technical Reports Server (NTRS)
Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.
2009-01-01
This paper is devoted to robust, Predictor-based Model Reference Adaptive Control (PMRAC) design. The proposed adaptive system is compared with the now-classical Model Reference Adaptive Control (MRAC) architecture. Simulation examples are presented. Numerical evidence indicates that the proposed PMRAC tracking architecture has better than MRAC transient characteristics. In this paper, we presented a state-predictor based direct adaptive tracking design methodology for multi-input dynamical systems, with partially known dynamics. Efficiency of the design was demonstrated using short period dynamics of an aircraft. Formal proof of the reported PMRAC benefits constitute future research and will be reported elsewhere.
NASA Astrophysics Data System (ADS)
Hsieh, Chang-Yu; Cao, Jianshu
2018-01-01
We extend a standard stochastic theory to study open quantum systems coupled to a generic quantum environment. We exemplify the general framework by studying a two-level quantum system coupled bilinearly to the three fundamental classes of non-interacting particles: bosons, fermions, and spins. In this unified stochastic approach, the generalized stochastic Liouville equation (SLE) formally captures the exact quantum dissipations when noise variables with appropriate statistics for different bath models are applied. Anharmonic effects of a non-Gaussian bath are precisely encoded in the bath multi-time correlation functions that noise variables have to satisfy. Starting from the SLE, we devise a family of generalized hierarchical equations by averaging out the noise variables and expand bath multi-time correlation functions in a complete basis of orthonormal functions. The general hierarchical equations constitute systems of linear equations that provide numerically exact simulations of quantum dynamics. For bosonic bath models, our general hierarchical equation of motion reduces exactly to an extended version of hierarchical equation of motion which allows efficient simulation for arbitrary spectral densities and temperature regimes. Similar efficiency and flexibility can be achieved for the fermionic bath models within our formalism. The spin bath models can be simulated with two complementary approaches in the present formalism. (I) They can be viewed as an example of non-Gaussian bath models and be directly handled with the general hierarchical equation approach given their multi-time correlation functions. (II) Alternatively, each bath spin can be first mapped onto a pair of fermions and be treated as fermionic environments within the present formalism.
A Computational Model of Reasoning from the Clinical Literature
Rennels, Glenn D.
1986-01-01
This paper explores the premise that a formalized representation of empirical studies can play a central role in computer-based decision support. The specific motivations underlying this research include the following propositions: 1. Reasoning from experimental evidence contained in the clinical literature is central to the decisions physicians make in patient care. 2. A computational model, based upon a declarative representation for published reports of clinical studies, can drive a computer program that selectively tailors knowledge of the clinical literature as it is applied to a particular case. 3. The development of such a computational model is an important first step toward filling a void in computer-based decision support systems. Furthermore, the model may help us better understand the general principles of reasoning from experimental evidence both in medicine and other domains. Roundsman is a developmental computer system which draws upon structured representations of the clinical literature in order to critique plans for the management of primary breast cancer. Roundsman is able to produce patient-specific analyses of breast cancer management options based on the 24 clinical studies currently encoded in its knowledge base. The Roundsman system is a first step in exploring how the computer can help to bring a critical analysis of the relevant literature to the physician, structured around a particular patient and treatment decision.
Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.
Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie
2016-06-01
Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noisefree data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.
Hudson, Jessica L; Bell, Jennifer M; Crabtree, Traves D; Kreisel, Daniel; Patterson, G Alexander; Meyers, Bryan F; Puri, Varun
2018-01-01
Formal pulmonary function testing with laboratory spirometry (LS) is the standard of care for risk stratification before lung resection. LS and handheld office spirometry (OS) are clinically comparable for forced expiratory volume in 1 second and forced vital capacity. We investigated the safety of preoperative risk stratification based solely on OS. Patients at low-risk for cardiopulmonary complications were enrolled in a single-center prospective study and underwent preoperative OS. Formal LS was not performed when forced expiratory volume in 1 second was more than 60% by OS. Propensity score matching was used to compare patients in the OS group to low-risk institutional database patients (2008 to 2015) who underwent LS and lung resection. Standardized mean differences determined model covariate balance. The McNemar test and log-rank test were performed, respectively, for categorical and continuous paired outcome data. There were 66 prospectively enrolled patients who received OS and underwent pulmonary resection, and 1,290 patients received preoperative LS, resulting in 52 propensity score-matched pairs (83%). There were no deaths and two 30-day readmissions per group. The major morbidity risk was similar in each group (7.7%). All analyses of discordant pair morbidity had p exceeding 0.56. There was no association between length of stay and exposure to OS vs LS (p = 0.31). The estimated annual institutional cost savings from performing OS only and avoiding LS was $38,000. Low-risk patients undergoing lung resection can be adequately and safely assessed using OS without formal LS, with significant cost savings. With upcoming bundled care reimbursement paradigms, such safe and effective strategies are likely to be more widely used. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Transforming Undergraduate Education Through the use of Analytical Reasoning (TUETAR)
NASA Astrophysics Data System (ADS)
Bishop, M. P.; Houser, C.; Lemmons, K.
2015-12-01
Traditional learning limits the potential for self-discovery, and the use of data and knowledge to understand Earth system relationships, processes, feedback mechanisms and system coupling. It is extremely difficult for undergraduate students to analyze, synthesize, and integrate quantitative information related to complex systems, as many concepts may not be mathematically tractable or yet to be formalized. Conceptual models have long served as a means for Earth scientists to organize their understanding of Earth's dynamics, and have served as a basis for human analytical reasoning and landscape interpretation. Consequently, we evaluated the use of conceptual modeling, knowledge representation and analytical reasoning to provide undergraduate students with an opportunity to develop and test geocomputational conceptual models based upon their understanding of Earth science concepts. This study describes the use of geospatial technologies and fuzzy cognitive maps to predict desertification across the South-Texas Sandsheet in an upper-level geomorphology course. Students developed conceptual models based on their understanding of aeolian processes from lectures, and then compared and evaluated their modeling results against an expert conceptual model and spatial predictions, and the observed distribution of dune activity in 2010. Students perceived that the analytical reasoning approach was significantly better for understanding desertification compared to traditional lecture, and promoted reflective learning, working with data, teamwork, student interaction, innovation, and creative thinking. Student evaluations support the notion that the adoption of knowledge representation and analytical reasoning in the classroom has the potential to transform undergraduate education by enabling students to formalize and test their conceptual understanding of Earth science. A model for developing and utilizing this geospatial technology approach in Earth science is presented.
ERIC Educational Resources Information Center
Chen, Huey T.; Turner, Nannette C.
2012-01-01
Health promotion and social betterment program interventions are based on either formal theory from academia or stakeholder theory from stakeholders' observations and experiences in working with clients. Over time, formal theory-based interventions have acquired high prestige, while stakeholder theory-based interventions have been held in low…
Statistical mechanics of the Huxley-Simmons model
NASA Astrophysics Data System (ADS)
Caruel, M.; Truskinovsky, L.
2016-06-01
The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from muscles power stroke and hair cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duchemin, Ivan, E-mail: ivan.duchemin@cea.fr; Jacquemin, Denis; Institut Universitaire de France, 1 rue Descartes, 75005 Paris Cedex 5
We have implemented the polarizable continuum model within the framework of the many-body Green’s function GW formalism for the calculation of electron addition and removal energies in solution. The present formalism includes both ground-state and non-equilibrium polarization effects. In addition, the polarization energies are state-specific, allowing to obtain the bath-induced renormalisation energy of all occupied and virtual energy levels. Our implementation is validated by comparisons with ΔSCF calculations performed at both the density functional theory and coupled-cluster single and double levels for solvated nucleobases. The present study opens the way to GW and Bethe-Salpeter calculations in disordered condensed phases ofmore » interest in organic optoelectronics, wet chemistry, and biology.« less
Masoudiyekta, Leila; Rezaei-Bayatiyani, Hojat; Dashtbozorgi, Bahman; Gheibizadeh, Mahin; Malehi, Amal Saki; Moradi, Mehrnaz
2018-01-01
The purpose of this study was to determine the effect of education based on health belief model (HBM) on the behavior of breast cancer screening (bCS) in women. This quasi-experimental study was conducted on 226 women who were selected with cluster sampling method from those referred to Dezful health centers. Data collection tool was a researcher-made questionnaire. Demographic questionnaire bCS- scale, and the Knowledge about questionnaire, all given before and 3 months after the intervention. According to the findings of the study, there was a significant relationship between women's performance and variables of knowledge, perceived sensitivity, perceived benefits, perceived barriers, self-efficacy, and cues to action. Poor knowledge of women indicates a crucial need for formal educational programs to sensitize women regarding the importance of bCS. These educational programs should consider factors affecting bCS behaviors.
ERIC Educational Resources Information Center
Arroyo, Rosa
2012-01-01
This work includes three models of retention for higher education. The study was performed at a private university, with 144 students of second semester second year. The theme is formal and informal relationships between students and their professors. The statics analysis was performed with SPSS 19 and STAT 12. [The dissertation citations…
Morphological computation and morphological control: steps toward a formal theory and applications.
Füchslin, Rudolf M; Dzyakanchuk, Andrej; Flumini, Dandolo; Hauser, Helmut; Hunt, Kenneth J; Luchsinger, Rolf H; Reller, Benedikt; Scheidegger, Stephan; Walker, Richard
2013-01-01
Morphological computation can be loosely defined as the exploitation of the shape, material properties, and physical dynamics of a physical system to improve the efficiency of a computation. Morphological control is the application of morphological computing to a control task. In its theoretical part, this article sharpens and extends these definitions by suggesting new formalized definitions and identifying areas in which the definitions we propose are still inadequate. We go on to describe three ongoing studies, in which we are applying morphological control to problems in medicine and in chemistry. The first involves an inflatable support system for patients with impaired movement, and is based on macroscopic physics and concepts already tested in robotics. The two other case studies (self-assembly of chemical microreactors; models of induced cell repair in radio-oncology) describe processes and devices on the micrometer scale, in which the emergent dynamics of the underlying physical system (e.g., phase transitions) are dominated by stochastic processes such as diffusion.
NASA Astrophysics Data System (ADS)
Haili, Hasnawati; Maknun, Johar; Siahaan, Parsaoran
2017-08-01
Physics is a lessons that related to students' daily experience. Therefore, before the students studying in class formally, actually they have already have a visualization and prior knowledge about natural phenomenon and could wide it themselves. The learning process in class should be aimed to detect, process, construct, and use students' mental model. So, students' mental model agree with and builds in the right concept. The previous study held in MAN 1 Muna informs that in learning process the teacher did not pay attention students' mental model. As a consequence, the learning process has not tried to build students' mental modelling ability (MMA). The purpose of this study is to describe the improvement of students' MMA as a effect of problem solving based learning model with multiple representations approach. This study is pre experimental design with one group pre post. It is conducted in XI IPA MAN 1 Muna 2016/2017. Data collection uses problem solving test concept the kinetic theory of gasses and interview to get students' MMA. The result of this study is clarification students' MMA which is categorized in 3 category; High Mental Modelling Ability (H-MMA) for 7
NASA Astrophysics Data System (ADS)
Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa
We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language \\cal{C}. The use of \\cal{C} allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.
Xu, Haiyang; Wang, Ping
2016-01-01
In order to verify the real-time reliability of unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verification of time property. Combining with the advantages of MARTE, this framework uses class diagram to create the static model of software system, and utilizes state chart to create the dynamic model. In term of the defined transformation rules, the MARTE model could be transformed to formal integrated model, and the different part of the model could also be verified by using existing formal tools. For the real-time specifications of software system, we also proposed a generating algorithm for temporal logic formula, which could automatically extract real-time property from time-sensitive live sequence chart (TLSC). Finally, we modeled the simplified flight control system of UAV to check its real-time property. The results showed that the framework could be used to create the system model, as well as precisely analyze and verify the real-time reliability of UAV flight control system.
Xu, Haiyang; Wang, Ping
2016-01-01
In order to verify the real-time reliability of unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verification of time property. Combining with the advantages of MARTE, this framework uses class diagram to create the static model of software system, and utilizes state chart to create the dynamic model. In term of the defined transformation rules, the MARTE model could be transformed to formal integrated model, and the different part of the model could also be verified by using existing formal tools. For the real-time specifications of software system, we also proposed a generating algorithm for temporal logic formula, which could automatically extract real-time property from time-sensitive live sequence chart (TLSC). Finally, we modeled the simplified flight control system of UAV to check its real-time property. The results showed that the framework could be used to create the system model, as well as precisely analyze and verify the real-time reliability of UAV flight control system. PMID:27918594
SHER: a colored petri net based random mobility model for wireless communications.
Khan, Naeem Akhtar; Ahmad, Farooq; Khan, Sher Afzal
2015-01-01
In wireless network research, simulation is the most imperative technique to investigate the network's behavior and validation. Wireless networks typically consist of mobile hosts; therefore, the degree of validation is influenced by the underlying mobility model, and synthetic models are implemented in simulators because real life traces are not widely available. In wireless communications, mobility is an integral part while the key role of a mobility model is to mimic the real life traveling patterns to study. The performance of routing protocols and mobility management strategies e.g. paging, registration and handoff is highly dependent to the selected mobility model. In this paper, we devise and evaluate the Show Home and Exclusive Regions (SHER), a novel two-dimensional (2-D) Colored Petri net (CPN) based formal random mobility model, which exhibits sociological behavior of a user. The model captures hotspots where a user frequently visits and spends time. Our solution eliminates six key issues of the random mobility models, i.e., sudden stops, memoryless movements, border effect, temporal dependency of velocity, pause time dependency, and speed decay in a single model. The proposed model is able to predict the future location of a mobile user and ultimately improves the performance of wireless communication networks. The model follows a uniform nodal distribution and is a mini simulator, which exhibits interesting mobility patterns. The model is also helpful to those who are not familiar with the formal modeling, and users can extract meaningful information with a single mouse-click. It is noteworthy that capturing dynamic mobility patterns through CPN is the most challenging and virulent activity of the presented research. Statistical and reachability analysis techniques are presented to elucidate and validate the performance of our proposed mobility model. The state space methods allow us to algorithmically derive the system behavior and rectify the errors of our proposed model.
SHER: A Colored Petri Net Based Random Mobility Model for Wireless Communications
Khan, Naeem Akhtar; Ahmad, Farooq; Khan, Sher Afzal
2015-01-01
In wireless network research, simulation is the most imperative technique to investigate the network’s behavior and validation. Wireless networks typically consist of mobile hosts; therefore, the degree of validation is influenced by the underlying mobility model, and synthetic models are implemented in simulators because real life traces are not widely available. In wireless communications, mobility is an integral part while the key role of a mobility model is to mimic the real life traveling patterns to study. The performance of routing protocols and mobility management strategies e.g. paging, registration and handoff is highly dependent to the selected mobility model. In this paper, we devise and evaluate the Show Home and Exclusive Regions (SHER), a novel two-dimensional (2-D) Colored Petri net (CPN) based formal random mobility model, which exhibits sociological behavior of a user. The model captures hotspots where a user frequently visits and spends time. Our solution eliminates six key issues of the random mobility models, i.e., sudden stops, memoryless movements, border effect, temporal dependency of velocity, pause time dependency, and speed decay in a single model. The proposed model is able to predict the future location of a mobile user and ultimately improves the performance of wireless communication networks. The model follows a uniform nodal distribution and is a mini simulator, which exhibits interesting mobility patterns. The model is also helpful to those who are not familiar with the formal modeling, and users can extract meaningful information with a single mouse-click. It is noteworthy that capturing dynamic mobility patterns through CPN is the most challenging and virulent activity of the presented research. Statistical and reachability analysis techniques are presented to elucidate and validate the performance of our proposed mobility model. The state space methods allow us to algorithmically derive the system behavior and rectify the errors of our proposed model. PMID:26267860
NASA Astrophysics Data System (ADS)
Abidi, Dhafer
TTEthernet is a deterministic network technology that makes enhancements to Layer 2 Quality-of-Service (QoS) for Ethernet. The components that implement its services enrich the Ethernet functionality with distributed fault-tolerant synchronization, robust temporal partitioning bandwidth and synchronous communication with fixed latency and low jitter. TTEthernet services can facilitate the design of scalable, robust, less complex distributed systems and architectures tolerant to faults. Simulation is nowadays an essential step in critical systems design process and represents a valuable support for validation and performance evaluation. CoRE4INET is a project bringing together all TTEthernet simulation models currently available. It is based on the extension of models of OMNeT ++ INET framework. Our objective is to study and simulate the TTEthernet protocol on a flight management subsystem (FMS). The idea is to use CoRE4INET to design the simulation model of the target system. The problem is that CoRE4INET does not offer a task scheduling tool for TTEthernet network. To overcome this problem we propose an adaptation for simulation purposes of a task scheduling approach based on formal specification of network constraints. The use of Yices solver allowed the translation of the formal specification into an executable program to generate the desired transmission plan. A case study allowed us at the end to assess the impact of the arrangement of Time-Triggered frames offsets on the performance of each type of the system traffic.
Collaborative Drug Therapy Management: Case Studies of Three Community-Based Models of Care
Snyder, Margie E.; Earl, Tara R.; Greenberg, Michael; Heisler, Holly; Revels, Michelle; Matson-Koffman, Dyann
2015-01-01
Collaborative drug therapy management agreements are a strategy for expanding the role of pharmacists in team-based care with other providers. However, these agreements have not been widely implemented. This study describes the features of existing provider–pharmacist collaborative drug therapy management practices and identifies the facilitators and barriers to implementing such services in community settings. We conducted in-depth, qualitative interviews in 2012 in a federally qualified health center, an independent pharmacy, and a retail pharmacy chain. Facilitators included 1) ensuring pharmacists were adequately trained; 2) obtaining stakeholder (eg, physician) buy-in; and 3) leveraging academic partners. Barriers included 1) lack of pharmacist compensation; 2) hesitation among providers to trust pharmacists; 3) lack of time and resources; and 4) existing informal collaborations that resulted in reduced interest in formal agreements. The models described in this study could be used to strengthen clinical–community linkages through team-based care, particularly for chronic disease prevention and management. PMID:25811494
Georg, Georg; Séroussi, Brigitte; Bouaud, Jacques
2003-01-01
The aim of this work was to determine whether the GEM-encoding step could improve the representation of clinical practice guidelines as formalized knowledge bases. We used the 1999 Canadian recommendations for the management of hypertension, chosen as the knowledge source in the ASTI project. We first clarified semantic ambiguities of therapeutic sequences recommended in the guideline by proposing an interpretative framework of therapeutic strategies. Then, after a formalization step to standardize the terms used to characterize clinical situations, we created the GEM-encoded instance of the guideline. We developed a module for the automatic derivation of a rule base, BR-GEM, from the instance. BR-GEM was then compared to the rule base, BR-ASTI, embedded within the critic mode of ASTI, and manually built by two physicians from the same Canadian guideline. As compared to BR-ASTI, BR-GEM is more specific and covers more clinical situations. When evaluated on 10 patient cases, the GEM-based approach led to promising results.
Gas solubility in dilute solutions: A novel molecular thermodynamic perspective
NASA Astrophysics Data System (ADS)
Chialvo, Ariel A.
2018-05-01
We present an explicit molecular-based interpretation of the thermodynamic phase equilibrium underlying gas solubility in liquids, through rigorous links between the microstructure of the dilute systems and the relevant macroscopic quantities that characterize their solution thermodynamics. We apply the formal analysis to unravel and highlight the molecular-level nature of the approximations behind the widely used Krichevsky-Kasarnovsky [J. Am. Chem. Soc. 57, 2168 (1935)] and Krichevsky-Ilinskaya [Acta Physicochim. 20, 327 (1945)] equations for the modeling of gas solubility. Then, we implement a general molecular-based approach to gas solubility and illustrate it by studying Lennard-Jones binary systems whose microstructure and thermodynamic properties were consistently generated via integral equation calculations. Furthermore, guided by the molecular-based analysis, we propose a novel macroscopic modeling approach to gas solubility, emphasize some usually overlook modeling subtleties, and identify novel interdependences among relevant solubility quantities that can be used as either handy modeling constraints or tools for consistency tests.
Gas solubility in dilute solutions: A novel molecular thermodynamic perspective.
Chialvo, Ariel A
2018-05-07
We present an explicit molecular-based interpretation of the thermodynamic phase equilibrium underlying gas solubility in liquids, through rigorous links between the microstructure of the dilute systems and the relevant macroscopic quantities that characterize their solution thermodynamics. We apply the formal analysis to unravel and highlight the molecular-level nature of the approximations behind the widely used Krichevsky-Kasarnovsky [J. Am. Chem. Soc. 57, 2168 (1935)] and Krichevsky-Ilinskaya [Acta Physicochim. 20, 327 (1945)] equations for the modeling of gas solubility. Then, we implement a general molecular-based approach to gas solubility and illustrate it by studying Lennard-Jones binary systems whose microstructure and thermodynamic properties were consistently generated via integral equation calculations. Furthermore, guided by the molecular-based analysis, we propose a novel macroscopic modeling approach to gas solubility, emphasize some usually overlook modeling subtleties, and identify novel interdependences among relevant solubility quantities that can be used as either handy modeling constraints or tools for consistency tests.
Connecting Formal and Informal Learning Experiences
ERIC Educational Resources Information Center
O'Mahony, Timothy Kieran
2010-01-01
The learning study reports on part of a larger project being lead by the author. In this dissertation I explore one goal of this project--to understand effects on student learning outcomes as a function of using different methods for connecting out-of-school experiential learning with formal school-based instruction. There is a long history of…
The Factors That Influence an Entrepreneur's Decision to Seek Formal Education
ERIC Educational Resources Information Center
Griffith, Ralph
2013-01-01
The purpose of this study was to understand the factors that lead entrepreneurs to seek formal education. The evidence shows that entrepreneurs are playing an increasingly important role in producing economic growth in the United States. This researcher examined the theoretical foundations of entrepreneurship based on economic, psychological, and…