Sample records for formal process models

  1. Modeling formalisms in Systems Biology

    PubMed Central

    2011-01-01

    Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
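
    For readers new to one of the formalisms reviewed above, a minimal sketch of a synchronous Boolean network in Python follows; the three-gene circuit and its update rules are invented for illustration and are not taken from the article.

    # Synchronous Boolean network: each gene's next state is a Boolean
    # function of the current state vector (toy rules, for illustration).
    rules = {
        "A": lambda s: not s["C"],           # C represses A
        "B": lambda s: s["A"],               # A activates B
        "C": lambda s: s["A"] and s["B"],    # A and B jointly activate C
    }

    def step(state):
        return {gene: f(state) for gene, f in rules.items()}

    state = {"A": True, "B": False, "C": False}
    for t in range(6):                       # iterate the network forward
        print(t, state)
        state = step(state)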

  2. Reciprocal relations between cognitive neuroscience and formal cognitive models: opposites attract?

    PubMed

    Forstmann, Birte U; Wagenmakers, Eric-Jan; Eichele, Tom; Brown, Scott; Serences, John T

    2011-06-01

    Cognitive neuroscientists study how the brain implements particular cognitive processes such as perception, learning, and decision-making. Traditional approaches in which experiments are designed to target a specific cognitive process have been supplemented by two recent innovations. First, formal cognitive models can decompose observed behavioral data into multiple latent cognitive processes, allowing brain measurements to be associated with a particular cognitive process more precisely and more confidently. Second, cognitive neuroscience can provide additional data to inform the development of formal cognitive models, providing greater constraint than behavioral data alone. We argue that these fields are mutually dependent; not only can models guide neuroscientific endeavors, but understanding neural mechanisms can provide key insights into formal models of cognition. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes

    NASA Astrophysics Data System (ADS)

    Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria

In this paper we discuss the importance of ensuring that business processes are both robust and agile at the same time. To this end, we consider reviewing the way business processes are managed. For instance, we consider offering a flexible way to model processes so that changes in regulations are handled through some self-healing mechanisms. These changes may raise exceptions at run-time if not properly reflected in these processes. To this end we propose a new rule-based model that adopts ECA rules and is built upon formal tools. The business logic of a process can be summarized as a set of rules that implement an organization’s policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Post condition-post Event). This formalism allows translating a process into a graph of rules that is analyzed in terms of reliability and flexibility.
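
    To make the ECAPE idea concrete, here is one hypothetical way to encode such a rule in Python, with the event, condition, action, post-condition and post-event as explicit fields; the loan-approval process is invented and not taken from the paper.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class EcapeRule:                              # Event-Condition-Action-
        event: str                                # Post condition-post Event
        condition: Callable[[dict], bool]
        action: Callable[[dict], None]
        post_condition: Callable[[dict], bool]
        post_event: str

    def fire(rule, event, state):
        """Fire a rule; a violated post-condition signals that some
        self-healing reaction is needed."""
        if event != rule.event or not rule.condition(state):
            return None
        rule.action(state)
        if not rule.post_condition(state):
            raise RuntimeError("post-condition violated")
        return rule.post_event

    approve = EcapeRule(
        event="loan_requested",
        condition=lambda s: s["score"] >= 600,
        action=lambda s: s.update(status="approved"),
        post_condition=lambda s: s["status"] == "approved",
        post_event="loan_approved",
    )
    print(fire(approve, "loan_requested", {"score": 720, "status": "new"}))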

  4. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  5. Formal Specification of Information Systems Requirements.

    ERIC Educational Resources Information Center

    Kampfner, Roberto R.

    1985-01-01

    Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)

  6. Industry Strength Tool and Technology for Automated Synthesis of Safety-Critical Applications from Formal Specifications

    DTIC Science & Technology

    2015-11-01

Various modeling frameworks, such as I/O Automata, Kahn Process Networks, Petri nets, and multi-dimensional SDF, are also used for designing safety-critical applications. Among these, SDF is a graphical formalism ideally suited to modeling DSP applications, Petri nets are a graphical formalism used for modeling distributed systems, and I/O Automata provide a formal framework.

  7. Managing Risk in Mobile Applications with Formal Security Policies

    DTIC Science & Technology

    2013-04-01

Alternatively, Breaux and Powers (2009) found the Business Process Modeling Notation (BPMN), a declarative language for describing business processes, to be... the Business Process Execution Language (BPEL), preferred as the candidate formal semantics for BPMN, only works for limited classes of BPMN models.

  8. An object-oriented description method of EPMM process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Yang, Fan

    2017-06-01

In order to use mature object-oriented tools and languages in software process modeling, and to make software process models accord better with industrial standards, it is necessary to study the object-oriented modelling of software processes. Based on the formal process definition in EPMM, and considering that Petri nets are primarily a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method to convert EPMM models based on Petri nets into object models based on object-oriented description.

  9. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  10. A Model-Driven Approach to Teaching Concurrency

    ERIC Educational Resources Information Center

    Carro, Manuel; Herranz, Angel; Marino, Julio

    2013-01-01

    We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…

  11. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.
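
    A hedged sketch of the flavor of such a translation (not the paper's actual algorithm): each BPMN sequence flow becomes an event that moves a token between nodes, printed here in Event-B-like syntax; the two-task process and the naming scheme are invented.

    # Toy BPMN process given as (source, target) sequence flows.
    flows = [("start", "TaskA"), ("TaskA", "TaskB"), ("TaskB", "end")]

    def to_eventb(flows):
        """Emit one Event-B-like event per sequence flow: the event is
        enabled when the token sits on the source node and moves it on."""
        for src, tgt in flows:
            yield (f"Event flow_{src}_{tgt} =\n"
                   f"  where @grd1 token = {src}\n"
                   f"  then  @act1 token := {tgt}\n"
                   f"end")

    print("\n\n".join(to_eventb(flows)))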

  12. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  13. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  14. Genetic programming assisted stochastic optimization strategies for optimization of glucose to gluconic acid fermentation.

    PubMed

    Cheema, Jitender Jit Singh; Sankpal, Narendra V; Tambe, Sanjeev S; Kulkarni, Bhaskar D

    2002-01-01

This article presents two hybrid strategies for the modeling and optimization of the glucose to gluconic acid batch bioprocess. In the hybrid approaches, first a novel artificial intelligence formalism, namely, genetic programming (GP), is used to develop a process model solely from the historic process input-output data. In the next step, the input space of the GP-based model, representing process operating conditions, is optimized using two stochastic optimization (SO) formalisms, viz., genetic algorithms (GAs) and simultaneous perturbation stochastic approximation (SPSA). These SO formalisms possess certain unique advantages over the commonly used gradient-based optimization techniques. The principal advantage of the GP-GA and GP-SPSA hybrid techniques is that process modeling and optimization can be performed exclusively from the process input-output data without invoking the detailed knowledge of the process phenomenology. The GP-GA and GP-SPSA techniques have been employed for modeling and optimization of the glucose to gluconic acid bioprocess, and the optimized process operating conditions obtained thereby have been compared with those obtained using two other hybrid modeling-optimization paradigms integrating artificial neural networks (ANNs) and GA/SPSA formalisms. Finally, the overall optimized operating conditions given by the GP-GA method, when verified experimentally, resulted in a significant improvement in the gluconic acid yield. The hybrid strategies presented here are generic in nature and can be employed for modeling and optimization of a wide variety of batch and continuous bioprocesses.
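
    The division of labor between the GP-built model and the stochastic optimizer can be sketched as follows; the fixed quadratic surrogate below stands in for the GP-evolved process model (in the study that model is learned from plant input-output data), and the GA settings are invented.

    import random

    def surrogate(x):
        """Stand-in for the GP-derived model mapping operating
        conditions to yield; peaks at the toy optimum (0.3, 0.7)."""
        return -(x[0] - 0.3) ** 2 - (x[1] - 0.7) ** 2

    def ga_maximize(f, dim=2, pop=30, gens=60, sigma=0.1):
        """Tiny real-coded GA: tournament selection plus Gaussian
        mutation, with genes clamped to [0, 1]."""
        P = [[random.random() for _ in range(dim)] for _ in range(pop)]
        for _ in range(gens):
            children = []
            for _ in range(pop):
                a, b = random.sample(P, 2)
                parent = a if f(a) > f(b) else b
                children.append([min(1.0, max(0.0, g + random.gauss(0, sigma)))
                                 for g in parent])
            P = children
        return max(P, key=f)

    best = ga_maximize(surrogate)
    print("optimized operating conditions:", [round(g, 3) for g in best])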

  15. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  16. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  17. Guideline validation in multiple trauma care through business process modeling.

    PubMed

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version represented with a standardized meta-model is necessary for automatic processing. In our project we transferred the paper-based version into an electronic format and analyzed the structure with respect to formal errors. Several errors were detected in seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools that check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or specific providers is necessary. Clinical guidelines could then additionally be used for eLearning, process optimization and workflow management.

  18. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  19. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    NASA Technical Reports Server (NTRS)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of a system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.

  20. End users transforming experiences into formal information and process models for personalised health interventions.

    PubMed

    Lindgren, Helena; Lundin-Olsson, Lillemor; Pohl, Petra; Sandlund, Marlene

    2014-01-01

Five physiotherapists organised a user-centric design process of a knowledge-based support system for promoting exercise and preventing falls. The process integrated focus group studies with 17 older adults and prototyping. The transformation of informal medical and rehabilitation expertise and older adults' experiences into formal information and process models during the development was studied. As a tool they used ACKTUS, a development platform for knowledge-based applications. The process became agile and incremental, partly due to the diversity of expectations and preferences among both older adults and physiotherapists, and partly due to the participatory approach to design and development. In addition, there was a need to develop the knowledge content alongside the formal models and their presentations, which allowed the participants to test hands-on and evaluate the ideas, content and design. The resulting application is modular, extendable, flexible and adaptable to the individual end user. Moreover, the physiotherapists are able to modify the information and process models, and in this way further develop the application. The main constraint was found to be the lack of support for the initial phase of concept modelling, which led to a redesigned user interface and functionality of ACKTUS.

  1. The Markov process admits a consistent steady-state thermodynamic formalism

    NASA Astrophysics Data System (ADS)

    Peng, Liangrong; Zhu, Yi; Hong, Liu

    2018-01-01

    The search for a unified formulation for describing various non-equilibrium processes is a central task of modern non-equilibrium thermodynamics. In this paper, a novel steady-state thermodynamic formalism was established for general Markov processes described by the Chapman-Kolmogorov equation. Furthermore, corresponding formalisms of steady-state thermodynamics for the master equation and Fokker-Planck equation could be rigorously derived in mathematics. To be concrete, we proved that (1) in the limit of continuous time, the steady-state thermodynamic formalism for the Chapman-Kolmogorov equation fully agrees with that for the master equation; (2) a similar one-to-one correspondence could be established rigorously between the master equation and Fokker-Planck equation in the limit of large system size; (3) when a Markov process is restrained to one-step jump, the steady-state thermodynamic formalism for the Fokker-Planck equation with discrete state variables also goes to that for master equations, as the discretization step gets smaller and smaller. Our analysis indicated that general Markov processes admit a unified and self-consistent non-equilibrium steady-state thermodynamic formalism, regardless of underlying detailed models.
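
    A worked toy version of the master-equation case, assuming a three-state chain with an invented, strictly positive transition matrix: the steady state is the left Perron eigenvector, and the standard steady-state entropy production rate vanishes exactly when detailed balance holds.

    import numpy as np

    # Toy 3-state Markov chain; rows are current states and sum to one.
    P = np.array([[0.1, 0.6, 0.3],
                  [0.4, 0.2, 0.4],
                  [0.5, 0.3, 0.2]])

    # Steady state pi solves pi P = pi: left eigenvector for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()

    # Entropy production rate in the usual master-equation form; it is
    # zero iff detailed balance pi_i P_ij = pi_j P_ji holds.
    J = pi[:, None] * P                      # steady-state probability fluxes
    ep = 0.5 * np.sum((J - J.T) * np.log(J / J.T))
    print("steady state:", np.round(pi, 4), " entropy production:", round(ep, 4))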

  2. Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.

    PubMed

    Gomez, Christophe; Hartung, Niklas

    2018-01-01

    Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
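
    A minimal sketch of framework (1), assuming a homogeneous Poisson process with an invented emission rate; the chapter's models additionally let the emission rate depend on primary tumor size, which a thinning step would add.

    import random

    def emission_times(rate_per_day, horizon_days, rng=random.Random(1)):
        """Sample metastatic emission times on [0, horizon] from a
        homogeneous Poisson process via exponential inter-event gaps."""
        t, times = 0.0, []
        while True:
            t += rng.expovariate(rate_per_day)
            if t > horizon_days:
                return times
            times.append(t)

    print([round(t, 1) for t in emission_times(0.01, 1000.0)])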

  3. Reciprocal Relations Between Cognitive Neuroscience and Cognitive Models: Opposites Attract?

    PubMed Central

    Forstmann, Birte U.; Wagenmakers, Eric-Jan; Eichele, Tom; Brown, Scott; Serences, John T.

    2012-01-01

    Cognitive neuroscientists study how the brain implements particular cognitive processes such as perception, learning, and decision-making. Traditional approaches in which experiments are designed to target a specific cognitive process have been supplemented by two recent innovations. First, formal models of cognition can decompose observed behavioral data into multiple latent cognitive processes, allowing brain measurements to be associated with a particular cognitive process more precisely and more confidently. Second, cognitive neuroscience can provide additional data to inform the development of cognitive models, providing greater constraint than behavioral data alone. We argue that these fields are mutually dependent: not only can models guide neuroscientific endeavors, but understanding neural mechanisms can provide critical insights into formal models of cognition. PMID:21612972

  4. Formal Process Modeling to Improve Human Decision-Making in Test and Evaluation Acoustic Range Control

    DTIC Science & Technology

    2017-09-01

Test and...ambiguities and identify high-value decision points? This thesis explores how formalization of these experience-based decisions as a process model...representing a T&E event may reveal high-value decision nodes where certain decisions carry more weight or potential for impacts to a successful test.

  5. Longitudinal Associations Between Formal Volunteering and Cognitive Functioning.

    PubMed

    Proulx, Christine M; Curl, Angela L; Ermer, Ashley E

    2018-03-02

The present study examines the association between formal volunteering and cognitive functioning over time. We also examine the moderating roles of race, sex, education, and time. Using 11,100 participants aged 51 years and older and nine waves of data from the Health and Retirement Study, we simultaneously modeled the longitudinal associations between engaging in formal volunteering and changes in cognitive functioning using multilevel models. Formal volunteering was associated with higher levels of cognitive functioning over time, especially with aspects of cognitive functioning related to working memory and processing. This association was stronger for women than it was for men, and for those with below average levels of education. The positive association between formal volunteering and cognitive functioning weakened over time when cognitive functioning was conceptualized as memory, but strengthened over time when conceptualized as working memory and processing. Volunteering is a productive activity that is beneficial not just to society, but to volunteers' levels of cognitive functioning in older age. For women and those with lower levels of education, formal volunteering appears particularly beneficial to working memory and processing. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Formalization of the classification pattern: survey of classification modeling in information systems engineering.

    PubMed

    Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James

    2018-01-01

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, no more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powerset by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE's understanding of how to formalize the classification pattern. The various proposals are assessed using the classical example of classification; the Linnaean taxonomy formalized using powersets as a benchmark for formal expressiveness. The broad conclusion of the survey is that (1) the ISE community is currently in the early stages of the process of understanding how to formalize the classification pattern, particularly in the requirements for expressiveness exemplified by powersets, and (2) that there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place that the classification pattern has in domain modeling, this intervention has the potential to lead to significant improvements.
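
    The powerset benchmark itself is easy to state in code: a classification scheme reaches the survey's expressiveness benchmark when every admissible class corresponds to an element of the powerset of the individuals. The sketch below uses an invented two-lion example standing in for the Linnaean taxonomy.

    from itertools import combinations

    def powerset(xs):
        """All subsets of xs, i.e. the formal benchmark from the survey."""
        return [frozenset(c) for r in range(len(xs) + 1)
                for c in combinations(xs, r)]

    # A class (here a species) is modeled as a set of individuals, so it
    # is an element of the powerset of the domain of individuals.
    individuals = {"lion_1", "lion_2", "tiger_1"}
    panthera_leo = frozenset({"lion_1", "lion_2"})
    assert panthera_leo in powerset(individuals)
    print(len(powerset(individuals)), "admissible classes")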

  7. LinkEHR-Ed: a multi-reference model archetype editor based on formal semantics.

    PubMed

    Maldonado, José A; Moner, David; Boscá, Diego; Fernández-Breis, Jesualdo T; Angulo, Carlos; Robles, Montserrat

    2009-08-01

    To develop a powerful archetype editing framework capable of handling multiple reference models and oriented towards the semantic description and standardization of legacy data. The main prerequisite for implementing tools providing enhanced support for archetypes is the clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data. It covers the specialization of archetypes, the relationship between reference models and archetypes and conformance of data instances to archetypes. LinkEHR-Ed, a visual archetype editor based on the former formalization with advanced processing capabilities that supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts, is developed. LinkEHR-Ed is a useful tool for building, processing and validating archetypes based on any reference model.

  8. Behavior Models for Software Architecture

    DTIC Science & Technology

    2014-11-01

MP. Existing process modeling frameworks (BPEL, BPMN [Grosskopf et al. 2009], IDEF) usually follow the “single flowchart” paradigm. MP separates... Process: Business Process Modeling using BPMN, Meghan-Kiffer Press. HAREL, D., 1987, A Visual Formalism for Complex Systems, Science of Computer...

  9. IDEF3 Formalization Report

    DTIC Science & Technology

    1991-10-01

Subject terms: engineering management; information systems; method formalization; information engineering; process modeling; information systems requirements definition methods; knowledge acquisition methods; systems engineering.

  10. The cognitive processes underlying event-based prospective memory in school-age children and young adults: a formal model-based study.

    PubMed

    Smith, Rebekah E; Bayen, Ute J; Martin, Claudia

    2010-01-01

    Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory performance. The formal modeling results demonstrate that adults differed significantly from the 7-year-olds and the 10-year-olds on both the prospective component and the retrospective component of the task. The 7-year-olds and the 10-year-olds differed only in the ability to recognize prospective memory target events. The prospective memory task imposed a cost to ongoing activities in all 3 age groups. Copyright 2009 APA, all rights reserved.
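
    A generic sketch of how a multinomial process tree turns latent process probabilities into predicted response-category probabilities; this two-parameter tree (P for the prospective component, C for target recognition) and the parameter values are simplifications for illustration, not the published model's equations or estimates.

    # A correct prospective-memory response requires remembering the
    # intention (P) and then recognizing the target event (C); the
    # failure branches absorb the remaining probability mass.
    def category_probs(P, C):
        return {
            "pm_hit": P * C,
            "miss_recognition": P * (1 - C),
            "miss_prospective": 1 - P,
        }

    groups = {"7yr": (0.45, 0.60), "10yr": (0.45, 0.80), "adult": (0.75, 0.90)}
    for group, (P, C) in groups.items():
        probs = {k: round(v, 3) for k, v in category_probs(P, C).items()}
        print(group, probs)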

  11. Formal Definition of Measures for BPMN Models

    NASA Astrophysics Data System (ADS)

    Reynoso, Luis; Rolón, Elvira; Genero, Marcela; García, Félix; Ruiz, Francisco; Piattini, Mario

Business process models are currently attaining more relevance, and more attention is therefore being paid to their quality. This situation led us to define a set of measures for the understandability of BPMN models, which was presented in a previous work. We focus on understandability since a model must be well understood before any changes are made to it. These measures were originally defined informally, in natural language. As is well known, natural language is ambiguous and may lead to misunderstandings and a misinterpretation of the concepts captured by a measure and the way in which the measure value is obtained. This has motivated us to provide a formal definition of the proposed measures using OCL (Object Constraint Language) upon the BPMN (Business Process Modeling Notation) metamodel presented in this paper. The main advantages and lessons learned (which were obtained both from the current work and from previous works carried out in relation to the formal definition of other measures) are also summarized.

  12. The VATES-Diamond as a Verifier's Best Friend

    NASA Astrophysics Data System (ADS)

    Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz

    Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.

  13. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  14. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  15. Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B

    NASA Technical Reports Server (NTRS)

    Yeganefard, Sanaz; Butler, Michael; Rezazadeh, Abdolbaghi

    2010-01-01

Recently a set of guidelines, or cookbook, has been developed for the modelling and refinement of control problems in Event-B. The Event-B formal method is used for system-level modelling by defining the states of a system and the events which act on these states. It also supports refinement of models. This cookbook is intended to systematize the process of modelling and refining a control problem by distinguishing environment, controller and command phenomena. Our main objective in this paper is to investigate and evaluate the usefulness and effectiveness of this cookbook by following it throughout the formal modelling of the cruise control system found in cars. The outcomes are identifying the benefits of the cookbook and also giving guidance to its future users.

  16. New method of contour image processing based on the formalism of spiral light beams

    NASA Astrophysics Data System (ADS)

    Volostnikov, Vladimir G.; Kishkin, S. A.; Kotova, S. P.

    2013-07-01

    The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented.

  17. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formula. This model checker accepts models written in Process Meta Language (PROMELA), and its specifications are specified in LTL formulas.
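
    SPIN itself consumes PROMELA models, but the underlying idea, exhaustive exploration of a state graph in search of property violations, can be sketched directly; the toy ATM transition system and the safety property ("cash is never dispensed without PIN verification") below are invented.

    from collections import deque

    # States are (location, pin_verified); the property forbids reaching
    # ("dispense", False).
    def successors(state):
        loc, ok = state
        if loc == "idle":      return [("card_in", False)]
        if loc == "card_in":   return [("pin_check", False), ("idle", False)]
        if loc == "pin_check": return [("menu", True), ("idle", False)]
        if loc == "menu":      return [("dispense", ok), ("idle", False)]
        return [("idle", False)]             # dispense -> back to idle

    def check(init=("idle", False)):
        """Breadth-first search of all reachable states, as an explicit-
        state model checker would perform it."""
        seen, frontier = {init}, deque([init])
        while frontier:
            s = frontier.popleft()
            if s == ("dispense", False):
                return "VIOLATION: cash dispensed without PIN verification"
            for n in successors(s):
                if n not in seen:
                    seen.add(n)
                    frontier.append(n)
        return "safety property holds in all reachable states"

    print(check())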

  18. The Stochastic Early Reaction, Inhibition, and late Action (SERIA) model for antisaccades

    PubMed Central

    2017-01-01

    The antisaccade task is a classic paradigm used to study the voluntary control of eye movements. It requires participants to suppress a reactive eye movement to a visual target and to concurrently initiate a saccade in the opposite direction. Although several models have been proposed to explain error rates and reaction times in this task, no formal model comparison has yet been performed. Here, we describe a Bayesian modeling approach to the antisaccade task that allows us to formally compare different models on the basis of their evidence. First, we provide a formal likelihood function of actions (pro- and antisaccades) and reaction times based on previously published models. Second, we introduce the Stochastic Early Reaction, Inhibition, and late Action model (SERIA), a novel model postulating two different mechanisms that interact in the antisaccade task: an early GO/NO-GO race decision process and a late GO/GO decision process. Third, we apply these models to a data set from an experiment with three mixed blocks of pro- and antisaccade trials. Bayesian model comparison demonstrates that the SERIA model explains the data better than competing models that do not incorporate a late decision process. Moreover, we show that the early decision process postulated by the SERIA model is, to a large extent, insensitive to the cue presented in a single trial. Finally, we use parameter estimates to demonstrate that changes in reaction time and error rate due to the probability of a trial type (pro- or antisaccade) are best explained by faster or slower inhibition and the probability of generating late voluntary prosaccades. PMID:28767650
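
    A simulation sketch of the model's central mechanism: an early GO/NO-GO race and, when the reactive saccade is inhibited, a late voluntary process that issues the antisaccade. The Gaussian finishing-time distributions and all parameter values are invented, and the published likelihood is considerably more elaborate.

    import random

    def antisaccade_trial(rng):
        early_go = rng.gauss(0.20, 0.04)   # reactive prosaccade unit (s)
        inhibit  = rng.gauss(0.19, 0.05)   # early inhibition unit (s)
        late_go  = rng.gauss(0.35, 0.06)   # late voluntary unit (s)
        if early_go < inhibit:             # reactive saccade escapes
            return "prosaccade_error", early_go
        return "antisaccade", late_go      # inhibited; late process acts

    rng = random.Random(0)
    trials = [antisaccade_trial(rng) for _ in range(10000)]
    errors = sum(1 for kind, _ in trials if kind == "prosaccade_error")
    print("simulated error rate:", errors / len(trials))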

  19. Line Mixing in Parallel and Perpendicular Bands of CO2: A Further Test of the Refined Robert-Bonamy Formalism

    NASA Technical Reports Server (NTRS)

    Boulet, C.; Ma, Qiancheng; Tipping, R. H.

    2015-01-01

    Starting from the refined Robert-Bonamy formalism [Q. Ma, C. Boulet, and R. H. Tipping, J. Chem. Phys. 139, 034305 (2013)], we propose here an extension of line mixing studies to infrared absorptions of linear polyatomic molecules having stretching and bending modes. The present formalism does not neglect the internal degrees of freedom of the perturbing molecules, contrary to the energy corrected sudden (ECS) modeling, and enables one to calculate the whole relaxation matrix starting from the potential energy surface. Meanwhile, similar to the ECS modeling, the present formalism properly accounts for roles played by all the internal angular momenta in the coupling process, including the vibrational angular momentum. The formalism has been applied to the important case of CO2 broadened by N2. Applications to two kinds of vibrational bands (sigma yields sigma and sigma yields pi) have shown that the present results are in good agreement with both experimental data and results derived from the ECS model.

  20. Towards Using Reo for Compliance-Aware Business Process Modeling

    NASA Astrophysics Data System (ADS)

    Arbab, Farhad; Kokash, Natallia; Meng, Sun

Business process modeling and implementation of process supporting infrastructures are two challenging tasks that are not fully aligned. On the one hand, languages such as the Business Process Modeling Notation (BPMN) exist to capture business processes at the level of domain analysis. On the other hand, programming paradigms and technologies such as Service-Oriented Computing (SOC) and web services have emerged to simplify the development of the distributed web systems that underlie business processes. BPMN is the most recognized language for specifying process workflows at the early design steps. However, it is rather declarative and may lead to executable models which are incomplete or semantically erroneous. Therefore, an approach for expressing and analyzing BPMN models in a formal setting is required. In this paper we describe how BPMN diagrams can be represented by means of a semantically precise channel-based coordination language called Reo which admits formal analysis using model checking and bisimulation techniques. Moreover, since additional requirements may come from various regulatory/legislative documents, we discuss the opportunities offered by Reo and its mathematical abstractions for expressing process-related constraints such as Quality of Service (QoS) or time-aware conditions on process states.

  1. Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes

    NASA Technical Reports Server (NTRS)

    Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.

    2000-01-01

Formal capture and analysis of the required behavior of control systems have many advantages. For instance, it encourages rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time-consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML-e to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.

  2. The (Mathematical) Modeling Process in Biosciences.

    PubMed

    Torres, Nestor V; Santos, Guido

    2015-01-01

In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  3. Consulting Basics for the Teacher-Turned-Technology Consultant.

    ERIC Educational Resources Information Center

    Stager, Sue; Green, Kathy

    1988-01-01

    Discusses the role of educational technology consultants who may be classroom teachers with no formal training in consulting. Consulting models are described, including content-oriented and process-oriented approaches; Schein's process facilitator model is examined; and Kurpius' consulting model is explained and expanded. (LRW)

  4. Integrated Modeling and Simulation Verification, Validation, and Accreditation Strategy for Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    2006-01-01

Models and simulations (M&S) are critical resources in the exploration of space. They support program management, systems engineering, integration, analysis, test, and operations and provide critical information and data supporting key analyses and decisions (technical, cost and schedule). Consequently, there is a clear need to establish a solid understanding of M&S strengths and weaknesses, and the bounds within which they can credibly support decision-making. Their usage requires the implementation of a rigorous approach to verification, validation and accreditation (VV&A) and establishment of formal processes and practices associated with their application. To ensure decision-making is suitably supported by information (data, models, test beds) from activities (studies, exercises) and M&S applications that are understood and characterized, ESMD is establishing formal, tailored VV&A processes and practices. In addition, to ensure the successful application of M&S within ESMD, a formal process for the certification of analysts that use M&S is being implemented. This presentation will highlight NASA's Exploration Systems Mission Directorate (ESMD) management approach for M&S VV&A to ensure decision-makers receive timely information on a model's fidelity, credibility, and quality.

  5. The Many Perspectives of Valuing Learning

    ERIC Educational Resources Information Center

    Duvekot, Ruud

    2009-01-01

    Valuing Learning is the process of promoting participation in and outcomes of (formal or non-formal) learning and as such the organising principle for lifelong learning strategies. It aims at the recognition and validation of prior learning (VPL) and further development. Four main models of Valuing Learning can be distinguished: (1) the…

  6. A UML-based metamodel for software evolution process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing

    2014-04-01

A software evolution process is a set of interrelated software processes under which the corresponding software is evolving. This paper presents an object-oriented software evolution process meta-model (OO-EPMM), its abstract syntax, and formal OCL constraints on the meta-model. OO-EPMM can represent not only software development processes but also software evolution.

  7. Use of Statechart Assertions for Modeling Human-in-the-Loop Security Analysis and Decision-Making Processes

    DTIC Science & Technology

    2012-06-01

...checking leads to an improvement in the quality and success of enterprise software development. Business Process Modeling Notation (BPMN) is an emerging standard that allows business processes to be captured in a standardized format. BPMN lacks formal semantics, which leaves many of its features...

  8. A systematic approach to embedded biomedical decision making.

    PubMed

    Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver

    2012-11-01

Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems, therefore they have to work reliably. This paper describes how we applied systems engineering principles to design a high-performance embedded classification system in a systematic and well-structured way. We introduce the structured design approach by discussing requirements capturing, specification refinement, implementation and testing. Thereby, we follow systems engineering principles and execute each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision-making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm which enables us to integrate automated decision making in embedded systems. With a formal model we establish functionality, stability and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm. We found that, by adding SVM processes, an almost linear speedup is possible. Once we established these system properties, we translated the formal model into an implementation. The resulting implementation was tested using XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
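
    The near-linear speedup claim has the following experimental shape; in this sketch the per-chunk work is a synthetic stand-in for evaluating an SVM decision function, and only the Python standard library is used.

    import time
    from concurrent.futures import ProcessPoolExecutor

    def classify_chunk(chunk):
        # Stand-in for applying an SVM decision function to a data chunk.
        return [1 if sum((x - 0.5) ** 2 for x in row) < 0.1 else -1
                for row in chunk]

    def run(workers, data):
        t0 = time.perf_counter()
        step = len(data) // workers
        chunks = [data[i * step:(i + 1) * step] for i in range(workers)]
        with ProcessPoolExecutor(max_workers=workers) as ex:
            list(ex.map(classify_chunk, chunks))
        return time.perf_counter() - t0

    if __name__ == "__main__":
        data = [[(i % 1000) / 1000.0] * 8 for i in range(200000)]
        base = run(1, data)
        for w in (1, 2, 4):
            print(w, "worker(s): speedup", round(base / run(w, data), 2))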

  9. Equilibrium Free Energies from Nonequilibrium Metadynamics

    NASA Astrophysics Data System (ADS)

    Bussi, Giovanni; Laio, Alessandro; Parrinello, Michele

    2006-03-01

    In this Letter we propose a new formalism to map history-dependent metadynamics in a Markovian process. We apply this formalism to model Langevin dynamics and determine the equilibrium distribution of a collection of simulations. We demonstrate that the reconstructed free energy is an unbiased estimate of the underlying free energy and analytically derive an expression for the error. The present results can be applied to other history-dependent stochastic processes, such as Wang-Landau sampling.
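
    A one-dimensional sketch of the history-dependent mechanism: an overdamped Langevin walker in a double-well potential, with Gaussian bias hills deposited periodically along the visited trajectory. All parameters are invented, and the Letter's reweighting result is not reproduced.

    import math, random

    U = lambda x: (x ** 2 - 1.0) ** 2           # double-well potential
    hills = []                                  # centers of deposited bias

    def bias(x, w=0.05, s=0.2):
        return sum(w * math.exp(-(x - c) ** 2 / (2 * s ** 2)) for c in hills)

    def force(x, h=1e-4):                       # -(U + bias)' by central diff
        total = lambda y: U(y) + bias(y)
        return -(total(x + h) - total(x - h)) / (2 * h)

    rng, x, dt, kT = random.Random(2), -1.0, 1e-3, 0.2
    for step in range(20000):                   # Euler-Maruyama integration
        x += force(x) * dt + math.sqrt(2 * kT * dt) * rng.gauss(0, 1)
        if step % 200 == 0:
            hills.append(x)                     # history-dependent filling
    print("hills deposited:", len(hills), " final position:", round(x, 2))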

  10. An object-oriented approach for harmonization of multimedia markup languages

    NASA Astrophysics Data System (ADS)

    Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay

    2003-12-01

An object-oriented methodology is proposed in this research to harmonize several different markup languages. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and process of harmonization between eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology. This methodology can be generalized to various application domains.

  11. A Formal Valuation Framework for Emotions and Their Control.

    PubMed

    Huys, Quentin J M; Renz, Daniel

    2017-09-15

    Computational psychiatry aims to apply mathematical and computational techniques to help improve psychiatric care. To achieve this, the phenomena under scrutiny should be within the scope of formal methods. As emotions play an important role across many psychiatric disorders, such computational methods must encompass emotions. Here, we consider formal valuation accounts of emotions. We focus on the fact that the flexibility of emotional responses and the nature of appraisals suggest the need for a model-based valuation framework for emotions. However, resource limitations make plain model-based valuation impossible and require metareasoning strategies to apportion cognitive resources adaptively. We argue that emotions may implement such metareasoning approximations by restricting the range of behaviors and states considered. We consider the processes that guide the deployment of the approximations, discerning between innate, model-free, heuristic, and model-based controllers. A formal valuation and metareasoning framework may thus provide a principled approach to examining emotions. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  12. Representative Model of the Learning Process in Virtual Spaces Supported by ICT

    ERIC Educational Resources Information Center

    Capacho, José

    2014-01-01

    This paper shows the results of research activities for building a representative model of the learning process in virtual spaces (e-Learning). The formal basis of the model is supported by the analysis of models of learning assessment in virtual spaces, and specifically by Dembo's teaching-learning model, the systemic approach to evaluating…

  13. Basic Processes and Instructional Practices in Teaching Reading. Reading Education Report No. 7.

    ERIC Educational Resources Information Center

    Pearson, P. David; Kamil, Michael L.

    Informal reading models, although more like metaphors than truly scientific models, may be just as useful in making instructional decisions as formal models are in physical science. Models are a vital part of the instructional process even when teachers are not consciously aware of their presence. Three classes of reading models are bottom-up…

  14. Causal tapestries for psychology and physics.

    PubMed

    Sulis, William H

    2012-04-01

    Archetypal dynamics is a formal approach to the modeling of information flow in complex systems used to study emergence. It is grounded in the Fundamental Triad of realisation (system), interpretation (archetype) and representation (formal model). Tapestries play a fundamental role in the framework of archetypal dynamics as a formal representational system. They represent information flow by means of multi-layered, recursive, interlinked graphical structures that express both geometry (form or sign) and logic (semantics). This paper presents a detailed mathematical description of a specific tapestry model, the causal tapestry, selected for use in describing behaving systems such as appear in psychology and physics from the standpoint of Process Theory. Causal tapestries express an explicit Lorentz-invariant transient "now" generated by means of a reality game. Observables are represented by tapestry informons, while subjective or hidden components (for example intellectual and emotional processes) are incorporated into the reality game that determines the tapestry dynamics. As a specific example, we formulate a random graphical dynamical system using causal tapestries.

  15. Formal Semantics and Implementation of BPMN 2.0 Inclusive Gateways

    NASA Astrophysics Data System (ADS)

    Christiansen, David Raymond; Carbone, Marco; Hildebrandt, Thomas

    We present the first direct formalization of the semantics of inclusive gateways as described in the Business Process Modeling Notation (BPMN) 2.0 Beta 1 specification. The formal semantics is given for a minimal subset of BPMN 2.0 containing just the inclusive and exclusive gateways and the start and stop events. By focusing on this subset we achieve a simple graph model that highlights the particular non-local features of the inclusive gateway semantics. We sketch two ways of implementing the semantics using algorithms based on incrementally updated data structures and also discuss distributed communication-based implementations of the two algorithms.
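
    The non-local character of the inclusive (OR) join can be made concrete with a toy sketch (our simplified reading in Python, not the paper's formal graph model; the edge-marking encoding is ours): a join with at least one marked incoming edge must still wait if a token elsewhere could reach one of its empty incoming edges.

    ```python
    # Toy OR-join rule: fire only if no token elsewhere in the graph could
    # still reach one of the join's currently empty incoming edges.
    from collections import deque

    def reachable(graph, start):
        seen, queue = {start}, deque([start])
        while queue:
            for nxt in graph.get(queue.popleft(), []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    def or_join_enabled(graph, join, marking):
        # marking: set of edges (u, v) currently holding a token
        incoming = [(u, v) for u in graph for v in graph[u] if v == join]
        marked = {e for e in incoming if e in marking}
        if not marked:
            return False
        empty = [e for e in incoming if e not in marking]
        for (_, b) in marking - marked:          # tokens not yet at the join
            for (u, _) in empty:
                if u in reachable(graph, b):     # this token may still arrive
                    return False                 # so the join must wait
        return True

    graph = {"s": ["a", "b"], "a": ["j"], "b": ["j"], "j": []}
    print(or_join_enabled(graph, "j", {("a", "j"), ("s", "b")}))  # False: wait
    print(or_join_enabled(graph, "j", {("a", "j"), ("b", "j")}))  # True: fire
    ```

    The reachability test is what makes the semantics non-local: whether the join may fire depends on the global marking, not just on its own incoming edges.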

  16. Global How?--Linking Practice to Theory: A Competency Model for Training Global Learning Facilitators

    ERIC Educational Resources Information Center

    Büker, Gundula; Schell-Straub, Sigrid

    2017-01-01

    Global learning facilitators from civil society organizations (CSOs) design and enrich educational processes in formal and non-formal educational settings. They need to be empowered through adequate training opportunities in global learning (GL) contexts. The project Facilitating Global Learning--Key Competences from Members of European CSOs (FGL)…

  17. Indicators of Informal and Formal Decision-Making about a Socioscientific Issue

    ERIC Educational Resources Information Center

    Dauer, Jenny M.; Lute, Michelle L.; Straka, Olivia

    2017-01-01

    We propose two contrasting types of student decision-making based on social and cognitive psychology models of separate mental processes for problem solving. Informal decision-making uses intuitive reasoning and is subject to cognitive biases, whereas formal decision-making uses effortful, logical reasoning. We explored indicators of students'…

  18. Evaluation of the mathematical and economic basis for conversion processes in the LEAP energy-economy model

    NASA Astrophysics Data System (ADS)

    Oblow, E. M.

    1982-10-01

    An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.

  19. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.
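
    As a flavor of what such graph-based checks can surface (an illustrative Python fragment under our own, much simpler encoding; the paper's formalism is richer), an SOP modeled as a directed graph of instructions can be screened for steps that are never reached and steps that end without a defined outcome:

    ```python
    # Screen a directed instruction graph for two simple problem classes.
    def unreachable_steps(graph, start):
        seen, stack = {start}, [start]
        while stack:
            for nxt in graph.get(stack.pop(), []):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return set(graph) - seen

    def dead_ends(graph, terminal):
        return {s for s, succ in graph.items() if not succ and s not in terminal}

    sop = {"start": ["assess"], "assess": ["treat", "refer"],
           "treat": [], "refer": [], "legacy_step": ["treat"]}
    print(unreachable_steps(sop, "start"))              # {'legacy_step'}
    print(dead_ends(sop, terminal={"treat", "refer"}))  # set()
    ```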

  20. The (Mathematical) Modeling Process in Biosciences

    PubMed Central

    Torres, Nestor V.; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology. PMID:26734063

  1. Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Roberts, Larry W.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  2. Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web

    NASA Astrophysics Data System (ADS)

    Huang, Hong; Gong, Jianya

    2008-12-01

    GML can achieve geospatial interoperation only at the syntactic level. However, in most occasions it is necessary to resolve differences of spatial cognition in the first place, so ontology was introduced to describe geospatial information and services. But it is obviously difficult and inappropriate to let users find, match and compose services themselves, especially in occasions where there is complicated business logic. Currently, with the gradual introduction of Semantic Web technology (e.g., OWL, SWRL), the focus of the interoperation of geospatial information has shifted from the syntactic level to the semantic and even automatic, intelligent level. In this way, the Geospatial Semantic Web (GSM) can be put forward as an augmentation to the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of the GSM, we first attempt to construct mechanisms for the modeling and formal representation of geospatial knowledge, which are also the two most foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatic: we argue that geospatial context is a formal model of the distinguishing environmental characteristics of geospatial knowledge, and that the derivation, understanding and use of geospatial knowledge are located in geospatial context. Therefore, first, we put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed, and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Thirdly, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.

  3. Cyto-Sim: a formal language model and stochastic simulator of membrane-enclosed biochemical processes.

    PubMed

    Sedwards, Sean; Mazza, Tommaso

    2007-10-15

    Compartments and membranes are the basis of cell topology, and more than 30% of the human genome codes for membrane proteins. While it is possible to represent compartments and membrane proteins in a nominal way with many mathematical formalisms used in systems biology, few, if any, explicitly model the topology of the membranes themselves. Discrete stochastic simulation potentially offers the most accurate representation of cell dynamics. Since the details of every molecular interaction in a pathway are often not known, the relationship between chemical species is not necessarily best described at the lowest level, i.e. by mass action. Simulation is a form of computer-aided analysis, relying on human interpretation to derive meaning. To improve efficiency and gain meaning in an automatic way, it is necessary to have a formalism based on a model which has decidable properties. We present Cyto-Sim, a stochastic simulator of membrane-enclosed hierarchies of biochemical processes, where the membranes comprise an inner, outer and integral layer. The underlying model is based on formal language theory and has been shown to have decidable properties (Cavaliere and Sedwards, 2006), allowing formal analysis in addition to simulation. The simulator provides variable levels of abstraction via arbitrary chemical kinetics which link to ordinary differential equations. In addition to its compact native syntax, Cyto-Sim currently supports models described as Petri nets, can import all versions of SBML and can export SBML and MATLAB m-files. Cyto-Sim is available free, either as an applet or a stand-alone Java program, via the web page (http://www.cosbi.eu/Rpty_Soft_CytoSim.php). Other versions can be made available upon request.
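
    To give a concrete sense of the discrete stochastic simulation the abstract refers to, here is a generic Gillespie-style fragment in Python (our illustration, not Cyto-Sim's native syntax or engine): a single bimolecular reaction A + B -> C simulated with exponentially distributed waiting times.

    ```python
    import math
    import random

    def gillespie(a, b, c, k=0.01, t_end=10.0):
        t, traj = 0.0, [(0.0, a, b, c)]
        while True:
            propensity = k * a * b
            if propensity == 0.0:
                break                                   # no reactants left
            t += -math.log(1.0 - random.random()) / propensity
            if t > t_end:
                break
            a, b, c = a - 1, b - 1, c + 1               # fire A + B -> C once
            traj.append((t, a, b, c))
        return traj

    print(gillespie(100, 80, 0)[-1])                    # final (t, A, B, C)
    ```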

  4. A Multinomial Model of Event-Based Prospective Memory

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.

    2004-01-01

    Prospective memory is remembering to perform an action in the future. The authors introduce the 1st formal model of event-based prospective memory, namely, a multinomial model that includes 2 separate parameters related to prospective memory processes. The 1st measures preparatory attentional processes, and the 2nd measures retrospective memory…
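
    Since this abstract is truncated, the following is only a generic illustration of the multinomial processing tree (MPT) idea, not the authors' actual equations: response-category probabilities are sums of products of parameters along branches of a processing tree. With two hypothetical parameters, P for preparatory attention and M for retrospective memory, one simple branch structure would give

    \[ \Pr(\text{PM hit}) = P \cdot M, \qquad \Pr(\text{PM miss}) = P\,(1 - M) + (1 - P), \]

    and fitting such a model means choosing P and M to maximize the multinomial likelihood of the observed category counts.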

  5. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide a high coverage of stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
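
    As a hedged illustration of connecting test evidence across time slices (the structure and numbers below are ours, not the paper's), consider a minimal two-state dynamic Bayesian model of a hidden "defect present" variable with a noisy test outcome, updated by standard forward filtering:

    ```python
    TRANS = {True: {True: 0.9, False: 0.1},    # P(defect_t | defect_{t-1})
             False: {True: 0.2, False: 0.8}}
    EMIT = {True: 0.7, False: 0.05}            # P(test fails | defect state)

    def forward_step(belief, test_failed):
        pred = {s: sum(belief[p] * TRANS[p][s] for p in belief)
                for s in (True, False)}
        like = {s: EMIT[s] if test_failed else 1.0 - EMIT[s] for s in pred}
        post = {s: pred[s] * like[s] for s in pred}
        z = sum(post.values())
        return {s: post[s] / z for s in post}

    belief = {True: 0.1, False: 0.9}           # prior over defect presence
    for failed in (True, True, False):         # a short run of test outcomes
        belief = forward_step(belief, failed)
    print(belief)
    ```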

  6. Formal Methods of V&V of Partial Specifications: An Experience Report

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe an experiment in the application of the SCR method to testing consistency properties of a partial model of requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insight gained from formalizing a specification is valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem and deserves further study.

  7. Revisiting the PLUMBER Experiments from a Process-Diagnostics Perspective

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.; Ruddell, B. L.; Clark, M. P.; Nijssen, B.; Peters-Lidard, C. D.

    2017-12-01

    The PLUMBER benchmarking experiments [1] showed that some of the most sophisticated land models (CABLE, CH-TESSEL, COLA-SSiB, ISBA-SURFEX, JULES, Mosaic, Noah, ORCHIDEE) were outperformed, in simulations of half-hourly surface energy fluxes, by instantaneous, out-of-sample, and globally stationary regressions with no state memory. One criticism of PLUMBER is that the benchmarking methodology was not derived formally, so that applying a similar methodology with different performance metrics can result in qualitatively different results. Another common criticism of model intercomparison projects in general is that they offer little insight into process-level deficiencies in the models, and therefore are of marginal value for helping to improve the models. We address both of these issues by proposing a formal benchmarking methodology that also yields a formal and quantitative method for process-level diagnostics. We apply this to the PLUMBER experiments to show that (1) the PLUMBER conclusions were generally correct, in that the models use only a fraction of the information available to them from met forcing data (<50% by our analysis), and (2) all of the land models investigated by PLUMBER have similar process-level error structures, and therefore together do not represent a meaningful sample of structural or epistemic uncertainty. We conclude by suggesting two ways to improve the experimental design of model intercomparison and/or model benchmarking studies like PLUMBER. First, PLUMBER did not report model parameter values, and it is necessary to know these values to separate parameter uncertainty from structural uncertainty. This is a first-order requirement if we want to use intercomparison studies to provide feedback to model development. Second, technical documentation of land models is inadequate. Future model intercomparison projects should begin with a collaborative effort by model developers to document specific differences between model structures. This could be done in a reproducible way using a unified, process-flexible system like SUMMA [2]. [1] Best, M.J. et al. (2015) 'The plumbing of land surface models: benchmarking model performance', J. Hydrometeor. [2] Clark, M.P. et al. (2015) 'A unified approach for process-based hydrologic modeling: 1. Modeling concept', Water Resour. Res.

  8. "Taking Charge of One's Life": A Model for Weight Management Success

    ERIC Educational Resources Information Center

    Adams, Marlene

    2008-01-01

    Obesity is a serious, prevalent, and refractory disorder that increases with age particularly in women who enroll in formal weight loss treatments. This study examined the processes used by obese postmenopausal women as they participated in a formal weight loss program. Using grounded theory, interviews were conducted with 14 women engaged in a…

  9. Formal implementation of a performance evaluation model for the face recognition system.

    PubMed

    Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young

    2008-01-01

    Due to its usability, practical applications, and lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has recently been attracting considerable attention. Reported recognition rates of commercialized face recognition systems cannot be accepted as official recognition rates, as they are based on assumptions that favor the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for biometric recognition systems, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.
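
    For context, objective evaluation of a recognizer typically rests on score-based error rates; the following is a minimal, generic computation (our illustration, not the proposed evaluation model itself) of false accept and false reject rates at a decision threshold.

    ```python
    def far_frr(genuine, impostor, threshold):
        frr = sum(s < threshold for s in genuine) / len(genuine)
        far = sum(s >= threshold for s in impostor) / len(impostor)
        return far, frr

    genuine = [0.9, 0.8, 0.75, 0.6, 0.95]      # same-person match scores
    impostor = [0.3, 0.45, 0.2, 0.55, 0.1]     # different-person match scores
    print(far_frr(genuine, impostor, threshold=0.5))   # (0.2, 0.0)
    ```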

  10. A model of sexually and physically victimized women's process of attaining effective formal help over time: the role of social location, context, and intervention.

    PubMed

    Kennedy, Angie C; Adams, Adrienne; Bybee, Deborah; Campbell, Rebecca; Kubiak, Sheryl Pimlott; Sullivan, Cris

    2012-09-01

    As empirical evidence has demonstrated the pervasiveness of sexual assault and intimate partner violence in the lives of women, and the links to poor mental health outcomes, attention has turned to examining how women seek and access formal help. We present a conceptual model that addresses prior limitations and makes three key contributions: It foregrounds the influence of social location and multiple contextual factors; emphasizes the importance of the attainment of effective formal help that meets women's needs and leads to positive mental health outcomes; and highlights the role of interventions in facilitating help attainment. We conclude with research and practice implications.

  11. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there exists no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several of these shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
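
    A toy rendering of the classification-tree idea underlying CTM/ES (the input partitions below are invented for illustration, not taken from the article): each input dimension is partitioned into equivalence classes, and test cases are drawn as combinations of classes, here simply the full cartesian product.

    ```python
    from itertools import product

    classes = {
        "speed":       ["zero", "low", "high"],
        "temperature": ["cold", "nominal", "hot"],
        "sensor":      ["ok", "faulty"],
    }
    tests = list(product(*classes.values()))   # full combination set: 18 cases
    print(len(tests), "test cases; first:", dict(zip(classes, tests[0])))
    ```

    In practice the method selects a covering subset of combinations rather than the full product; the sketch only shows where the test cases come from.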

  12. Application of Non-Kolmogorovian Probability and Quantum Adaptive Dynamics to Unconscious Inference in Visual Perception Process

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2016-07-01

    Recently a novel quantum information formalism, quantum adaptive dynamics, was developed and applied to the modelling of information processing by bio-systems, including cognitive phenomena: from molecular biology (glucose-lactose metabolism of E. coli bacteria, epigenetic evolution) to cognition and psychology. From the foundational point of view, quantum adaptive dynamics describes the mutual adapting of the information states of two interacting systems (physical or biological), as well as the adapting of the co-observations performed by the systems. In this paper we apply this formalism to model unconscious inference: the process of transition from sensation to perception. The paper combines theory and experiment. Statistical data collected in an experimental study on recognition of a particular ambiguous figure, the Schröder stairs, support the viability of the quantum(-like) model of unconscious inference, including the modelling of biases generated by rotation contexts. From the probabilistic point of view, we study (for concrete experimental data) the problem of contextuality of probability, i.e., its dependence on experimental contexts. Mathematically, contextuality leads to non-Kolmogorovness: probability distributions generated by various rotation contexts cannot be treated within the Kolmogorovian framework. At the same time they can be embedded in a "big Kolmogorov space" as conditional probabilities. However, such a Kolmogorov space has too complex a structure, and the operational quantum formalism in the form of quantum adaptive dynamics simplifies the modelling essentially.
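
    The non-Kolmogorovian structure mentioned above is often expressed, in Khrennikov's formalism, through a formula of total probability with an interference term (we restate the generic two-context form, which may differ in detail from this paper's equations):

    \[ p(a) = p(c_1)\,p(a \mid c_1) + p(c_2)\,p(a \mid c_2) + 2\lambda \sqrt{p(c_1)\,p(a \mid c_1)\,p(c_2)\,p(a \mid c_2)}, \]

    where \( \lambda = 0 \) recovers the classical (Kolmogorovian) law of total probability, and \( |\lambda| \le 1 \) admits a quantum-like representation with trigonometric (cosine-type) interference.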

  13. Exploring the Process of Implementing Healthy Workplace Initiatives: Mapping to Kotter's Leading Change Model.

    PubMed

    Chappell, Stacie; Pescud, Melanie; Waterworth, Pippa; Shilton, Trevor; Roche, Dee; Ledger, Melissa; Slevin, Terry; Rosenberg, Michael

    2016-10-01

    The aim of this study was to use Kotter's leading change model to explore the implementation of workplace health and wellbeing initiatives. Qualitative interviews were conducted with 31 workplace representatives with a healthy workplace initiative. None of the workplaces used a formal change management model when implementing their healthy workplace initiatives. Not all of the steps in Kotter's model were considered necessary, and the order of the steps was challenged. For example, interviewees perceived that communicating the vision, developing the vision, and creating a guiding coalition were integral parts of the process, although there was less emphasis on the importance of creating a sense of urgency and consolidating change. Although none of the workplaces reported using a formal organizational change model when implementing their healthy workplace initiatives, there did appear to be perceived merit in using the steps in Kotter's model.

  14. Influencing Self-Reported Health among Rural Low-Income Women through Health Care and Social Service Utilization: A Structural Equation Model

    ERIC Educational Resources Information Center

    Bice-Wigington, Tiffany; Huddleston-Casas, Catherine

    2012-01-01

    Using structural equation modeling, this study examined the mesosystemic processes among rural low-income women, and how these processes subsequently influenced self-reported health. Acknowledging the behavioral processes inherent in utilization of health care and formal social support services, this study moved beyond a behavioral focus by…

  15. State of the art in pathology business process analysis, modeling, design and optimization.

    PubMed

    Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina

    2012-01-01

    For analyzing current workflows and processes, for improving them, for quality management and quality assurance, for integrating hardware and software components, but also for education, training and communication between experts of different domains, modeling business processes in a pathology department is indispensable. The authors highlight three main processes in pathology: general diagnostics, cytology diagnostics, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section are also considered.

  16. Towards Compensation Correctness in Interactive Systems

    NASA Astrophysics Data System (ADS)

    Vaz, Cátia; Ferreira, Carla

    One fundamental idea of service-oriented computing is that applications should be developed by composing already available services. Due to the long-running nature of service interactions, a main challenge in service composition is ensuring correctness of failure recovery. In this paper, we use a process calculus suitable for modelling long-running transactions with a recovery mechanism based on compensations. Within this setting, we discuss and formally state correctness criteria for compositions of compensable processes, assuming that each process is correct with respect to failure recovery. Under our theory, we formally interpret self-healing compositions, which can detect and recover from failures, as correct compositions of compensable processes.

  17. Gsflow-py: An integrated hydrologic model development tool

    NASA Astrophysics Data System (ADS)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHMs) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model-resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorological parameters over the model domain is difficult in complex terrain at the model-resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHMs has required extensive GIS and computer programming expertise, which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular-grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
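
    One of the laborious steps named above, distributing meteorological forcing over the model grid, can be sketched in a few lines (a hypothetical helper in the spirit of the toolkit; the names and the method are ours, and the actual scripts may use different routines), here with inverse-distance weighting:

    ```python
    import numpy as np

    def idw(station_xy, station_vals, cell_xy, power=2.0):
        # Inverse-distance-weighted interpolation of station values onto cells.
        d = np.linalg.norm(cell_xy[:, None, :] - station_xy[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-9) ** power        # avoid division by zero
        return (w * station_vals).sum(axis=1) / w.sum(axis=1)

    stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])   # station coords
    precip = np.array([12.0, 20.0, 16.0])                        # observed values
    cells = np.array([[2.0, 1.0], [7.0, 6.0]])                   # cell centers
    print(idw(stations, precip, cells))
    ```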

  18. Computing biological functions using BioΨ, a formal description of biological processes based on elementary bricks of actions

    PubMed Central

    Pérès, Sabine; Felicori, Liza; Rialle, Stéphanie; Jobard, Elodie; Molina, Franck

    2010-01-01

    Motivation: In the available databases, biological processes are described from molecular and cellular points of view, but these descriptions are represented with text annotations that make them difficult to handle computationally. Consequently, there is an obvious need for formal descriptions of biological processes. Results: We present a formalism that uses the BioΨ concepts to model biological processes from molecular details to networks. This computational approach, based on elementary bricks of actions, allows us to perform computations on biological functions (e.g. process comparison, mapping structure–function relationships, etc.). We illustrate its application with two examples: the functional comparison of proteases and the functional description of the glycolysis network. This computational approach is compatible with detailed biological knowledge and can be applied to different kinds of simulation systems. Availability: www.sysdiag.cnrs.fr/publications/supplementary-materials/BioPsi_Manager/ Contact: sabine.peres@sysdiag.cnrs.fr; franck.molina@sysdiag.cnrs.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20448138

  19. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault-tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  20. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. The shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, that is, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements, and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
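
    To make the MC/DC notion referenced above concrete (a toy Python enumeration, ours, not the coverage definitions of [8, 9]): for each condition in a decision, MC/DC requires a pair of tests that differ only in that condition and flip the decision's outcome.

    ```python
    from itertools import product

    def decision(a, b, c):
        # Example decision: d = A and (B or C)
        return a and (b or c)

    vectors = list(product([False, True], repeat=3))
    for i, name in enumerate(["A", "B", "C"]):
        # Pairs differing only in condition i that change the decision.
        pairs = [(u, v) for u in vectors for v in vectors
                 if u[i] != v[i]
                 and all(u[j] == v[j] for j in range(3) if j != i)
                 and decision(*u) != decision(*v)]
        print(name, "shown independent by", pairs[0])
    ```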

  1. Non-Lipschitz Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, M.; Meyers, R.

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED), a special type of 'man-made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics, which allows one to relax Lipschitz conditions at some discrete points.
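
    A standard example from the terminal-dynamics literature makes this concrete (illustrative; the paper's own equations may differ). The system

    \[ \dot{x} = -x^{1/3}, \qquad x(0) = x_0 > 0, \]

    has the solution \( x(t) = \left(x_0^{2/3} - \tfrac{2}{3}t\right)^{3/2} \), which reaches the equilibrium x = 0 exactly at the finite time \( t^* = \tfrac{3}{2} x_0^{2/3} \). Because \( \partial \dot{x}/\partial x = -\tfrac{1}{3} x^{-2/3} \to -\infty \) as \( x \to 0 \), the Lipschitz condition fails at the equilibrium; this failure is precisely what allows the state to be reached in finite time, unlike the asymptotic approach of Lipschitzian dynamics.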

  2. The FoReVer Methodology: A MBSE Framework for Formal Verification

    NASA Astrophysics Data System (ADS)

    Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald

    2013-08-01

    The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has so far been addressed through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE), derived in the System and Software Functional Requirement Techniques study (SSFRT), focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical and technological support for a systematic approach to space avionics system development, in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.

  3. Formalized landscape models for surveying and modelling tasks

    NASA Astrophysics Data System (ADS)

    Löwner, Marc-Oliver

    2010-05-01

    We present a formalization of the main geomorphic landscape models, principally the concept of slopes, to clarify the needs and potentials of surveying technologies and modelling approaches. Using the Unified Modelling Language (UML), it is implemented as an exchangeable Geography Markup Language (GML3)-based application schema and therefore supports shared measurement campaigns. Today, knowledge in geomorphology is given synoptically in textbooks in a more or less lyrical way. This knowledge is hard to implement for modelling algorithms or for data storage and sharing. On the other hand, physically based numerical modelling and high-resolution surveying technologies enable us to investigate case scenarios at small scales. Bringing together such approaches and organizing our data in an appropriate way requires the formalization of the concepts and knowledge archived in the science of geomorphology. The main problem in comparing research results in geomorphology is that the objects under investigation are composed of 3-dimensional geometries that change in time due to processes of material flux, e.g. soil erosion or mass movements. They have internal properties, e.g. soil texture or bulk density, that determine the effectiveness of these processes but are themselves subject to change. The presented application schema is available on the Internet and is therefore a first step towards enabling researchers to share information using an OGC Web Feature Service. In this vein, comparing modelling results of landscape evolution with the observations of other scientists becomes possible. Compared to prevalent data concepts, the model presented makes it possible to store information about landforms, their geometry and their characteristics in more detail. It allows representation of the 3D geometry, the set of material properties and the genesis of a landform by associating processes with a geoobject. Thus, time slices of a geomorphic system can be represented, as well as scenarios of landscape modelling. Commercial GI software is not adapted to the needs of the science of geomorphology. Therefore the development of an application model, i.e. a formal description of semantics, is imperative to partake in technologies like Web Feature Services supporting interoperable data transfer.

  4. Adaptive harvest management of North American waterfowl populations - recent successes and future prospects

    USGS Publications Warehouse

    Nichols, J.D.; Runge, M.C.; Johnson, F.A.; Williams, B.K.; Schodde, Richard; Hannon, Susan; Scheiffarth, Gregor; Bairlein, Franz

    2006-01-01

    The history of North American waterfowl harvest management has been characterized by attempts to use population monitoring data to make informed harvest management decisions. Early attempts can be characterized as intuitive decision processes, and later efforts were guided increasingly by population models and associated predictions. In 1995, a formal adaptive management process was implemented, and annual decisions about duck harvest regulations in the United States are still based on this process. This formal decision process is designed to deal appropriately with the various forms of uncertainty that characterize management decisions: environmental uncertainty, structural uncertainty, partial controllability and partial observability. The key components of the process are (1) objectives, (2) potential management actions, (3) model(s) of population response to management actions, (4) credibility measures for these models, and (5) a monitoring program. The operation of this iterative process is described, and a brief history of a decade of its use is presented. Future challenges range from social and political issues, such as appropriate objectives and management actions, to technical issues, such as multispecies management, geographic allocation of harvest, and incorporation of actions that include habitat acquisition and management.

  5. Formalization of the Access Control on ARM-Android Platform with the B Method

    NASA Astrophysics Data System (ADS)

    Ren, Lu; Wang, Wei; Zhu, Xiaodong; Man, Yujia; Yin, Qing

    2018-01-01

    ARM-Android is a widespread mobile platform whose multi-layer access control mechanisms are security-critical for the system. Many access control vulnerabilities still exist due to coarse-grained policies and numerous engineering defects, and these have been widely studied. However, few researches focus on formalizing the mechanisms themselves, including the Android permission framework, kernel process management and hardware isolation. This paper first develops a comprehensive formal access control model of the ARM-Android platform using the B method, from the Android middleware down to the hardware layer. All the model specifications are type-checked and proved to be well-defined, with 75% of proof obligations discharged automatically. The results show that the proposed B model is feasible for specifying and verifying access control schemes in the ARM-Android system, and capable of implementing a practical control module.

  6. Mechanisms of Developmental Change in Infant Categorization

    ERIC Educational Resources Information Center

    Westermann, Gert; Mareschal, Denis

    2012-01-01

    Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…

  7. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on the programming of logic controllers. It is important that the program code of a logic controller execute flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control-interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal transformation rules ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
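
    A minimal sketch of the place/transition firing rule at the core of Petri-net-based controller specifications (the encoding is ours; the article's control-interpreted nets additionally carry input/output signals):

    ```python
    def enabled(marking, pre):
        return all(marking.get(p, 0) >= n for p, n in pre.items())

    def fire(marking, pre, post):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n                      # consume input tokens
        for p, n in post.items():
            m[p] = m.get(p, 0) + n         # produce output tokens
        return m

    marking = {"p1": 1, "p2": 0}
    pre, post = {"p1": 1}, {"p2": 1}       # transition t: p1 -> p2
    if enabled(marking, pre):
        print(fire(marking, pre, post))    # {'p1': 0, 'p2': 1}
    ```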

  8. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.
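
    The flavor of such verification can be conveyed by a toy reachability check over an abstract state machine (a deliberately small Python stand-in of our own; the paper itself applies the nuXmv model checker to a rule-based logical model):

    ```python
    from collections import deque

    TRANSITIONS = {"idle": ["run"], "run": ["idle", "done"], "done": []}

    def never_reaches(start, bad="error"):
        # Breadth-first exploration of all reachable states.
        seen, queue = {start}, deque([start])
        while queue:
            for nxt in TRANSITIONS.get(queue.popleft(), []):
                if nxt == bad:
                    return False
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return True

    print(never_reaches("idle"))   # True: the bad state is unreachable
    ```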

  9. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is assured through a rigorous process that involves human factors, low- and high-fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from it are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem proving assistant.

  10. Symbolic LTL Compilation for Model Checking: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering, and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
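
    As a small concrete anchor (a textbook example, not drawn from this abstract): a typical LTL specification is the response property

    \[ \mathbf{G}\,(\mathit{req} \rightarrow \mathbf{F}\,\mathit{grant}), \]

    read "every request is eventually granted". Satisfiability checking via model checking rests on the observation that a formula \( \varphi \) is satisfiable exactly when \( \lnot\varphi \) is not valid, so checking \( \lnot\varphi \) against a model that permits all behaviors yields a counterexample trace that witnesses the satisfiability of \( \varphi \).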

  11. A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.

    PubMed

    Grando, M Adela; Glasspool, David; Fox, John

    2012-01-01

    To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages, based on a standard set of workflow patterns expressed as Petri nets (PNs) and on notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic, but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method, we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous, similar but informal, comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results from a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally, we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Terminal Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Meyers, Ronald

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED), a special type of 'man-made' system built to serve specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.

  13. Understanding the Influence of the Complex Relationships among Informal and Formal Supports on the Well-Being of Caregivers of Persons with Dementia

    ERIC Educational Resources Information Center

    Raina, Parminder; McIntyre, Chris; Zhu, Bin; McDowell, Ian; Santaguida, Pasqualina; Kristjansson, Betsy; Hendricks, Alexandra; Massfeller, Helen; Chambers, Larry

    2004-01-01

    This study examined the direct and indirect relationships between caring for a person with dementia and caregiver health. A conceptual model of the caregiver stress process considered informal caregiver characteristics, sources of caregiver stress, and the influence of informal and formal support on the well-being of the caregivers of persons with…

  14. A New View of Radiation-Induced Cancer: Integrating Short-and Long-Term Processes. Part I: Approach

    NASA Technical Reports Server (NTRS)

    Shuryak, Igor; Hahnfeldt, Philip; Hlatky, Lynn; Sachs, Rainer K.; Brenner, David J.

    2009-01-01

    Mathematical models of radiation carcinogenesis are important for understanding mechanisms and for interpreting or extrapolating risk. There are two classes of such models: (1) long-term formalisms that track premalignant cell numbers throughout an entire lifetime but treat initial radiation dose-response simplistically and (2) short-term formalisms that provide a detailed initial dose-response even for complicated radiation protocols, but address its modulation during the subsequent cancer latency period only indirectly. We argue that integrating short- and long-term models is needed. As an example of this novel approach, we integrate a stochastic short-term initiation/inactivation/repopulation model with a deterministic two-stage long-term model. Within this new formalism, the following assumptions are implemented: radiation initiates, promotes, or kills premalignant cells; a premalignant cell generates a clone, which, if it survives, quickly reaches a size limitation; the clone subsequently grows more slowly and can eventually generate a malignant cell; the carcinogenic potential of premalignant cells decreases with age.

  15. A dynamic dual process model of risky decision making.

    PubMed

    Diederich, Adele; Trueblood, Jennifer S

    2018-03-01

    Many phenomena in judgment and decision making are often attributed to the interaction of 2 systems of reasoning. Although these so-called dual process theories can explain many types of behavior, they are rarely formalized as mathematical or computational models. Rather, dual process models are typically verbal theories, which are difficult to conclusively evaluate or test. In the cases in which formal (i.e., mathematical) dual process models have been proposed, they have not been quantitatively fit to experimental data and are often silent when it comes to the timing of the 2 systems. In the current article, we present a dynamic dual process model framework of risky decision making that provides an account of the timing and interaction of the 2 systems and can explain both choice and response-time data. We outline several predictions of the model, including how changes in the timing of the 2 systems as well as time pressure can influence behavior. The framework also allows us to explore different assumptions about how preferences are constructed by the 2 systems as well as the dynamic interaction of the 2 systems. In particular, we examine 3 different possible functional forms of the 2 systems and 2 possible ways the systems can interact (simultaneously or serially). We compare these dual process models with 2 single process models using risky decision making data from Guo, Trueblood, and Diederich (2017). Using this data, we find that 1 of the dual process models significantly outperforms the other models in accounting for both choices and response times. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  16. From Informal Safety-Critical Requirements to Property-Driven Formal Validation

    NASA Technical Reports Server (NTRS)

    Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano

    2008-01-01

    Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics) and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, combining several complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling fault-tree analysis related to particular fault models; and verifying whether the specification is realizable.

  17. Answering Questions about Complex Events

    DTIC Science & Technology

    2008-12-19

    in their environment. To reason about events requires a means of describing, simulating, and analyzing their underlying dynamic processes. For our ... that are relevant to our goal of connecting inference and reasoning about processes to answering questions about events. We start with a ... different event and process descriptions, ontologies, and models. In AI, formal approaches to model the ability to reason about ...

  18. CellNOptR: a flexible toolkit to train protein signaling networks to data using multiple logic formalisms.

    PubMed

    Terfve, Camille; Cokelaer, Thomas; Henriques, David; MacNamara, Aidan; Goncalves, Emanuel; Morris, Melody K; van Iersel, Martijn; Lauffenburger, Douglas A; Saez-Rodriguez, Julio

    2012-10-18

    Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context.
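
    CellNOptR itself is an R package; as a language-neutral illustration of the underlying idea (pruning a prior-knowledge network so that its Boolean logic best reproduces perturbation data), here is a deliberately miniature sketch in Python, with an invented network and invented data:

      from itertools import combinations

      # Invented prior-knowledge network: candidate Boolean influences.
      candidate_edges = [("EGF", "Raf"), ("TNFa", "Raf"),
                         ("Raf", "ERK"), ("TNFa", "NFkB")]

      # Hypothetical perturbation data: (EGF, TNFa) inputs -> observed readouts.
      data = {
          (1, 0): {"ERK": 1, "NFkB": 0},
          (0, 1): {"ERK": 0, "NFkB": 1},
          (1, 1): {"ERK": 1, "NFkB": 1},
      }

      def simulate(edges, egf, tnfa):
          # Synchronous Boolean updates (OR over active parents) until settled.
          state = {"EGF": egf, "TNFa": tnfa, "Raf": 0, "ERK": 0, "NFkB": 0}
          for _ in range(3):
              new = dict(state)
              for node in ("Raf", "ERK", "NFkB"):
                  parents = [src for src, dst in edges if dst == node]
                  if parents:
                      new[node] = int(any(state[p] for p in parents))
              state = new
          return state

      def fit_error(edges):
          return sum(simulate(edges, *inp)[k] != v
                     for inp, obs in data.items() for k, v in obs.items())

      # Exhaustive model selection over all 16 edge subsets.
      subsets = [s for r in range(len(candidate_edges) + 1)
                 for s in combinations(candidate_edges, r)]
      best = min(subsets, key=fit_error)
      print("best-fitting edges:", best, "| error:", fit_error(best))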

  19. CellNOptR: a flexible toolkit to train protein signaling networks to data using multiple logic formalisms

    PubMed Central

    2012-01-01

    Background Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Results Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Conclusions Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context. PMID:23079107

  20. A coordination theory for intelligent machines

    NASA Technical Reports Server (NTRS)

    Wang, Fei-Yue; Saridis, George N.

    1990-01-01

    A formal model for the coordination level of intelligent machines is established. The framework of the coordination level investigated consists of one dispatcher and a number of coordinators. The model, called the coordination structure, is used to describe analytically the information structure and information flow for the coordination activities in the coordination level. Specifically, the coordination structure offers a formalism to (1) describe the task translation of the dispatcher and coordinators; (2) represent the individual process within the dispatcher and coordinators; (3) specify the cooperation and connection among the dispatcher and coordinators; (4) perform process analysis and evaluation; and (5) provide a control and communication mechanism for the real-time monitoring or simulation of the coordination process. A simple procedure for task scheduling in the coordination structure is presented. The task translation is achieved by a stochastic learning algorithm. The learning process is measured with entropy and its convergence is guaranteed. Finally, a case study of the coordination structure with three coordinators and one dispatcher for a simple intelligent manipulator system illustrates the proposed model, and the simulation of the task processes performed on the model verifies the soundness of the theory.
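
    A stochastic learning scheme of the kind described above can be sketched as follows, assuming (hypothetically) a linear reward-inaction automaton by which the dispatcher learns which coordinator to assign a task to; the success probabilities and learning rate are invented:

      import math, random

      def entropy(p):
          return -sum(q * math.log(q) for q in p if q > 0)

      success_prob = [0.9, 0.5, 0.2]   # hypothetical coordinator success rates
      p = [1 / 3] * 3                  # dispatcher's action probabilities
      a = 0.05                         # learning rate (reward-inaction scheme)

      for step in range(2001):
          i = random.choices(range(3), weights=p)[0]   # dispatch to coordinator i
          if random.random() < success_prob[i]:        # reward: shift mass to i
              p = [q + a * (1 - q) if j == i else q * (1 - a)
                   for j, q in enumerate(p)]
          if step % 500 == 0:
              print(f"step {step:4d}  entropy {entropy(p):.3f}")

      print("preferred coordinator:", max(range(3), key=lambda j: p[j]))

    The entropy of the action probabilities falls toward zero as the dispatcher's task translation converges, matching the entropy-based convergence measure described above.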

  1. A Model of Comparative Ethics Education for Social Workers

    ERIC Educational Resources Information Center

    Pugh, Greg L.

    2017-01-01

    Social work ethics education models have not effectively engaged social workers in practice in formal ethical reasoning processes, potentially allowing personal bias to affect ethical decisions. Using two of the primary ethical models from medicine, a new social work ethics model for education and practical application is proposed. The strengths…

  2. Applying Schema Theory to Mass Media Information Processing: Moving toward a Formal Model.

    ERIC Educational Resources Information Center

    Wicks, Robert H.

    Schema theory may be significant in determining if and how news audiences process information. For any given news topic, people have from none to many schemata (cognitive structures that represent organized knowledge about a given concept or type of stimulus abstracted from prior experience) upon which to draw. Models of how schemata are used…

  3. Unequal Bargaining? Australia's Aviation Trade Relations with the United States

    NASA Technical Reports Server (NTRS)

    Solomon, Russell

    2001-01-01

    International aviation trade bargaining is distinguished by its use of a formal process of bilateral bargaining based on the reciprocal exchange of rights by states. Australia-United States aviation trade relations are currently without rancour, but this has not always been the case and in the late 1980s and early 1990s, their formal bilateral aviation negotiations were a forum for a bitter conflict between two competing international aviation policies. In seeking to explain the bilateral aviation outcomes between Australia and the United States and how Australia has sought to improve upon these, analytical frameworks derived from international political economy were considered, along with the bilateral bargaining process itself. The paper adopts a modified neorealist model and concludes that to understand how Australia has sought to improve upon these aviation outcomes, neorealist assumptions that relative power capabilities determine outcomes must be qualified by reference to the formal bilateral bargaining process. In particular, Australia's use of this process and its application of certain bargaining tactics within that process remain critical to understanding bilateral outcomes.

  4. A discriminatory function for prediction of protein-DNA interactions based on alpha shape modeling.

    PubMed

    Zhou, Weiqiang; Yan, Hong

    2010-10-15

    Protein-DNA interaction has significant importance in many biological processes. However, the underlying principle of the molecular recognition process is still largely unknown. As more high-resolution 3D structures of protein-DNA complexes become available, the surface characteristics of the complex have become an important research topic. In our work, we apply an alpha shape model to represent the surface structure of the protein-DNA complex and develop an interface-atom curvature-dependent conditional probability discriminatory function for the prediction of protein-DNA interaction. The interface-atom curvature-dependent formalism captures atomic interaction details better than the atomic distance-based method. The proposed method provides good performance in discriminating the native structures from the docking decoy sets, and outperforms the distance-dependent formalism in terms of the z-score. Computer experiment results show that the curvature-dependent formalism with the optimal parameters can achieve a native z-score of -8.17 in discriminating the native structure from the highest surface-complementarity scored decoy set and a native z-score of -7.38 in discriminating the native structure from the lowest RMSD decoy set. The interface-atom curvature-dependent formalism can also be used to predict the apo version of DNA-binding proteins. These results suggest that the interface-atom curvature-dependent formalism has a good prediction capability for protein-DNA interactions. The code and data sets are available for download at http://www.hy8.com/bioinformatics.htm. Contact: kenandzhou@hotmail.com.
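
    The z-score criterion used above is straightforward to state in code; a small sketch with made-up energies (the curvature-dependent scoring itself requires the full alpha-shape machinery and is not reproduced here):

      import random, statistics

      def native_z_score(native_energy, decoy_energies):
          # More negative z means the native structure is scored further
          # below the decoy population, i.e. better discrimination.
          mu = statistics.mean(decoy_energies)
          sd = statistics.stdev(decoy_energies)
          return (native_energy - mu) / sd

      decoys = [random.gauss(0.0, 1.0) for _ in range(500)]  # made-up scores
      print("native z =", round(native_z_score(-7.5, decoys), 2))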

  5. Information system modeling for biomedical imaging applications

    NASA Astrophysics Data System (ADS)

    Hoo, Kent S., Jr.; Wong, Stephen T. C.

    1999-07-01

    Information system modeling has historically been relegated to a low priority among the designers of information systems. Often, there is a rush to design and implement hardware and software solutions after only the briefest assessments of the domain requirements. Although this process results in a rapid development cycle, the system usually does not satisfy the needs of the users and the developers are forced to re-program certain aspects of the system. It would be much better to create an accurate model of the system based on the domain needs so that the implementation of the solution satisfies the needs of the users immediately. It would also be advantageous to build extensibility into the model so that updates to the system could be carried out in an organized fashion. The significance of this research is the development of a new formal framework for the construction of a multimedia medical information system. This formal framework is constructed using visual modeling which provides a way of thinking about problems using models organized around real-world ideas. These models provide an abstract way to view complex problems, making them easier for one to understand. The formal framework is the result of an object-oriented analysis and design process that translates the systems requirements and functionality into software models. The usefulness of this information framework is demonstrated with two different applications in epilepsy research and care, i.e., surgical planning of epilepsy and decision threshold determination.

  6. 25 CFR 42.8 - What are a student's due process rights in a formal disciplinary proceeding?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... EDUCATION STUDENT RIGHTS § 42.8 What are a student's due process rights in a formal disciplinary proceeding? A student has the following due process rights in a formal disciplinary proceeding: (a) The right to ...

  7. A Typology for Modeling Processes in Clinical Guidelines and Protocols

    NASA Astrophysics Data System (ADS)

    Tu, Samson W.; Musen, Mark A.

    We analyzed the graphical representations that are used by various guideline-modeling methods to express process information embodied in clinical guidelines and protocols. From this analysis, we distilled four modeling formalisms and the processes they typically model: (1) flowcharts for capturing problem-solving processes, (2) disease-state maps that link decision points in managing patient problems over time, (3) plans that specify sequences of activities that contribute toward a goal, (4) workflow specifications that model care processes in an organization. We characterized the four approaches and showed that each captures some aspect of what a guideline may specify. We believe that a general guideline-modeling system must provide explicit representation for each type of process.

  8. Changing the world: the design and implementation of comprehensive continuous integrated systems of care for individuals with co-occurring disorders.

    PubMed

    Minkoff, Kenneth; Cline, Christie A

    2004-12-01

    This article has described the CCISC model and the process of systemic implementation of co-occurring disorder services enhancements within the context of existing resources. Four projects were described as illustrations of current implementation activities. Clearly, there is a need for improved services for these individuals, and increasing recognition of the need for systemic change models that are effective and efficient. The CCISC model has been recognized by SAMHSA as a consensus best practice for system design, and initial efforts at implementation appear to be promising. The existing toolkit may permit a more formal process of data-driven evaluation of system, program, clinician, and client outcomes, to better measure the effectiveness of this approach. Some projects have begun such formal evaluation processes, but more work is needed, not only with individual projects, but also to develop opportunities for multi-system evaluation, as more projects come on line.

  9. Biologically inspired information theory: Adaptation through construction of external reality models by living systems.

    PubMed

    Nakajima, Toshiyuki

    2015-12-01

    Higher animals act in the world using their external reality models to cope with the uncertain environment. Organisms that have not developed such information-processing organs may also have external reality models built in the form of their biochemical, physiological, and behavioral structures, acquired by natural selection through successful models constructed internally. Organisms subject to illusions would fail to survive in the material universe. How can organisms, or living systems in general, determine the external reality from within? This paper starts with a phenomenological model, in which the self constitutes a reality model developed through the mental processing of phenomena. Then, the it-from-bit concept is formalized using a simple mathematical model. For this formalization, my previous work on an algorithmic process is employed to constitute symbols referring to the external reality, called the inverse causality, with additional improvements to the previous work. Finally, as an extension of this model, the cognizers system model is employed to describe the self as one of many material entities in a world, each of which acts as a subject by responding to the surrounding entities. This model is used to propose a conceptual framework of information theory that can deal with both the qualitative (semantic) and quantitative aspects of the information involved in biological processes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Computational Evaluation of the Traceback Method

    ERIC Educational Resources Information Center

    Kol, Sheli; Nir, Bracha; Wintner, Shuly

    2014-01-01

    Several models of language acquisition have emerged in recent years that rely on computational algorithms for simulation and evaluation. Computational models are formal and precise, and can thus provide mathematically well-motivated insights into the process of language acquisition. Such models are amenable to robust computational evaluation,…

  11. Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs

    PubMed Central

    Bass, Ellen J.

    2011-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two phased process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930
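
    As a toy illustration of the kind of check this architecture supports, the sketch below exhaustively explores an invented two-variable pump model and verifies a safety property over all reachable states; the real instantiation described above involves far larger coupled models of mission, task behavior, interface, automation, and environment:

      from collections import deque

      # Invented two-variable device model: state = (infusing, door_open).
      def actions(state):
          infusing, door_open = state
          acts = []
          if not infusing and not door_open:
              acts.append(("start", (True, False)))
          if infusing:
              acts.append(("stop", (False, door_open)))
          if not infusing:
              acts.append(("open_door", (infusing, True)))
          if door_open:
              acts.append(("close_door", (infusing, False)))
          return acts

      def check(initial, safe):
          # Breadth-first reachability: return a counterexample trace or None.
          seen, queue = {initial}, deque([(initial, [])])
          while queue:
              state, trace = queue.popleft()
              if not safe(state):
                  return trace
              for name, nxt in actions(state):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append((nxt, trace + [name]))
          return None

      # Safety property: the pump never infuses while the door is open.
      cex = check((False, False), safe=lambda s: not (s[0] and s[1]))
      print("property holds" if cex is None else f"counterexample: {cex}")

    The state-explosion problem the abstract reports arises exactly here: real models multiply such variables together until the reachable set outgrows what exhaustive exploration can cover.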

  12. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    NASA Astrophysics Data System (ADS)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  13. Determining informative priors for cognitive models.

    PubMed

    Lee, Michael D; Vanpaemel, Wolf

    2018-02-01

    The development of cognitive models involves the creative scientific formalization of assumptions, based on theory, observation, and other relevant information. In the Bayesian approach to implementing, testing, and using cognitive models, assumptions can influence both the likelihood function of the model, usually corresponding to assumptions about psychological processes, and the prior distribution over model parameters, usually corresponding to assumptions about the psychological variables that influence those processes. The specification of the prior is unique to the Bayesian context, but often raises concerns that lead to the use of vague or non-informative priors in cognitive modeling. Sometimes the concerns stem from philosophical objections, but more often practical difficulties with how priors should be determined are the stumbling block. We survey several sources of information that can help to specify priors for cognitive models, discuss some of the methods by which this information can be formalized in a prior distribution, and identify a number of benefits of including informative priors in cognitive modeling. Our discussion is based on three illustrative cognitive models, involving memory retention, categorization, and decision making.
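
    A minimal sketch of one of the ideas above, assuming a simple exponential retention model and an informative prior on the decay rate concentrated near a value suggested by earlier studies (all numbers invented), with the posterior computed on a grid:

      import math

      # Retention model: P(recall at lag t) = exp(-theta * t).
      grid = [i / 1000 for i in range(1, 1000)]   # theta in (0, 1)

      # Informative prior: truncated normal concentrated near 0.2 (assumed).
      prior = [math.exp(-0.5 * ((th - 0.2) / 0.05) ** 2) for th in grid]

      # Hypothetical data: (lag, recalls k, trials n).
      data = [(1, 18, 20), (3, 12, 20), (9, 4, 20)]

      def log_likelihood(th):
          ll = 0.0
          for t, k, n in data:
              pr = min(max(math.exp(-th * t), 1e-12), 1 - 1e-12)
              ll += k * math.log(pr) + (n - k) * math.log(1 - pr)
          return ll

      post = [p * math.exp(log_likelihood(th)) for th, p in zip(grid, prior)]
      z = sum(post)
      mean = sum(th * p / z for th, p in zip(grid, post))
      print("posterior mean decay rate:", round(mean, 3))

    Replacing the prior list with a flat one shows what the vague-prior alternative discussed above gives up: with sparse data the informative prior keeps the estimate in the psychologically plausible range.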

  14. Formal Methods Case Studies for DO-333

    NASA Technical Reports Server (NTRS)

    Cofer, Darren; Miller, Steven P.

    2014-01-01

    RTCA DO-333, Formal Methods Supplement to DO-178C and DO-278A provides guidance for software developers wishing to use formal methods in the certification of airborne systems and air traffic management systems. The supplement identifies the modifications and additions to DO-178C and DO-278A objectives, activities, and software life cycle data that should be addressed when formal methods are used as part of the software development process. This report presents three case studies describing the use of different classes of formal methods to satisfy certification objectives for a common avionics example - a dual-channel Flight Guidance System. The three case studies illustrate the use of theorem proving, model checking, and abstract interpretation. The material presented is not intended to represent a complete certification effort. Rather, the purpose is to illustrate how formal methods can be used in a realistic avionics software development project, with a focus on the evidence produced that could be used to satisfy the verification objectives found in Section 6 of DO-178C.

  15. 25 CFR 42.6 - When does due process require a formal disciplinary hearing?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... RIGHTS § 42.6 When does due process require a formal disciplinary hearing? Unless local school policies and procedures provide for less, a formal disciplinary hearing is required before a suspension in ...

  16. One Giant Leap for Categorizers: One Small Step for Categorization Theory

    PubMed Central

    Smith, J. David; Ell, Shawn W.

    2015-01-01

    We explore humans’ rule-based category learning using analytic approaches that highlight their psychological transitions during learning. These approaches confirm that humans show qualitatively sudden psychological transitions during rule learning. These transitions contribute to the theoretical literature contrasting single vs. multiple category-learning systems, because they seem to reveal a distinctive learning process of explicit rule discovery. A complete psychology of categorization must describe this learning process, too. Yet extensive formal-modeling analyses confirm that a wide range of current (gradient-descent) models cannot reproduce these transitions, including influential rule-based models (e.g., COVIS) and exemplar models (e.g., ALCOVE). It is an important theoretical conclusion that existing models cannot explain humans’ rule-based category learning. The problem these models have is the incremental algorithm by which learning is simulated. Humans descend no gradient in rule-based tasks. Very different formal-modeling systems will be required to explain humans’ psychology in these tasks. An important next step will be to build a new generation of models that can do so. PMID:26332587
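
    The contrast drawn above can be caricatured in a few lines: an incremental (gradient-like) learner produces a smooth learning curve, while an all-or-none rule-discovery learner jumps from chance to ceiling on a single trial. The two learners below are deliberate caricatures, not implementations of COVIS or ALCOVE:

      import random

      random.seed(1)
      TRIALS = 200

      # Incremental learner: a stand-in for gradient descent, nudging an
      # association strength w upward on every trial.
      w, incremental = 0.0, []
      for _ in range(TRIALS):
          incremental.append(0.5 + 0.5 * w)
          w += 0.02 * (1 - w)

      # Rule-discovery learner: at chance until the rule is found (prob. 0.03
      # per trial), then at ceiling thereafter.
      found, discovery = False, []
      for _ in range(TRIALS):
          discovery.append(1.0 if found else 0.5)
          found = found or random.random() < 0.03

      for t in range(0, TRIALS, 40):
          print(f"trial {t:3d}: incremental {incremental[t]:.2f}   "
                f"rule-discovery {discovery[t]:.2f}")

    Averaging the rule-discovery curve over many simulated learners produces a deceptively smooth group curve, which is why the trial-level transition analyses described above are needed.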

  17. Fluent, Fast, and Frugal? A Formal Model Evaluation of the Interplay between Memory, Fluency, and Comparative Judgments

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.; Erdfelder, Edgar; Pohl, Rudiger F.

    2011-01-01

    A new process model of the interplay between memory and judgment processes was recently suggested, assuming that retrieval fluency--that is, the speed with which objects are recognized--will determine inferences concerning such objects in a single-cue fashion. This aspect of the fluency heuristic, an extension of the recognition heuristic, has…

  18. Rational Approximations to Rational Models: Alternative Algorithms for Category Learning

    ERIC Educational Resources Information Center

    Sanborn, Adam N.; Griffiths, Thomas L.; Navarro, Daniel J.

    2010-01-01

    Rational models of cognition typically consider the abstract computational problems posed by the environment, assuming that people are capable of optimally solving those problems. This differs from more traditional formal models of cognition, which focus on the psychological processes responsible for behavior. A basic challenge for rational models…

  19. Quantum-like model for the adaptive dynamics of the genetic regulation of E. coli's metabolism of glucose/lactose.

    PubMed

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2012-06-01

    We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in a bacterium, Escherichia coli. Our quantum-like model can be considered a kind of operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in great detail, we propose a formal operator description. Such a description may be very useful in situations in which a detailed description of processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by their metabolic systems. We demonstrate that the same type of E. coli can be described by well-determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.

  20. Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems

    NASA Technical Reports Server (NTRS)

    Bujorianu, Marius C.; Bujorianu, Manuela L.

    2009-01-01

    In this paper, we sketch a framework for interdisciplinary modeling of space systems by proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty, and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is given by cyber-physical systems. For these, the state of the art consists of collaborative multi-engineering efforts that call for an adequate formal foundation. To achieve this, we propose leveraging the traditional content of formal modeling through a co-engineering process.

  1. The Role of Informal and Formal Leisure Activities in the Disablement Process

    ERIC Educational Resources Information Center

    Janke, Megan C.; Payne, Laura L.; Van Puymbroeck, Marieke

    2008-01-01

    The disablement process model has been used as a framework to investigate factors that accelerate or decelerate disablement among older adults. Although very little is known about the direct and moderating effects of involvement in leisure activities on the disablement process, research has suggested that participation in leisure activities may…

  2. Readability and Recall of Short Prose Passages: A Theoretical Analysis.

    ERIC Educational Resources Information Center

    Miller, James R.; Kintsch, Walter

    1980-01-01

    To support the view of readability as an interaction between a text and the reader's prose-processing capabilities, this article applies an extended and formalized version of the Kintsch and van Dijk prose-processing model to 20 texts of varying readability. (Author/GSK)

  3. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  4. A discrimination-association model for decomposing component processes of the implicit association test.

    PubMed

    Stefanutti, Luca; Robusto, Egidio; Vianello, Michelangelo; Anselmi, Pasquale

    2013-06-01

    A formal model is proposed that decomposes the implicit association test (IAT) effect into three process components: stimulus discrimination, automatic association, and termination criterion. Both response accuracy and reaction time are considered. Four independent and parallel Poisson processes, one for each of the four label categories of the IAT, are assumed. The model parameters are the rate at which information accrues on the counter of each process and the amount of information that is needed before a response is given. The aim of this study is to present the model and an illustrative application in which the process components of a Coca-Pepsi IAT are decomposed.
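
    A minimal sketch of the core mechanism, assuming only two independent Poisson counting processes (rather than the four category-level processes of the full model) racing to a count criterion; rates and threshold are invented:

      import random

      def poisson_race(rate_correct=8.0, rate_error=3.0, threshold=5):
          # Inter-event times of a Poisson process are exponential, so the
          # time to accumulate `threshold` counts is a sum of exponentials.
          t_correct = sum(random.expovariate(rate_correct)
                          for _ in range(threshold))
          t_error = sum(random.expovariate(rate_error)
                        for _ in range(threshold))
          return t_correct < t_error, min(t_correct, t_error)

      trials = [poisson_race() for _ in range(5000)]
      accuracy = sum(ok for ok, _ in trials) / len(trials)
      mean_rt = sum(rt for _, rt in trials) / len(trials)
      print(f"accuracy {accuracy:.3f}, mean RT {mean_rt:.3f} (arbitrary units)")

    In a compatible IAT block the accrual rate feeding the correct counter would be higher than in an incompatible block, yielding the faster, more accurate responses the IAT effect summarizes.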

  5. A Control Theory Model of Smoking

    PubMed Central

    Bobashev, Georgiy; Holloway, John; Solano, Eric; Gutkin, Boris

    2017-01-01

    We present a heuristic control theory model that describes smoking under restricted and unrestricted access to cigarettes. The model is based on the allostasis theory and uses a formal representation of a multiscale opponent process. The model simulates smoking behavior of an individual and produces both short-term (“loading up” after not smoking for a while) and long-term smoking patterns (e.g., gradual transition from a few cigarettes to one pack a day). By introducing a formal representation of withdrawal- and craving-like processes, the model produces gradual increases over time in withdrawal- and craving-like signals associated with abstinence and shows that after 3 months of abstinence, craving disappears. The model was programmed as a computer application allowing users to select simulation scenarios. The application links images of brain regions that are activated during the binge/intoxication, withdrawal, or craving with corresponding simulated states. The model was calibrated to represent smoking patterns described in peer-reviewed literature; however, it is generic enough to be adapted to other drugs, including cocaine and opioids. Although the model does not mechanistically describe specific neurobiological processes, it can be useful in prevention and treatment practices as an illustration of drug-using behaviors and expected dynamics of withdrawal and craving during abstinence. PMID:28868531
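
    A minimal numerical sketch of an opponent-process mechanism of the kind the model formalizes: a fast a-process driven by each cigarette recruits a slow opposing b-process whose residue appears as a withdrawal/craving-like signal during abstinence. The dynamics and constants below are invented and compress the time scales of the published model:

      # Euler integration of a two-process opponent mechanism (invented).
      dt, hours = 0.1, 24 * 7                 # one simulated week
      a = b = 0.0                             # fast a-process, slow b-process
      ka, kb = 2.0, 0.05                      # decay rates per hour

      for step in range(int(hours / dt)):
          t = step * dt
          if t < 24 * 4 and step % int(1 / dt) == 0:
              a += 1.0                        # one cigarette per hour for 4 days
          a -= ka * a * dt                    # fast decay of the drug effect
          b += (0.5 * a - kb * b) * dt        # slow opponent recruited by a
          if step % int(12 / dt) == 0:
              craving = max(0.0, b - a)       # net aversive signal
              print(f"day {t / 24:4.1f}  craving-like signal {craving:5.2f}")

    After the simulated quit on day 4, the fast process vanishes quickly while the slow opponent decays over days, reproducing the rise-then-fade of the craving-like signal described above.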

  6. A Formal Investigation of Human Spatial Control Skills: Mathematical Formalization, Skill Development, and Skill Assessment

    NASA Astrophysics Data System (ADS)

    Li, Bin

    Spatial control behaviors account for a large proportion of human everyday activities, from normal daily tasks, such as reaching for objects, to specialized tasks, such as driving, surgery, or operating equipment. These behaviors involve intensive interactions among internal processes (i.e., cognitive, perceptual, and motor control) and with the physical world. This dissertation builds on the concept of the interaction pattern and on a hierarchical functional model. An interaction pattern represents a type of behavioral synergy by which humans coordinate cognitive, perceptual, and motor control processes. It contributes to the construction of the hierarchical functional model, which delineates human spatial control behaviors as the coordination of three functional subsystems: planning, guidance, and tracking/pursuit. This dissertation formalizes and validates these two theories and extends them to the investigation of human spatial control skills, encompassing development and assessment. Specifically, the dissertation first presents an overview of studies of human spatial control skills, covering definition, characteristics, development, and assessment, to provide theoretical evidence for the concept of the interaction pattern and the hierarchical functional model. Next, the human experiments for collecting motion and gaze data, and the techniques used to register and classify gaze data, are described. The dissertation then elaborates and mathematically formalizes the hierarchical functional model and the concept of the interaction pattern. These theories enable the construction of a succinct simulation model that can reproduce a variety of human performance with a minimal set of hypotheses, validating the hierarchical functional model as a normative framework for interpreting human spatial control behaviors. The dissertation then investigates human skill development and captures the emergence of interaction patterns. The final part applies the hierarchical functional model to skill assessment and introduces techniques to capture interaction patterns both from the top down, using their geometric features, and from the bottom up, using their dynamical characteristics. The validity and generality of the skill assessment are illustrated using two experiments: remote-control flight and laparoscopic surgical training.

  7. Modeling Narrative Discourse

    ERIC Educational Resources Information Center

    Elson, David K.

    2012-01-01

    This thesis describes new approaches to the formal modeling of narrative discourse. Although narratives of all kinds are ubiquitous in daily life, contemporary text processing techniques typically do not leverage the aspects that separate narrative from expository discourse. We describe two approaches to the problem. The first approach considers…

  8. Validating archetypes for the Multiple Sclerosis Functional Composite.

    PubMed

    Braun, Michael; Brandt, Alexander Ulrich; Schulz, Stefan; Boeker, Martin

    2014-08-03

    Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects are not yet sufficiently addressed. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. A standard archetype development approach was applied on a case set of three clinical tests for multiple sclerosis assessment: After an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal to validate archetypes pragmatically. The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability. The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions. This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model.

  9. Validating archetypes for the Multiple Sclerosis Functional Composite

    PubMed Central

    2014-01-01

    Background Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects are not yet sufficiently addressed. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. Methods A standard archetype development approach was applied on a case set of three clinical tests for multiple sclerosis assessment: After an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Results Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal to validate archetypes pragmatically. Conclusions The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability. The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions. This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model. PMID:25087081

  10. Learning the Norm of Internality: NetNorm, a Connectionist Model

    ERIC Educational Resources Information Center

    Thierry, Bollon; Adeline, Paignon; Pascal, Pansu

    2011-01-01

    The objective of the present article is to show that connectionist simulations can be used to model some of the socio-cognitive processes underlying the learning of the norm of internality. For our simulations, we developed a connectionist model which we called NetNorm (based on Dual-Network formalism). This model is capable of simulating the…

  11. Tempo: A Toolkit for the Timed Input/Output Automata Formalism

    DTIC Science & Technology

    2008-01-30

    generation of distributed code from specifications. F.4.3 [Formal Languages]: Tempo; D.3 [Programming Languages]. Many distributed systems involve a combination of ... and require the simulator to check the assertions after every single step ... The chek(i) transition is enabled when process i's program counter is set to ... The Tempo simulator addresses this issue by putting the modeler in charge of resolving the nondeterminism ...

  12. A Retrieval Model for Both Recognition and Recall.

    ERIC Educational Resources Information Center

    Gillund, Gary; Shiffrin, Richard M.

    1984-01-01

    The Search of Associative Memory (SAM) model for recall is extended by assuming that a familiarity process is used for recognition. The model, formalized in a computer simulation program, correctly predicts a number of findings in the literature as well as results from an experiment on the word-frequency effect. (Author/BW)

  13. 78 FR 35826 - Unfair Competitive Advantages; Enhancement of the Formal Complaint Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-14

    ... Postal Service to the time and expense of the discovery process. The Commission anticipates that allowing ... 1739] Unfair Competitive Advantages; Enhancement of the Formal Complaint Process. AGENCY: Postal ... enhance the formal complaint process in cases involving alleged violations of a law that prohibits the ...

  14. Analysis of Phase-Type Stochastic Petri Nets With Discrete and Continuous Timing

    NASA Technical Reports Server (NTRS)

    Jones, Robert L.; Goode, Plesent W. (Technical Monitor)

    2000-01-01

    The Petri net formalism is useful in studying many discrete-state, discrete-event systems exhibiting concurrency, synchronization, and other complex behavior. As a bipartite graph, the net can conveniently capture salient aspects of the system. As a mathematical tool, the net can specify an analyzable state space. Indeed, one can reason about certain qualitative properties (from state occupancies) and how they arise (the sequence of events leading there). By introducing deterministic or random delays, the model is forced to sojourn in states some amount of time, giving rise to an underlying stochastic process, one that can be specified in a compact way and capable of providing quantitative, probabilistic measures. We formalize a new non-Markovian extension to the Petri net that captures both discrete and continuous timing in the same model. The approach affords efficient, stationary analysis in most cases and efficient transient analysis under certain restrictions. Moreover, this new formalism has the added benefit in modeling fidelity stemming from the simultaneous capture of discrete- and continuous-time events (as opposed to capturing only one and approximating the other). We show how the underlying stochastic process, which is non-Markovian, can be resolved into simpler Markovian problems that enjoy efficient solutions. Solution algorithms are provided that can be easily programmed.
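
    A discrete-event sketch of the modeling-fidelity point above: one transition with a continuous (exponential) delay and one with a discrete (deterministic) delay coexist in the same tiny net. This is simulation rather than the numerical solution of the underlying process the paper develops, and the net and rates are invented:

      import random

      # Tiny net: idle --arrive--> busy --serve--> idle. "arrive" fires after
      # an exponential (continuous) delay, "serve" after a fixed deterministic
      # (discrete) delay, mixing both timing types in one model.
      def simulate(horizon=10_000.0, arrival_rate=1.0, service_time=0.4):
          t, busy_time = 0.0, 0.0
          busy = False
          while t < horizon:
              if not busy:
                  t += random.expovariate(arrival_rate)   # continuous timing
              else:
                  t += service_time                       # deterministic timing
                  busy_time += service_time
              busy = not busy
          return busy_time / t

      # Alternating-renewal check: P(busy) = 0.4 / (1.0 + 0.4) ~ 0.286.
      print("stationary P(busy) ~", round(simulate(), 3))

    The comment gives the closed-form stationary answer for this toy net; the formalism described above targets exactly such mixed-timing models but solves them analytically by decomposition into Markovian subproblems.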

  15. Applying Evidence-Based Medicine in Telehealth: An Interactive Pattern Recognition Approximation

    PubMed Central

    Fernández-Llatas, Carlos; Meneu, Teresa; Traver, Vicente; Benedi, José-Miguel

    2013-01-01

    Born in the early nineteen nineties, evidence-based medicine (EBM) is a paradigm intended to promote the integration of biomedical evidence into physicians' daily practice. This paradigm requires the continuous study of diseases to provide the best scientific knowledge for closely supporting physicians in their diagnoses and treatments. Within this paradigm, health experts usually create and publish clinical guidelines, which provide holistic guidance for the care of a certain disease. The creation of these clinical guidelines requires a hard iterative process in which each iteration represents scientific progress in the knowledge of the disease. To perform this guidance through telehealth, the use of formal clinical guidelines will allow the building of care processes that can be interpreted and executed directly by computers. In addition, the formalization of clinical guidelines opens the possibility of building automatic methods, using pattern recognition techniques, to estimate the proper models, as well as the mathematical models for optimizing the iterative cycle for the continuous improvement of the guidelines. However, to ensure the efficiency of the system, it is necessary to build a probabilistic model of the problem. In this paper, an interactive pattern recognition approach to support professionals in evidence-based medicine is formalized. PMID:24185841

  16. Power-law modeling based on least-squares minimization criteria.

    PubMed

    Hernández-Bermejo, B; Fairén, V; Sorribas, A

    1999-10-01

    The power-law formalism has been successfully used as a modeling tool in many applications. The resulting models, either as Generalized Mass Action or as S-systems models, allow one to characterize the target system and to simulate its dynamical behavior in response to external perturbations and parameter changes. The power-law formalism was first derived as a Taylor series approximation in logarithmic space for kinetic rate-laws. The special characteristics of this approximation produce an extremely useful systemic representation that allows a complete system characterization. Furthermore, its parameters have a precise interpretation as local sensitivities of each of the individual processes and as rate-constants. This facilitates a qualitative discussion and a quantitative estimation of their possible values in relation to the kinetic properties. Following this interpretation, parameter estimation is also possible by relating the systemic behavior to the underlying processes. Without leaving the general formalism, in this paper we suggest deriving the power-law representation in an alternative way that uses least-squares minimization. The resulting power-law mimics the target rate-law over a wider range of concentration values than the classical power-law. Although the implications of this alternative approach remain to be established, our results show that the steady-state predicted using the least-squares power-law is closest to the actual steady-state of the target system.
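
    The two derivations can be compared numerically on an illustrative Michaelis-Menten rate law: the classical power law comes from a Taylor expansion of log v in log c at an operating point, the alternative from least-squares fitting of log v over a concentration range (constants invented):

      import math

      Vmax, Km = 1.0, 1.0
      v = lambda c: Vmax * c / (Km + c)        # illustrative target rate law

      # Classical power law: Taylor expansion of log v in log c at c0.
      c0 = 1.0
      g_taylor = Km / (Km + c0)                # kinetic order d(log v)/d(log c)
      k_taylor = v(c0) / c0 ** g_taylor

      # Least-squares power law: fit log v = log k + g log c over a range.
      cs = [0.1 * i for i in range(1, 101)]    # c in [0.1, 10]
      xs = [math.log(c) for c in cs]
      ys = [math.log(v(c)) for c in cs]
      n = len(cs)
      xbar, ybar = sum(xs) / n, sum(ys) / n
      g_ls = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
              / sum((x - xbar) ** 2 for x in xs))
      k_ls = math.exp(ybar - g_ls * xbar)

      for c in (0.2, 1.0, 5.0):
          print(f"c={c:4.1f}  true {v(c):.3f}  "
                f"Taylor {k_taylor * c ** g_taylor:.3f}  "
                f"least-squares {k_ls * c ** g_ls:.3f}")

    The Taylor version is exact at the operating point but degrades away from it, while the least-squares version trades pointwise exactness for accuracy over the whole fitted range, which is the contrast the abstract draws.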

  17. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    NASA Astrophysics Data System (ADS)

    dell'Anno, Fabio; de Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, Phys. Rev. A 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  18. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Anno, Fabio; De Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, Phys. Rev. A 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  19. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  20. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  1. Optimizing Outcome in the University-Industry Technology Transfer Projects

    NASA Astrophysics Data System (ADS)

    Alavi, Hamed; Hąbek, Patrycja

    2016-06-01

    Transferring the inventions of academic scientists to private enterprises for the purpose of commercialization is long known as University-Industry (Firm) Technology Transfer. While the importance of this phenomenon is rising simultaneously in the public and private sectors, only a fraction of patented academic inventions succeed in passing through the process of commercialization. Although the formal technology transfer process and the licensing of patented innovations to third parties are the main legal tools for safeguarding the rights of academic inventors in the commercialization of their inventions, they are not sufficient for transmitting the tacit knowledge that is necessary to exploit the transferred technology. The existence of reciprocal and complementary relations between the formal and informal technology transfer processes has resulted in the formation of different models of university-industry organizational collaboration, or even integration, in which licensee firms keep contact with academic inventors after gaining the legal right to commercialize their patented inventions. The current paper argues that although patents are necessary to legally pass the right to commercialize an invention, they are not sufficient for complete knowledge transmission in the process of technology transfer. The lack of efficiency of formal mechanisms in closing the technology transfer loop creates an opportunity for innovative interpersonal and organizational connections between the patentee and the licensee company. With emphasis on the need for further elaboration of informal mechanisms as a critical and underappreciated aspect of the technology transfer process, the article addresses the following questions: How can the knowledge transmission process be optimized in the framework of university-industry technology transfer projects? What is the theoretical basis for the university-industry technology transfer process? And which collaborative organizational models can enhance overall performance by improving the transmission of knowledge in the university-firm technology transfer process?

  2. Cognitive Support During High-Consequence Episodes of Care in Cardiovascular Surgery.

    PubMed

    Conboy, Heather M; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Christov, Stefan C; Goldman, Julian M; Yule, Steven J; Zenati, Marco A

    2017-03-01

    Despite significant efforts to reduce preventable adverse events in medical processes, such events continue to occur at unacceptable rates. This paper describes a computer science approach that uses formal process modeling to provide situationally aware monitoring and management support to medical professionals performing complex processes. These process models represent both normative and non-normative situations, and are validated by rigorous automated techniques such as model checking and fault tree analysis, in addition to careful review by experts. Context-aware Smart Checklists are then generated from the models, providing cognitive support during high-consequence surgical episodes. The approach is illustrated with a case study in cardiovascular surgery.

  3. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative "Reducing Software Security Risk (RSSR) Through an Integrated Approach" offers, among its capabilities, formal verification of software security properties, through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.

  4. Using ICT techniques for improving mechatronic systems' dependability

    NASA Astrophysics Data System (ADS)

    Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe

    2013-10-01

    The use of analysis techniques such as simulation and formal verification for the analysis of industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require a high investment in skilled human resources with sufficient theoretical knowledge of those domains. This paper aims mainly to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process and promotes a discussion of the main difficulties that can be encountered and a possibility for handling them. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which has nowadays become a common delivery model for many applications because SaaS is typically accessed by users via the internet.

  5. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  6. UML activity diagrams in requirements specification of logic controllers

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał

    2015-12-01

    Logic controller specification can be prepared using various techniques. One of them is the widely understandable and user-friendly UML language and its activity diagrams. Using formal methods during the design phase increases the assurance that the implemented system meets the project requirements. In our approach we use the model checking technique to formally verify a specification against user-defined behavioral requirements. The properties are usually defined as temporal logic formulas. In the paper we propose to use UML activity diagrams for requirements definition and then to formalize them as temporal logic formulas. As a result, UML activity diagrams can be used both for logic controller specification and for requirements definition, which simplifies the specification and verification process.
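
    As an illustration only (these formulas are not taken from the paper), a safety and a liveness requirement for a hypothetical logic controller could be written in linear temporal logic as:

        % Illustrative LTL requirements for a hypothetical logic controller.
        % Safety: the motor is never running while the guard is open.
        \mathbf{G}\,\neg(\mathit{motorOn} \land \mathit{guardOpen})
        % Liveness: every start request is eventually followed by motor activation.
        \mathbf{G}\,(\mathit{startRequest} \rightarrow \mathbf{F}\,\mathit{motorOn})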

  7. Heterogeneous continuous-time random walks

    NASA Astrophysics Data System (ADS)

    Grebenkov, Denis S.; Tupikina, Liubov

    2018-01-01

    We introduce a heterogeneous continuous-time random walk (HCTRW) model as a versatile analytical formalism for studying and modeling diffusion processes in heterogeneous structures, such as porous or disordered media, multiscale or crowded environments, weighted graphs or networks. We derive the exact form of the propagator and investigate the effects of spatiotemporal heterogeneities onto the diffusive dynamics via the spectral properties of the generalized transition matrix. In particular, we show how the distribution of first-passage times changes due to local and global heterogeneities of the medium. The HCTRW formalism offers a unified mathematical language to address various diffusion-reaction problems, with numerous applications in material sciences, physics, chemistry, biology, and social sciences.
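
    As a rough illustration of the HCTRW idea (a sketch under our own assumptions, not the authors' formalism), the following simulates a walk on a graph in which each site has its own waiting-time scale, and estimates a first-passage time:

        import random

        def simulate_hctrw(adj, mean_wait, start, target, rng):
            """First-passage time of a walk with site-dependent waiting times.

            adj[i]       -- list of neighbours of site i
            mean_wait[i] -- mean of the exponential waiting time at site i
            """
            site, t = start, 0.0
            while site != target:
                t += rng.expovariate(1.0 / mean_wait[site])  # site-dependent pause
                site = rng.choice(adj[site])                 # unbiased hop
            return t

        # Toy example: a ring of 6 sites with one "deep trap" at site 3.
        adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
        mean_wait = {i: 10.0 if i == 3 else 1.0 for i in range(6)}
        fpts = [simulate_hctrw(adj, mean_wait, 0, 4, random.Random(s)) for s in range(1000)]
        print("mean first-passage time:", sum(fpts) / len(fpts))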

  8. Building an information model (with the help of PSL/PSA). [Problem Statement Language/Problem Statement Analyzer

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Farny, A. M.

    1983-01-01

    Problem Statement Language/Problem Statement Analyzer (PSL/PSA) applications, which were once a one-step process in which product system information was immediately translated into PSL statements, have in light of experience been shown to result in inconsistent representations. These shortcomings have prompted the development of an intermediate step, designated the Product System Information Model (PSIM), which provides a basis for the mutual understanding of customer terminology and the formal, conceptual representation of that product system in a PSA data base. The PSIM is initially captured as a paper diagram, followed by formal capture in the PSL/PSA data base.

  9. Mathematical model of the loan portfolio dynamics in the form of Markov chain considering the process of new customers attraction

    NASA Astrophysics Data System (ADS)

    Bozhalkina, Yana

    2017-12-01

    A mathematical model of changes in loan portfolio structure, in the form of a Markov chain, is explored. The model considers in one scheme the process of attracting new customers, their selection based on credit scores, and loan repayment. It describes the dynamics of the structure and volume of the loan portfolio, which makes it possible to produce medium-term forecasts of profitability and risk. Within the model, corrective actions by bank management aimed at increasing lending volumes or reducing risk are formalized.
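
    A minimal numerical sketch of such a scheme (the state definitions and transition probabilities below are invented for illustration, not taken from the paper) might look like:

        import numpy as np

        # States: 0 = current, 1 = overdue, 2 = defaulted, 3 = repaid.
        P = np.array([[0.90, 0.07, 0.01, 0.02],   # monthly transition probabilities
                      [0.30, 0.50, 0.15, 0.05],
                      [0.00, 0.00, 1.00, 0.00],   # default is absorbing
                      [0.00, 0.00, 0.00, 1.00]])  # repayment is absorbing

        def step(volumes, applicants, approval_rate):
            """One period: existing loans migrate; approved applicants enter as current."""
            volumes = volumes @ P
            volumes[0] += applicants * approval_rate   # credit scoring gates the inflow
            return volumes

        v = np.array([100.0, 10.0, 0.0, 0.0])          # initial volumes per state
        for _ in range(12):                            # one-year forecast
            v = step(v, applicants=20.0, approval_rate=0.6)
        print(v.round(1))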

  10. Using Petri Net Tools to Study Properties and Dynamics of Biological Systems

    PubMed Central

    Peleg, Mor; Rubin, Daniel; Altman, Russ B.

    2005-01-01

    Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
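
    To make the PN mechanics concrete, a minimal token-firing interpreter (a toy sketch, not one of the surveyed tools) can be written in a few lines:

        # A transition fires when every input place holds enough tokens.
        def enabled(marking, pre):
            return all(marking[p] >= n for p, n in pre.items())

        def fire(marking, pre, post):
            m = dict(marking)
            for p, n in pre.items():     # consume tokens from input places
                m[p] -= n
            for p, n in post.items():    # produce tokens in output places
                m[p] = m.get(p, 0) + n
            return m

        # Toy enzymatic binding E + S -> ES modeled as a single transition.
        marking = {"E": 1, "S": 3, "ES": 0}
        pre, post = {"E": 1, "S": 1}, {"ES": 1}
        while enabled(marking, pre):
            marking = fire(marking, pre, post)
        print(marking)  # {'E': 0, 'S': 2, 'ES': 1}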

  11. Integrating Research into Decision Making: Providing Examples for an Informal Action Research Model. Research Report No. 83-24.

    ERIC Educational Resources Information Center

    Losak, John; Morris, Cathy

    One promising avenue for increasing the utilization of institutional research data is the informal action research model. While formal action research stresses the involvement of researchers throughout the decision-making process, the informal model stresses participation in the later stages of decision making. Informal action research requires…

  12. 25 CFR 42.7 - What does due process in a formal disciplinary proceeding include?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    § 42.7 What does due process in a formal disciplinary proceeding include? Due process must include... rendering a disciplinary decision. (b) The school must hold a fair and impartial hearing before imposing...

  13. An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi

    NASA Astrophysics Data System (ADS)

    Deng, D.-P.; Lemmens, R.

    2011-08-01

    The Web is changing the way people share and communicate information because of the emergence of various Web technologies that enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to its different production methods, UGGC often cannot fit into formal geographic information models; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. The process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. The study applies this process to Twitter messages relevant to the Japan earthquake disaster: disaster relief information is extracted from the messages and used to develop a knowledge base for GeoSPARQL queries on disaster relief information.
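
    As a sketch of the final Deployment step, the following uses rdflib to query a tiny knowledge base of geo-tagged messages; the namespace and property names are invented for illustration, and GeoSPARQL spatial functions are omitted:

        import rdflib

        g = rdflib.Graph()
        TW = rdflib.Namespace("http://example.org/tweet#")  # hypothetical vocabulary
        g.add((TW.t1, TW.mentions, rdflib.Literal("water shortage")))
        g.add((TW.t1, TW.nearPlace, rdflib.Literal("Sendai")))

        q = """
        PREFIX tw: <http://example.org/tweet#>
        SELECT ?tweet ?need WHERE {
          ?tweet tw:mentions ?need ;
                 tw:nearPlace "Sendai" .
        }
        """
        for row in g.query(q):   # relief needs reported near the queried place
            print(row.tweet, row.need)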

  14. Influence of credit scoring on the dynamics of Markov chain

    NASA Astrophysics Data System (ADS)

    Galina, Timofeeva

    2015-11-01

    Markov processes are widely used to model the dynamics of a credit portfolio and to forecast portfolio risk and profitability. In the Markov chain model the loan portfolio is divided into several groups of different quality, determined by the presence of indebtedness and its terms. It is proposed that the dynamics of portfolio shares be described by a multistage controlled system. The article outlines a mathematical formalization of controls reflecting the actions of the bank's management to improve loan portfolio quality. The most important control is the organization of the approval procedure for loan applications. Credit scoring is studied as a control acting on the dynamic system. Different formalizations of "good" and "bad" consumers are proposed in connection with the Markov chain model.

  15. Formally verifying Ada programs which use real number types

    NASA Technical Reports Server (NTRS)

    Sutherland, David

    1986-01-01

    Formal verification is applied to programs which use real number arithmetic operations (mathematical programs). Formal verification of a program P consists of creating a mathematical model of P, stating the desired properties of P in a formal logical language, and proving that the mathematical model has the desired properties using a formal proof calculus. The development and verification of the mathematical model are discussed.

  16. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
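
    The difference-equation view can be illustrated with a one-dimensional diffusion process discretized on a grid (a generic sketch; the coefficients and boundary handling are our own choices, not the paper's notation):

        def diffuse(h, alpha, steps):
            """Explicit update h_i <- h_i + alpha*(h_{i-1} - 2*h_i + h_{i+1})."""
            for _ in range(steps):
                h = [h[0]] + [h[i] + alpha * (h[i - 1] - 2 * h[i] + h[i + 1])
                              for i in range(1, len(h) - 1)] + [h[-1]]
            return h

        surface = [0.0] * 10 + [10.0] + [0.0] * 10   # an initial mound of material
        print([round(v, 2) for v in diffuse(surface, alpha=0.25, steps=50)])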

  17. Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes

    DTIC Science & Technology

    2015-05-01

    [Abstract not recoverable; the record preserves only fragments: a three-phase, multi-year effort linking systems management to process execution, with the current phase comprising a literature review and model development (formal and ...); cites "Information Technology and Business Process Redesign," MIT Sloan Management Review, http://sloanreview.mit.edu...]

  18. Scale dependent inference in landscape genetics

    Treesearch

    Samuel A. Cushman; Erin L. Landguth

    2010-01-01

    Ecological relationships between patterns and processes are highly scale dependent. This paper reports the first formal exploration of how changing scale of research away from the scale of the processes governing gene flow affects the results of landscape genetic analysis. We used an individual-based, spatially explicit simulation model to generate patterns of genetic...

  19. An Exploration of Teachers' and Administrators' Perspectives: The Collaborative Process Using the Danielson Framework for Teaching Model

    ERIC Educational Resources Information Center

    Landolfi, Adrienne M.

    2016-01-01

    As accountability measures continue to increase within education, public school systems have integrated standards-based evaluation systems to formally assess professional practices among educators. The purpose of this study was to explore the extent in which the communication process between evaluators and teachers impacts teacher performance…

  20. Stochastic model predicts evolving preferences in the Iowa gambling task

    PubMed Central

    Fuentes, Miguel A.; Lavín, Claudio; Contreras-Huerta, L. Sebastián; Miguel, Hernan; Rosales Jubal, Eduardo

    2014-01-01

    Learning under uncertainty is a common task that people face in their daily life. This process relies on the cognitive ability to adjust behavior to environmental demands. Although the biological underpinnings of those cognitive processes have been extensively studied, there has been little work in formal models seeking to capture the fundamental dynamic of learning under uncertainty. In the present work, we aimed to understand the basic cognitive mechanisms of outcome processing involved in decisions under uncertainty and to evaluate the relevance of previous experiences in enhancing learning processes within such uncertain context. We propose a formal model that emulates the behavior of people playing a well established paradigm (Iowa Gambling Task - IGT) and compare its outcome with a behavioral experiment. We further explored whether it was possible to emulate maladaptive behavior observed in clinical samples by modifying the model parameter which controls the update of expected outcomes distributions. Results showed that the performance of the model resembles the observed participant performance as well as IGT performance by healthy subjects described in the literature. Interestingly, the model converges faster than some subjects on the decks with higher net expected outcome. Furthermore, the modified version of the model replicated the trend observed in clinical samples performing the task. We argue that the basic cognitive component underlying learning under uncertainty can be represented as a differential equation that considers the outcomes of previous decisions for guiding the agent to an adaptive strategy. PMID:25566043
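
    A heavily simplified sketch of this kind of model (our own parameterization, not the authors' equations) updates each deck's expected outcome from past rewards and chooses decks by a softmax preference; lowering the update parameter mimics the slower integration discussed for clinical samples:

        import math, random

        def play_igt(payoffs, alpha=0.2, beta=1.0, trials=100, rng=random.Random(1)):
            q = [0.0] * 4                                        # expected outcome per deck
            choices = []
            for _ in range(trials):
                weights = [math.exp(beta * v / 100.0) for v in q]  # softmax preference
                deck = rng.choices(range(4), weights=weights)[0]
                reward = payoffs[deck](rng)
                q[deck] += alpha * (reward - q[deck])            # integrate past outcomes
                choices.append(deck)
            return choices

        # Decks 0-1 are risky (large losses); decks 2-3 have higher net expected outcome.
        payoffs = [lambda r: 100 - (1250 if r.random() < 0.1 else 0),
                   lambda r: 100 - (250 if r.random() < 0.5 else 0),
                   lambda r: 50 - (50 if r.random() < 0.5 else 0),
                   lambda r: 50 - (250 if r.random() < 0.1 else 0)]
        picks = play_igt(payoffs)
        print("share of advantageous picks:", sum(c >= 2 for c in picks) / len(picks))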

  2. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    PubMed

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.

  3. Increasing patient safety and efficiency in transfusion therapy using formal process definitions.

    PubMed

    Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L

    2007-01-01

    The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes. The use of such formal definitions to prospectively identify potential error and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.

  4. eSPEM - A SPEM Extension for Enactable Behavior Modeling

    NASA Astrophysics Data System (ADS)

    Ellner, Ralf; Al-Hilank, Samir; Drexler, Johannes; Jung, Martin; Kips, Detlef; Philippsen, Michael

    OMG's SPEM - by means of its (semi-)formal notation - allows for a detailed description of development processes and methodologies, but can only be used for a rather coarse description of their behavior. Concepts for a more fine-grained behavior model are considered out of scope of the SPEM standard and have to be provided by other standards like BPDM/BPMN or UML. However, a coarse granularity of the behavior model often impedes a computer-aided enactment of a process model. Therefore, in this paper we present eSPEM, an extension of SPEM, that is based on the UML meta-model and focused on fine-grained behavior and life-cycle modeling and thereby supports automated enactment of development processes.

  5. Elicitation of quantitative data from a heterogeneous expert panel: formal process and application in animal health.

    PubMed

    Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S

    2002-02-01

    This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. The Classical model was used to weight the experts' assessments in order to construct a single distribution per variable; in this model, an expert's quality is typically based on performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol, in combination with the proposed elicitation and analysis techniques, resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
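
    The weighting idea can be sketched numerically: expert densities are linearly pooled with performance-based weights (the densities and seed-variable scores below are fabricated for illustration and do not reproduce the Classical model's scoring rule in detail):

        import numpy as np

        x = np.linspace(0.0, 10.0, 501)                    # support of the variable
        dx = x[1] - x[0]

        def normal_pdf(x, mu, sigma):
            return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

        experts = [normal_pdf(x, 4.0, 1.0),                # expert A's density
                   normal_pdf(x, 6.0, 2.0),                # expert B's density
                   normal_pdf(x, 5.0, 0.5)]                # expert C's density
        calibration = np.array([0.60, 0.10, 0.30])         # fabricated seed-variable scores
        weights = calibration / calibration.sum()

        pooled = sum(w * f for w, f in zip(weights, experts))
        print("integral of pooled density:", round(float((pooled * dx).sum()), 3))  # close to 1.0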

  6. Purposeful Design of Formal Laboratory Instruction as a Springboard to Research Participation

    ERIC Educational Resources Information Center

    Cartrette, David P.; Miller, Matthew L.

    2013-01-01

    An innovative first- and second-year laboratory course sequence is described. The goal of the instructional model is to introduce chemistry and biochemistry majors to the process of research participation earlier in their academic training. To achieve that goal, the instructional model incorporates significant hands-on experiences with chemical…

  7. Plasticity of Grammatical Recursion in German Learners of Dutch

    ERIC Educational Resources Information Center

    Davidson, Douglas J.; Indefrey, Peter

    2009-01-01

    Previous studies have examined cross-serial and embedded complement clauses in West Germanic in order to distinguish between different types of working memory models of human sentence processing, as well as different formal language models. Here, adult plasticity in the use of these constructions is investigated by examining the response of…

  8. Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Yu, Q.; Helmholz, P.; Belton, D.; West, G.

    2014-04-01

    The automatic reconstruction of 3D buildings has been an important research topic in recent years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on a pre-defined formal grammar and rules. Such segmented data can be extracted, e.g., from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format, a CAD file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.

  9. Formal Darwinism, the individual-as-maximizing-agent analogy and bet-hedging

    PubMed Central

    Grafen, A.

    1999-01-01

    The central argument of The origin of species was that mechanical processes (inheritance of features and the differential reproduction they cause) can give rise to the appearance of design. The 'mechanical processes' are now mathematically represented by the dynamic systems of population genetics, and the appearance of design by optimization and game theory in which the individual plays the part of the maximizing agent. Establishing a precise individual-as-maximizing-agent (IMA) analogy for a population-genetics system justifies optimization approaches, and so provides a modern formal representation of the core of Darwinism. It is a hitherto unnoticed implication of recent population-genetics models that, contrary to a decades-long consensus, an IMA analogy can be found in models with stochastic environments (subject to a convexity assumption), in which individuals maximize expected reproductive value. The key is that the total reproductive value of a species must be considered as constant, so therefore reproductive value should always be calculated in relative terms. This result removes a major obstacle from the theoretical challenge to find a unifying framework which establishes the IMA analogy for all of Darwinian biology, including as special cases inclusive fitness, evolutionarily stable strategies, evolutionary life-history theory, age-structured models and sex ratio theory. This would provide a formal, mathematical justification of fruitful and widespread but 'intentional' terms in evolutionary biology, such as 'selfish', 'altruism' and 'conflict'.

  10. Theory Creation, Modification, and Testing: An Information-Processing Model and Theory of the Anticipated and Unanticipated Consequences of Research and Development

    ERIC Educational Resources Information Center

    Perla, Rocco J.; Carifio, James

    2011-01-01

    Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporates and formalizes both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…

  11. Integrated control system for electron beam processes

    NASA Astrophysics Data System (ADS)

    Koleva, L.; Koleva, E.; Batchkova, I.; Mladenov, G.

    2018-03-01

    The ISO/IEC 62264 standard is widely used for integration of the business systems of a manufacturer with the corresponding manufacturing control systems based on hierarchical equipment models, functional data and manufacturing operations activity models. In order to achieve the integration of control systems, formal object communication models must be developed, together with manufacturing operations activity models, which coordinate the integration between different levels of control. In this article, the development of integrated control system for electron beam welding process is presented as part of a fully integrated control system of an electron beam plant, including also other additional processes: surface modification, electron beam evaporation, selective melting and electron beam diagnostics.

  12. The pressure recovery ratio: The invasive index of LV relaxation during filling. Model-based prediction with in-vivo validation.

    PubMed

    Zhang, Wei; Shmuylovich, Leonid; Kovacs, Sandor J

    2009-01-01

    Using a simple harmonic oscillator model (PDF formalism), every early filling E-wave can be uniquely described by a set of parameters, (x(0), c, and k). Parameter c in the PDF formalism is a damping or relaxation parameter that measures the energy loss during the filling process. Based on Bernoulli's equation and kinematic modeling, we derived a causal correlation between the relaxation parameter c in the PDF formalism and a feature of the pressure contour during filling - the pressure recovery ratio defined by the left ventricular pressure difference between diastasis and minimum pressure, normalized to the pressure difference between a fiducial pressure and minimum pressure [PRR = (P(Diastasis)-P(Min))/(P(Fiducial)-P(Min))]. We analyzed multiple heart beats from one human subject to validate the correlation. Further validation among more patients is warranted. PRR is the invasive causal analogue of the noninvasive E-wave relaxation parameter c. PRR has the potential to be calculated using automated methodology in the catheterization lab in real time.
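
    For orientation, the PDF formalism models the E-wave as a damped harmonic oscillator; a sketch of the relations implied by the abstract (with the oscillator equation stated in its usual unit-mass form, an assumption on our part) is:

        % E-wave kinematics: damped oscillator with initial displacement x_0,
        % damping (relaxation) constant c and stiffness k.
        \ddot{x}(t) + c\,\dot{x}(t) + k\,x(t) = 0, \qquad x(0) = x_0, \quad \dot{x}(0) = 0
        % Pressure recovery ratio as defined in the abstract:
        \mathrm{PRR} = \frac{P_{\mathrm{Diastasis}} - P_{\mathrm{Min}}}{P_{\mathrm{Fiducial}} - P_{\mathrm{Min}}}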

  13. Don't abandon hope all ye who enter here: The protective role of formal mentoring and learning processes on burnout in correctional officers.

    PubMed

    Farnese, M L; Barbieri, B; Bellò, B; Bartone, P T

    2017-01-01

    Within a Job Demands-Resources Model framework, formal mentoring can be conceived as a job resource expressing the organization's support for new members, which may prevent their being at risk for burnout. This research aims at understanding the protective role of formal mentoring on burnout, through the effect of increasing learning personal resources. Specifically, we hypothesized that formal mentoring enhances newcomers' learning about job and social domains related to the new work context, thus leading to lower burnout. In order to test the hypotheses, a multiple regression analysis using the bootstrapping method was used. Based on a questionnaire administered to 117 correctional officer newcomers who had a formal mentor assigned, our results confirm that formal mentoring exerts a positive influence on newcomers' adjustment, and that this in turn exerts a protective influence against burnout onset by reducing cynicism and interpersonal stress and also enhancing the sense of personal accomplishment. Confirming previous literature's suggestions, supportive mentoring and effective socialization seem to represent job and personal resources that are protective against burnout. This study provides empirical support for this relation in the prison context.

  14. A Semantic Approach for Geospatial Information Extraction from Unstructured Documents

    NASA Astrophysics Data System (ADS)

    Sallaberry, Christian; Gaio, Mauro; Lesbegueries, Julien; Loustau, Pierre

    Local cultural heritage document collections are characterized by their content, which is strongly attached to a territory and its land history (i.e., geographical references). Our contribution aims at making the content retrieval process more efficient whenever a query includes geographic criteria. We propose a core model for a formal representation of geographic information. It takes into account characteristics of different modes of expression, such as written language, captures of drawings, maps, photographs, etc. We have developed a prototype that fully implements geographic information extraction (IE) and geographic information retrieval (IR) processes. All PIV prototype processing resources are designed as Web Services. We propose a geographic IE process based on semantic treatment as a supplement to classical IE approaches. We implement geographic IR by using intersection computing algorithms that seek out any intersection between formal geocoded representations of geographic information in a user query and similar representations in document collection indexes.
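
    The intersection computation described above can be sketched with shapely (our choice of library; the PIV prototype's actual geocoded representations and Web Services are not shown):

        from shapely.geometry import Polygon

        query_footprint = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])      # area in the query
        doc_footprints = {
            "doc-12": Polygon([(2, 1), (6, 1), (6, 4), (2, 4)]),         # overlaps the query
            "doc-37": Polygon([(10, 10), (12, 10), (12, 12), (10, 12)])  # disjoint
        }

        for doc_id, footprint in doc_footprints.items():
            if query_footprint.intersects(footprint):
                overlap = query_footprint.intersection(footprint).area
                print(doc_id, "matches; overlapping area =", overlap)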

  15. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. 18 CFR 5.14 - Formal study dispute resolution process.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    § 5.14 Formal study dispute resolution process. (a) Within 20 days of the Study Plan Determination... under section 401 of the Clean Water Act, 33 U.S.C. 1341, may file a notice of study dispute with...

  17. Simulating Technology Processes to Foster Learning.

    ERIC Educational Resources Information Center

    Krumholtz, Nira

    1998-01-01

    Based on a spiral model of technology evolution, elementary students used LOGO computer software to become both developers and users of technology. The computerized environment enabled 87% to reach intuitive understanding of physical concepts; 24% expressed more formal scientific understanding. (SK)

  18. Inductive Reasoning about Causally Transmitted Properties

    ERIC Educational Resources Information Center

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D.; Tenenbaum, Joshua B.

    2008-01-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates'…

  19. A Harris-Todaro Agent-Based Model to Rural-Urban Migration

    NASA Astrophysics Data System (ADS)

    Espíndola, Aquino L.; Silveira, Jaylson J.; Penna, T. J. P.

    2006-09-01

    The Harris-Todaro model of the rural-urban migration process is revisited under an agent-based approach. The migration of the workers is interpreted as a process of social learning by imitation, formalized by a computational model. By simulating this model, we observe a transitional dynamics with continuous growth of the urban fraction of overall population toward an equilibrium. Such an equilibrium is characterized by stabilization of rural-urban expected wages differential (generalized Harris-Todaro equilibrium condition), urban concentration and urban unemployment. These classic results obtained originally by Harris and Todaro are emergent properties of our model.
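
    A toy sketch of the imitation dynamic (the population size, wages, and job stock are invented; this is not the authors' code) shows the expected-wage equalization at equilibrium:

        import random

        rng = random.Random(42)
        N, urban = 1000, 200                       # population size, initial urban workers
        w_rural, w_urban, urban_jobs = 1.0, 1.8, 400

        for step in range(200):
            for _ in range(N // 10):               # a few wage comparisons per step
                # Harris-Todaro expected urban wage: wage times employment probability.
                expected_urban = w_urban * min(1.0, urban_jobs / urban)
                if rng.random() < 0.5:             # a rural worker reconsiders
                    if expected_urban > w_rural and urban < N:
                        urban += 1
                else:                              # an urban worker reconsiders
                    if expected_urban < w_rural and urban > 1:
                        urban -= 1

        # At rest the expected-wage differential vanishes: w_urban * jobs / urban ~ w_rural.
        print("urban share:", urban / N,
              "expected urban wage:", round(w_urban * min(1.0, urban_jobs / urban), 3))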

  20. Expert judgement and uncertainty quantification for climate change

    NASA Astrophysics Data System (ADS)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.

  1. Petri net based model of the body iron homeostasis.

    PubMed

    Formanowicz, Dorota; Sackmann, Andrea; Formanowicz, Piotr; Błazewicz, Jacek

    2007-10-01

    Body iron homeostasis is a complex and not fully understood process. Although some components of this process have been described in the literature, a complete model of the whole process has not been proposed. In this paper a Petri net based model of body iron homeostasis is presented. Petri nets have recently been used for describing and analyzing various biological processes, since they allow the system under consideration to be modeled very precisely. The main result presented in the paper is twofold: an informal description of the main part of the whole iron homeostasis process is given, and this description is then formulated in the formal language of Petri net theory. The model makes simulation of the process possible, since Petri net theory provides many established analysis techniques.

  2. Priority in Process Algebras

    NASA Technical Reports Server (NTRS)

    Cleaveland, Rance; Luettgen, Gerald; Natarajan, V.

    1999-01-01

    This paper surveys the semantic ramifications of extending traditional process algebras with notions of priority that allow for some transitions to be given precedence over others. These enriched formalisms allow one to model system features such as interrupts, prioritized choice, or real-time behavior. Approaches to priority in process algebras can be classified according to whether the induced notion of preemption on transitions is global or local and whether priorities are static or dynamic. Early work in the area concentrated on global pre-emption and static priorities and led to formalisms for modeling interrupts and aspects of real-time, such as maximal progress, in centralized computing environments. More recent research has investigated localized notions of pre-emption in which the distribution of systems is taken into account, as well as dynamic priority approaches, i.e., those where priority values may change as systems evolve. The latter allows one to model behavioral phenomena such as scheduling algorithms and also enables the efficient encoding of real-time semantics. Technically, this paper studies the different models of priorities by presenting extensions of Milner's Calculus of Communicating Systems (CCS) with static and dynamic priority as well as with notions of global and local pre- emption. In each case the operational semantics of CCS is modified appropriately, behavioral theories based on strong and weak bisimulation are given, and related approaches for different process-algebraic settings are discussed.

  3. Algorithms in the historical emergence of word senses.

    PubMed

    Ramiro, Christian; Srinivasan, Mahesh; Malt, Barbara C; Xu, Yang

    2018-03-06

    Human language relies on a finite lexicon to express a potentially infinite set of ideas. A key result of this tension is that words acquire novel senses over time. However, the cognitive processes that underlie the historical emergence of new word senses are poorly understood. Here, we present a computational framework that formalizes competing views of how new senses of a word might emerge by attaching to existing senses of the word. We test the ability of the models to predict the temporal order in which the senses of individual words have emerged, using an historical lexicon of English spanning the past millennium. Our findings suggest that word senses emerge in predictable ways, following an historical path that reflects cognitive efficiency, predominantly through a process of nearest-neighbor chaining. Our work contributes a formal account of the generative processes that underlie lexical evolution.
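
    Nearest-neighbor chaining can be sketched directly: each new sense attaches to the closest already-emerged sense (the 2-D "semantic coordinates" below are fabricated for illustration):

        def chain_order(coords, first):
            """Order senses by nearest-neighbour chaining from an initial sense."""
            def d2(a, b):
                return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
            emerged = [first]
            remaining = set(coords) - {first}
            while remaining:
                # Next sense: the one closest to any sense that has already emerged.
                nxt = min(remaining,
                          key=lambda s: min(d2(coords[s], coords[e]) for e in emerged))
                emerged.append(nxt)
                remaining.remove(nxt)
            return emerged

        coords = {"face (body part)": (0, 0), "face (of a clock)": (1, 0),
                  "face (surface)": (2, 1), "face (to confront)": (5, 5)}
        print(chain_order(coords, "face (body part)"))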

  4. Reaching beyond the review of research evidence: a qualitative study of decision making during the development of clinical practice guidelines for disease prevention in healthcare.

    PubMed

    Richter Sundberg, Linda; Garvare, Rickard; Nyström, Monica Elisabeth

    2017-05-11

    The judgment and decision making process during guideline development is central for producing high-quality clinical practice guidelines, but the topic is relatively underexplored in the guideline research literature. We have studied the development process of national guidelines with a disease-prevention scope produced by the National Board of Health and Welfare (NBHW) in Sweden. The NBHW formal guideline development model states that guideline recommendations should be based on five decision criteria: research evidence; curative/preventive effect size; severity of the condition; cost-effectiveness; and ethical considerations. A group of health profession representatives (i.e. a prioritization group) was assigned the task of ranking condition-intervention pairs for guideline recommendations, taking into consideration the multiple decision criteria. The aim of this study was to investigate the decision making process during the two-year development of national guidelines for methods of preventing disease. A qualitative inductive longitudinal case study approach was used to investigate the decision making process. Questionnaires, non-participant observations of nine two-day group meetings, and documents provided data for the analysis. Conventional and summative qualitative content analysis was used to analyse data. The guideline development model was modified ad-hoc as the group encountered three main types of dilemmas: high quality evidence vs. low adoptability of recommendation; insufficient evidence vs. high urgency to act; and incoherence in assessment and prioritization within and between four different lifestyle areas. The formal guideline development model guided the decision criteria used, but three new or revised criteria were added by the group: 'clinical knowledge and experience', 'potential guideline consequences' and 'needs of vulnerable groups'. The frequency of the use of various criteria in discussions varied over time. Gender, professional status, and interpersonal skills were perceived to affect individuals' relative influence on group discussions. The study shows that guideline development groups make compromises between rigour and pragmatism. The formal guideline development model incorporated multiple aspects, but offered few details on how the different criteria should be handled. The guideline development model devoted little attention to the role of the decision model and group-related factors. Guideline development models could benefit from clarifying the role of the group-related factors and non-research evidence, such as clinical experience and ethical considerations, in decision processes during guideline development.

  5. Conservation Process Model (cpm): a Twofold Scientific Research Scope in the Information Modelling for Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Fiorani, D.; Acierno, M.

    2017-05-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. Applications developed so far have adapted BIM poorly to conservation design, with unsatisfactory results, because of the unicity and, more generally, the complexity of ancient buildings (Dore, Murphy 2013; Carrara 2014). To combine the achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible representation of heritage knowledge, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of an object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation as already formalized by the model. A special focus is placed on decay analysis and surface conservation projects.

  6. Mathematics and Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, Gerald

    1979-01-01

    Examines the main mathematical approaches to information retrieval, including both algebraic and probabilistic models, and describes difficulties which impede formalization of information retrieval processes. A number of developments are covered where new theoretical understandings have directly led to improved retrieval techniques and operations.…

  7. Phonon-Assisted Optical Absorption in Silicon from First Principles

    NASA Astrophysics Data System (ADS)

    Noffsinger, Jesse; Kioupakis, Emmanouil; Van de Walle, Chris G.; Louie, Steven G.; Cohen, Marvin L.

    2012-04-01

    The phonon-assisted interband optical absorption spectrum of silicon is calculated at the quasiparticle level entirely from first principles. We make use of the Wannier interpolation formalism to determine the quasiparticle energies, as well as the optical transition and electron-phonon coupling matrix elements, on fine grids in the Brillouin zone. The calculated spectrum near the onset of indirect absorption is in very good agreement with experimental measurements for a range of temperatures. Moreover, our method can accurately determine the optical absorption spectrum of silicon in the visible range, an important process for optoelectronic and photovoltaic applications that cannot be addressed with simple models. The computational formalism is quite general and can be used to understand the phonon-assisted absorption processes in general.

  8. Using the Unified Modelling Language (UML) to guide the systemic description of biological processes and systems.

    PubMed

    Roux-Rouquié, Magali; Caritey, Nicolas; Gaubert, Laurent; Rosenthal-Sabroux, Camille

    2004-07-01

    One of the main issues in Systems Biology is to deal with semantic data integration. Previously, we examined the requirements for a reference conceptual model to guide semantic integration based on the systemic principles. In the present paper, we examine the usefulness of the Unified Modelling Language (UML) to describe and specify biological systems and processes. This makes unambiguous representations of biological systems, which would be suitable for translation into mathematical and computational formalisms, enabling analysis, simulation and prediction of these systems behaviours.

  9. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.

  10. Lynx conservation in an ecosystem management context [Chapter 15

    Treesearch

    Kevin S. McKelvey; Keith B. Aubry; James K. Agee; Steven W. Buskirk; Leonard F. Ruggiero; Gary M. Koehler

    2000-01-01

    In an ecosystem management context, management for lynx must occur in the context of the needs of other species, watershed health, and a variety of products, outputs, and uses. This chapter presents a management model based on the restoration of historical patterns and processes. We argue that this model is sustainable in a formal sense, practical, and likely...

  11. From Status to Power: New Models at the Intersection of Two Theories

    ERIC Educational Resources Information Center

    Thye, Shane R.; Willer, David; Markovsky, Barry

    2006-01-01

    The study of group processes has benefited from longstanding programs of theory-driven research on status and power. The present work constructs a bridge between two formal theories of status and power: Status Characteristics Theory and Network Exchange Theory. Two theoretical models, one for "status value" and one for "status influence,"…

  12. The Source of Adult Age Differences in Event-Based Prospective Memory: A Multinomial Modeling Approach

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.

    2006-01-01

    Event-based prospective memory involves remembering to perform an action in response to a particular future event. Normal younger and older adults performed event-based prospective memory tasks in 2 experiments. The authors applied a formal multinomial processing tree model of prospective memory (Smith & Bayen, 2004) to disentangle age differences…

  13. Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering

    NASA Technical Reports Server (NTRS)

    Bolton, Matthew L.; Bass, Ellen J.

    2009-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.

  14. Process Definition and Modeling Guidebook. Version 01.00.02

    DTIC Science & Technology

    1992-12-01

    [Abstract preserved only in fragments: in this guidebook, process definition is considered to be the act of representing the important characteristics of a process... characterized by software standards and guidelines, software inspections and reviews, and more formalized testing (including test plans, test support tools...); a paper-based approach works well for training, examples, and possibly even small pilot projects and case studies; however, large projects will benefit from...]

  15. A Jury of Their Peers: A Meta-Analysis of the Effects of Teen Court on Criminal Recidivism.

    PubMed

    Bouchard, Jessica; Wong, Jennifer S

    2017-07-01

    Juvenile delinquency has been on the decline for a number of years, yet, juvenile courts continue to assess more than 1 million cases per year. Involvement with the juvenile justice system has been linked to a number of risk factors and consequences that may impact positive youth development; however, evidence-based correctional programs that divert juvenile offenders away from formal processing are limited. Teen Court is a specialized diversion intervention that offers an alternative to traditional court processing for juvenile offenders. Despite the rapid expansion of Teen Courts, there is little comprehensive and systematic evidence available to justify this expansion. This meta-analytic study examines the effects of Teen Court on the recidivism of juvenile offenders. The literature search resulted in the selection of 14 studies, which contributed 18 unique effect sizes with a total sample of 2125 treatment group and 979 comparison group youth. The findings suggest that Teen Court is no more effective at reducing recidivism than (a) formal processing or (b) other diversion programs. Implications of formal and informal court processing for low-risk, first-time young offenders are discussed. The authors draw on the Risk-Need-Responsivity model to provide recommendations for policies and practices.

  16. Determination of thermophysical characteristics of vulcanizable rubber products by the mathematical modeling method

    NASA Astrophysics Data System (ADS)

    Tikhomirov, S. G.; Pyatakov, Y. V.; Karmanova, O. V.; Maslov, A. A.

    2018-03-01

    The studies of the vulcanization kinetics of elastomers were carried out using a truck tyre tread rubber compound. A formal kinetic scheme for the vulcanization of rubbers with a sulfur-accelerator curing system was used, which generalizes the set of reactions occurring in the curing process. A mathematical model is developed for determining the thermophysical parameters of the vulcanizable mixture, comprising algorithms for solving the direct and inverse problems for the system of equations of heat conduction and curing kinetics. The performance of the model is confirmed by the results of numerical experiments on model examples.
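
    A minimal sketch of the direct problem under stated assumptions: one-dimensional heat conduction in a rubber slab coupled to first-order cure kinetics with an Arrhenius rate constant, stepped with explicit finite differences. All material parameters, the slab geometry, and the first-order kinetics are illustrative placeholders, not the paper's generalized kinetic scheme.

        import numpy as np

        # Direct problem sketch: 1-D heat conduction through a rubber slab
        # coupled to a first-order cure reaction with an Arrhenius rate.
        # All values are illustrative.
        L, nx, dt, steps = 0.02, 41, 0.01, 20000      # slab thickness (m), grid, time
        alpha_t = 1.2e-7                              # thermal diffusivity (m^2/s)
        A, Ea, R = 5.0e7, 8.0e4, 8.314                # Arrhenius prefactor, J/mol
        dH_cp = 25.0                                  # adiabatic temperature rise (K)

        dx = L / (nx - 1)
        T = np.full(nx, 300.0); T[0] = T[-1] = 430.0  # mould walls held hot
        x = np.zeros(nx)                              # degree of cure in [0, 1]

        for _ in range(steps):
            k = A * np.exp(-Ea / (R * T))             # local rate constant
            dxdt = k * (1.0 - x)                      # first-order kinetics
            lap = np.zeros(nx)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
            T = T + dt * (alpha_t * lap + dH_cp * dxdt)
            T[0] = T[-1] = 430.0
            x = np.clip(x + dt * dxdt, 0.0, 1.0)

        print("centre temperature %.1f K, centre cure %.2f" % (T[nx // 2], x[nx // 2]))

    Solving the inverse problem would wrap a loop like this in an optimizer that adjusts the thermophysical parameters until the simulated temperature history matches measured data.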

  17. Dual Rationality and Deliberative Agents

    NASA Astrophysics Data System (ADS)

    Debenham, John; Sierra, Carles

    Human agents deliberate using models based on reason for only a minute proportion of the decisions that they make. In stark contrast, the deliberation of artificial agents is heavily dominated by formal models based on reason such as game theory, decision theory and logic, despite the fact that formal reasoning will not necessarily lead to superior real-world decisions. Further, the Nobel Laureate Friedrich Hayek warns us of the ‘fatal conceit’ in controlling deliberative systems using models based on reason, as the particular model chosen will then shape the system’s future and either impede, or eventually destroy, the subtle evolutionary processes that are an integral part of human systems and institutions, and are crucial to their evolution and long-term survival. We describe an architecture for artificial agents that is founded on Hayek’s two rationalities and supports the two forms of deliberation used by mankind.

  18. Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment

    NASA Astrophysics Data System (ADS)

    Zeigler, Bernard P.; Lee, J. S.

    1998-08-01

    In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
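
    The quantization idea lends itself to a compact sketch: integrate the continuous state numerically, but emit a state update only when the state crosses a quantum-level boundary. The integrator, quantum size, and test signal below are illustrative choices, not the DEVS/HLA implementation.

        import math

        # Quantization sketch: a sender integrates a continuous state but
        # transmits an update only when the state crosses a quantum-level
        # boundary, as in the quantization scheme described above.
        def quantized_updates(f, x0, t_end, dt=1e-3, quantum=0.1):
            x, level, updates = x0, math.floor(x0 / quantum), []
            t = 0.0
            while t < t_end:
                x += f(t, x) * dt                 # Euler integration of dx/dt = f
                t += dt
                new_level = math.floor(x / quantum)
                if new_level != level:            # boundary crossing -> send update
                    level = new_level
                    updates.append((round(t, 3), level * quantum))
            return updates

        msgs = quantized_updates(lambda t, x: math.cos(t), x0=0.0, t_end=6.28)
        print(len(msgs), "updates instead of", int(6.28 / 1e-3), "time steps")

    For this cosine test signal, a quantum of 0.1 replaces thousands of per-step messages with a few dozen boundary-crossing updates, which is the message-traffic reduction discussed above.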

  19. Idea Paper: The Lifecycle of Software for Scientific Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, Anshu; McInnes, Lois C.

    The software lifecycle is a well researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.

  20. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

    integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based...Domain specific languages (DSLs) drive both implementation and formal verification

  1. Explaining neural signals in human visual cortex with an associative learning model.

    PubMed

    Jiang, Jiefeng; Schmajuk, Nestor; Egner, Tobias

    2012-08-01

    "Predictive coding" models posit a key role for associative learning in visual cognition, viewing perceptual inference as a process of matching (learned) top-down predictions (or expectations) against bottom-up sensory evidence. At the neural level, these models propose that each region along the visual processing hierarchy entails one set of processing units encoding predictions of bottom-up input, and another set computing mismatches (prediction error or surprise) between predictions and evidence. This contrasts with traditional views of visual neurons operating purely as bottom-up feature detectors. In support of the predictive coding hypothesis, a recent human neuroimaging study (Egner, Monti, & Summerfield, 2010) showed that neural population responses to expected and unexpected face and house stimuli in the "fusiform face area" (FFA) could be well-described as a summation of hypothetical face-expectation and -surprise signals, but not by feature detector responses. Here, we used computer simulations to test whether these imaging data could be formally explained within the broader framework of a mathematical neural network model of associative learning (Schmajuk, Gray, & Lam, 1996). Results show that FFA responses could be fit very closely by model variables coding for conditional predictions (and their violations) of stimuli that unconditionally activate the FFA. These data document that neural population signals in the ventral visual stream that deviate from classic feature detection responses can formally be explained by associative prediction and surprise signals.

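    A simplified stand-in for the associative model makes the decomposition concrete: a Rescorla-Wagner-style learner forms a prediction V that a cue is followed by a face, and "surprise" is the absolute prediction error. The paper's model (Schmajuk, Gray, & Lam, 1996) is considerably richer, but the idea of regressing neural responses on prediction and surprise signals is the same in spirit; all numbers below are invented.

        import numpy as np

        # Rescorla-Wagner stand-in: V predicts that the cue is followed by a
        # face; the surprise regressor is the absolute prediction error.
        rng = np.random.default_rng(0)
        lr, V = 0.15, 0.0
        trials = rng.random(200) < 0.75          # face appears on 75% of trials
        prediction, surprise = [], []
        for face in trials:
            prediction.append(V)
            delta = float(face) - V              # prediction error
            surprise.append(abs(delta))
            V += lr * delta                      # associative update

        # Hypothetical "FFA response" = weighted sum of expectation and surprise,
        # analogous to the summation description in the abstract above.
        ffa = 0.6 * np.array(prediction) + 0.4 * np.array(surprise)
        print("late-trial mean prediction: %.2f" % np.mean(prediction[-50:]))
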
  2. Protection - Principles and practice.

    NASA Technical Reports Server (NTRS)

    Graham, G. S.; Denning, P. J.

    1972-01-01

    The protection mechanisms of computer systems control the access to objects, especially information objects. The principles of protection system design are formalized as a model (theory) of protection. Each process has a unique identification number which is attached by the system to each access attempted by the process. Details of system implementation are discussed, taking into account the storing of the access matrix, aspects of efficiency, and the selection of subjects and objects. Two systems which have protection features incorporating all the elements of the model are described.
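
    The access-matrix idea described here fits in a few lines: rights are looked up by (subject, object) pair, with the subject's identification number attached to every attempted access. The subjects, objects, and rights below are invented examples, and a sparse dictionary stands in for the matrix storage discussed in the paper.

        # Minimal access-matrix sketch: rows are subjects (processes), columns
        # are objects, cells hold permitted rights.
        access_matrix = {
            ("proc_1", "file_a"): {"read", "write"},
            ("proc_1", "file_b"): {"read"},
            ("proc_2", "file_b"): {"read", "write", "own"},
        }

        def attempt(subject_id, obj, right):
            """The system attaches the subject's ID to every attempted access
            and checks it against the matrix before allowing the operation."""
            allowed = right in access_matrix.get((subject_id, obj), set())
            print(f"{subject_id} {right} {obj}: {'granted' if allowed else 'denied'}")
            return allowed

        attempt("proc_1", "file_a", "write")   # granted
        attempt("proc_1", "file_b", "write")   # denied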

  3. A Bayesian formulation of behavioral control.

    PubMed

    Huys, Quentin J M; Dayan, Peter

    2009-12-01

    Helplessness, a belief that the world is not subject to behavioral control, has long been central to our understanding of depression, and has influenced cognitive theories, animal models and behavioral treatments. However, despite its importance, there is no fully accepted definition of helplessness or behavioral control in psychology or psychiatry, and the formal treatments in engineering appear to capture only limited aspects of the intuitive concepts. Here, we formalize controllability in terms of characteristics of prior distributions over affectively charged environments. We explore the relevance of this notion of control to reinforcement learning methods of optimising behavior in such environments and consider how apparently maladaptive beliefs can result from normative inference processes. These results are discussed with reference to depression and animal models thereof.
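
    One way to make the idea concrete is a toy Beta-Bernoulli version: controllability is treated as a property of the prior over action-outcome contingency, and a pessimistic ("helpless") prior can dominate a modest amount of contrary evidence. This is a heavily simplified stand-in for the paper's formulation, with invented numbers.

        import numpy as np

        # Toy version of control as a property of priors: the agent holds a
        # Beta(a, b) belief over the probability that its actions control
        # outcomes. A "helpless" prior concentrates mass near zero contingency.
        def update_control_belief(a, b, n_trials, true_contingency, rng):
            for _ in range(n_trials):
                controlled = rng.random() < true_contingency
                a, b = (a + 1, b) if controlled else (a, b + 1)
            return a, b

        rng = np.random.default_rng(1)
        for name, (a0, b0) in {"flat prior": (1, 1), "helpless prior": (1, 9)}.items():
            a, b = update_control_belief(a0, b0, 20, true_contingency=0.7, rng=rng)
            print(f"{name}: posterior mean control belief = {a / (a + b):.2f}")

    Even after 20 trials in a genuinely controllable environment, the "helpless" prior keeps the posterior belief well below the true contingency, illustrating how apparently maladaptive beliefs can arise from normative inference.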

  4. Implementation of the nudged elastic band method in a dislocation dynamics formalism: Application to dislocation nucleation

    NASA Astrophysics Data System (ADS)

    Geslin, Pierre-Antoine; Gatti, Riccardo; Devincre, Benoit; Rodney, David

    2017-11-01

    We propose a framework to study thermally-activated processes in dislocation glide. This approach is based on an implementation of the nudged elastic band method in a nodal mesoscale dislocation dynamics formalism. Special care is paid to develop a variational formulation to ensure convergence to well-defined minimum energy paths. We also propose a methodology to rigorously parametrize the model on atomistic data, including elastic, core and stacking fault contributions. To assess the validity of the model, we investigate the homogeneous nucleation of partial dislocation loops in aluminum, recovering the activation energies and loop shapes obtained with atomistic calculations and extending these calculations to lower applied stresses. The present method is also applied to heterogeneous nucleation on spherical inclusions.
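
    The nudged elastic band mechanics can be illustrated on a two-dimensional toy potential (this is the generic NEB algorithm only, not the paper's dislocation dynamics energy functional): interior images feel the perpendicular component of the true force plus a spring force along the local tangent, and relax onto the minimum energy path.

        import numpy as np

        # Minimal NEB on a double-well potential: minima at (+/-1, 0),
        # saddle at (0, 0) with height 1.
        def V(p):
            x, y = p
            return (x**2 - 1.0)**2 + 2.0 * y**2

        def gradV(p):
            x, y = p
            return np.array([4.0 * x * (x**2 - 1.0), 4.0 * y])

        n_img, k_spring, step = 11, 5.0, 0.01
        band = np.linspace([-1.0, 0.0], [1.0, 0.0], n_img)  # endpoints at minima
        band[1:-1, 1] += 0.5                                # bow the initial guess

        for _ in range(2000):
            for i in range(1, n_img - 1):
                tau = band[i + 1] - band[i - 1]
                tau /= np.linalg.norm(tau)                  # local tangent
                g = gradV(band[i])
                f_perp = -(g - np.dot(g, tau) * tau)        # true force, perp. part
                spring = (band[i + 1] - band[i]) - (band[i] - band[i - 1])
                band[i] += step * (f_perp + k_spring * np.dot(spring, tau) * tau)

        barrier = max(V(p) for p in band) - V(band[0])
        print("estimated barrier: %.3f (exact saddle height: 1.000)" % barrier)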

  5. Modeling Structure-Function Relationships in Synthetic DNA Sequences using Attribute Grammars

    PubMed Central

    Cai, Yizhi; Lux, Matthew W.; Adam, Laura; Peccoud, Jean

    2009-01-01

    Recognizing that certain biological functions can be associated with specific DNA sequences has led various fields of biology to adopt the notion of the genetic part. This concept provides a finer level of granularity than the traditional notion of the gene. However, a method of formally relating how a set of parts relates to a function has not yet emerged. Synthetic biology both demands such a formalism and provides an ideal setting for testing hypotheses about relationships between DNA sequences and phenotypes beyond the gene-centric methods used in genetics. Attribute grammars are used in computer science to translate the text of a program source code into the computational operations it represents. By associating attributes with parts, modifying the value of these attributes using rules that describe the structure of DNA sequences, and using a multi-pass compilation process, it is possible to translate DNA sequences into molecular interaction network models. These capabilities are illustrated by simple example grammars expressing how gene expression rates are dependent upon single or multiple parts. The translation process is validated by systematically generating, translating, and simulating the phenotype of all the sequences in the design space generated by a small library of genetic parts. Attribute grammars represent a flexible framework connecting parts with models of biological function. They will be instrumental for building mathematical models of libraries of genetic constructs synthesized to characterize the function of genetic parts. This formalism is also expected to provide a solid foundation for the development of computer assisted design applications for synthetic biology. PMID:19816554
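
    A toy rendition of the attribute-grammar idea, with invented parts and numbers: each genetic part carries attributes, and a synthesis pass over the part sequence computes a model-level attribute such as an expression rate. The real framework performs multi-pass compilation into full interaction-network models; this sketch shows only the attribute propagation.

        # Each part carries attributes; a rule synthesizes expression rate
        # from promoter strength times RBS efficiency. Values are illustrative.
        PARTS = {
            "pStrong": {"type": "promoter", "strength": 2.0},
            "pWeak":   {"type": "promoter", "strength": 0.4},
            "rbsHigh": {"type": "rbs",      "efficiency": 1.5},
            "gfp":     {"type": "cds",      "product": "GFP"},
            "term1":   {"type": "terminator"},
        }

        def compile_expression(design):
            """One synthesis pass over the part sequence: promoter strength
            times RBS efficiency sets the rate attribute of the downstream CDS."""
            rate, models = 1.0, {}
            for name in design:
                part = PARTS[name]
                if part["type"] == "promoter":
                    rate = part["strength"]
                elif part["type"] == "rbs":
                    rate *= part["efficiency"]
                elif part["type"] == "cds":
                    models[part["product"]] = rate
            return models

        print(compile_expression(["pStrong", "rbsHigh", "gfp", "term1"]))  # {'GFP': 3.0}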

  6. Beware the tail that wags the dog: informal and formal models in biology

    PubMed Central

    Gunawardena, Jeremy

    2014-01-01

    Informal models have always been used in biology to guide thinking and devise experiments. In recent years, formal mathematical models have also been widely introduced. It is sometimes suggested that formal models are inherently superior to informal ones and that biology should develop along the lines of physics or economics by replacing the latter with the former. Here I suggest to the contrary that progress in biology requires a better integration of the formal with the informal. PMID:25368417

  7. Modular Knowledge Representation and Reasoning in the Semantic Web

    NASA Astrophysics Data System (ADS)

    Serafini, Luciano; Homola, Martin

    Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of the ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desirable to perform reasoning by combining multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics and Integrated Distributed Description Logics. We concentrate on expressivity and distinctive modeling features of each framework. We also discuss the reasoning capabilities of each framework.

  8. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  9. Are quantum-mechanical-like models possible, or necessary, outside quantum physics?

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2014-12-01

    This article examines some experimental conditions that invite and possibly require recourse to quantum-mechanical-like mathematical models (QMLMs), models based on the key mathematical features of quantum mechanics, in scientific fields outside physics, such as biology, cognitive psychology, or economics. In particular, I consider whether the following two correlative features of quantum phenomena that were decisive for establishing the mathematical formalism of quantum mechanics play similarly important roles in QMLMs elsewhere. The first is the individuality and discreteness of quantum phenomena, and the second is the irreducibly probabilistic nature of our predictions concerning them, coupled to the particular character of the probabilities involved, as different from the character of probabilities found in classical physics. I also argue that these features could be interpreted in terms of a particular form of epistemology that suspends and even precludes a causal and, in the first place, realist description of quantum objects and processes. This epistemology limits the descriptive capacity of quantum theory to the description, classical in nature, of the observed quantum phenomena manifested in measuring instruments. Quantum mechanics itself only provides descriptions, probabilistic in nature, concerning numerical data pertaining to such phenomena, without offering a physical description of quantum objects and processes. While QMLMs share their use of the quantum-mechanical or analogous mathematical formalism, they may differ in the roles, if any, that the two features in question play in them, and in the ways they interpret the phenomena considered and the formalism itself. This article will address those differences as well.

  10. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    NASA Technical Reports Server (NTRS)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  11. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  12. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    PubMed

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

    Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  13. A hybrid formalism of aerosol gas phase interaction for 3-D global models

    NASA Astrophysics Data System (ADS)

    Benduhn, F.

    2009-04-01

    Aerosol chemical composition is a relevant factor in the global climate system with respect to both atmospheric chemistry and the aerosol direct and indirect effects. Aerosol chemical composition determines the capacity of aerosol particles to act as cloud condensation nuclei, both explicitly via particle size and implicitly via the aerosol hygroscopic properties. Due to the primary role of clouds in the climate system and the sensitivity of cloud formation and radiative properties to the cloud droplet number, it is necessary to determine the chemical composition of the aerosol with accuracy. Dissolution, although a formally fairly well known process, may be subject to numerically prohibitive properties that result from the chemical interaction of the species engaged. Approaches to date for modelling the dissolution of inorganics into the aerosol liquid phase in the framework of a 3-D global model have been based on an equilibrium, transient or hybrid equilibrium-transient approach. All of these methods have the disadvantage of a priori assumptions with respect to the mechanism and/or are numerically not manageable in the context of a global climate system model. In this paper a new hybrid formalism for aerosol gas phase interaction is presented within the framework of the H2SO4/HNO3/HCl/NH3 system and a modal approach to aerosol size discretisation. The formalism is distinct from prior hybrid approaches inasmuch as no a priori assumption is made about the regime a particular aerosol mode is in. Whether a particular mode is set to be in the equilibrium or the transitory regime is determined continuously during each time increment against relevant criteria, considering the estimated equilibration time interval and the interdependence of the aerosol modes relative to the partitioning of the dissolving species. In this way, the range of aerosol compositions subject to numerical stiffness due to species interaction during transient dissolution is effectively eluded, and the numerical expense of dissolution in the transient regime is reduced through the minimisation of the number of modes in this regime and a larger time step. Containment of the numerical expense of the modes in the equilibrium regime is ensured through the use of either an analytical equilibrium solver that requires iteration among the equilibrium modes, or a simple numerical solver based on a differential approach that requires iteration among the chemical species. Both equilibrium solvers require iteration over the water content and the activity coefficients. The decision to use one or the other solver is made by considering the actual equilibrating mechanism: either chemical interaction or gas phase partial pressure variation, respectively. The formalism should thus combine appropriate process simplification, resulting in reasonable computation time, with a high degree of conformity to the real process, as ensured by a transitory representation of dissolution. The resulting effectiveness and limits of the formalism are illustrated with numerical examples.
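
    A sketch of the per-mode regime decision under stated assumptions: each mode's equilibration timescale is estimated and compared with the host-model time step, with fast modes handed to an equilibrium solver and slow modes treated transiently. The timescale formula (a crude continuum-regime estimate) and all mode parameters are illustrative, not the paper's criteria.

        import math

        # Hybrid regime decision sketch: assign each aerosol mode to the
        # equilibrium or transient regime at every time step. Values illustrative.
        DT = 1800.0                               # host-model time step (s)
        modes = [                                 # radius (m), number conc. (1/m3)
            {"name": "nucleation",   "r": 5e-9, "N": 5e10},
            {"name": "accumulation", "r": 1e-7, "N": 1e9},
            {"name": "coarse",       "r": 2e-6, "N": 1e5},
        ]

        def equilibration_time(mode, diffusivity=1e-5):
            # crude estimate: tau ~ 1 / (4 * pi * D * r * N)
            return 1.0 / (4.0 * math.pi * diffusivity * mode["r"] * mode["N"])

        for m in modes:
            tau = equilibration_time(m)
            m["regime"] = "equilibrium" if tau < 0.1 * DT else "transient"
            print(f"{m['name']:13s} tau = {tau:9.1f} s -> {m['regime']}")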

  14. Fractional derivatives in the diffusion process in heterogeneous systems: The case of transdermal patches.

    PubMed

    Caputo, Michele; Cametti, Cesare

    2017-09-01

    In this note, we present a simple mathematical model of drug delivery through transdermal patches by introducing a memory formalism in the classical Fick diffusion equation based on the fractional derivative. This approach is developed in the case of a medicated adhesive patch placed on the skin to deliver a time released dose of medication through the skin towards the bloodstream. The main resistance to drug transport across the skin resides in the diffusion through its outermost layer (the stratum corneum). Due to the complicated architecture of this region, a model based on a constant diffusivity in a steady-state condition results in too simplistic assumptions and more refined models are required. The introduction of a memory formalism in the diffusion process, where diffusion parameters depend at a certain time or position on what happens at preceding times, meets this requirement and allows a significantly better description of the experimental results. The present model may be useful not only for analyzing the rate of skin permeation but also for predicting the drug concentration after transdermal drug delivery depending on the diffusion characteristics of the patch (its thickness and pseudo-diffusion coefficient). Copyright © 2017 Elsevier Inc. All rights reserved.
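
    A common way to discretize the Caputo fractional derivative is the L1 scheme; the sketch below uses it to step a time-fractional diffusion equation D_t^alpha u = D u_xx explicitly. The geometry, boundary conditions, and parameter values are illustrative and are not fitted to the paper's patch model.

        import numpy as np
        from math import gamma

        # Time-fractional diffusion via the standard L1 discretization of the
        # Caputo derivative, 0 < alpha < 1. All values are illustrative.
        alpha, D = 0.7, 1.0e-3
        nx, nt, dx, dt = 41, 200, 1.0 / 40, 0.05
        b = np.array([(k + 1)**(1 - alpha) - k**(1 - alpha) for k in range(nt)])

        u = np.zeros((nt + 1, nx))
        u[:, 0] = 1.0                             # constant-concentration boundary
        coef = gamma(2 - alpha) * dt**alpha       # kept small enough for stability

        for n in range(nt):
            lap = np.zeros(nx)
            lap[1:-1] = (u[n, 2:] - 2 * u[n, 1:-1] + u[n, :-2]) / dx**2
            hist = np.zeros(nx)                   # memory term of the L1 scheme
            for k in range(1, n + 1):
                hist += b[k] * (u[n - k + 1] - u[n - k])
            u[n + 1] = u[n] - hist + coef * D * lap
            u[n + 1, 0], u[n + 1, -1] = 1.0, 0.0  # re-impose boundaries

        print("concentration profile at final time:", np.round(u[-1, ::10], 3))

    The history sum is what encodes the memory: each new value depends on the whole past trajectory, not just the previous step, which is exactly the feature the abstract argues a constant-diffusivity steady-state model lacks.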

  15. Multi-level and hybrid modelling approaches for systems biology.

    PubMed

    Bardini, R; Politano, G; Benso, A; Di Carlo, S

    2017-01-01

    During the last decades, high-throughput techniques have allowed for the extraction of a huge amount of data from biological systems, unveiling more of their underlying complexity. Biological systems encompass a wide range of space and time scales, functioning according to flexible hierarchies of mechanisms that make an intertwined and dynamic interplay of regulations. This becomes particularly evident in processes such as ontogenesis, where regulative assets change according to process context and timing, making structural phenotype and architectural complexities emerge from a single cell through local interactions. The information collected from biological systems is naturally organized according to the functional levels composing the system itself. In systems biology, biological information often comes from overlapping but different scientific domains, each one having its own way of representing the phenomena under study. That is, the different parts of the system to be modelled may be described with different formalisms. For a model to be more accurate and to provide a good knowledge base, it should encompass different system levels and suitably handle the respective formalisms. Models which are both multi-level and hybrid satisfy both of these requirements, making them a very useful tool in computational systems biology. This paper reviews some of the main contributions in this field.

  16. An intraorganizational model for developing and spreading quality improvement innovations.

    PubMed

    Kellogg, Katherine C; Gainer, Lindsay A; Allen, Adrienne S; OʼSullivan, Tatum; Singer, Sara J

    Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group's traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread.

  17. An intraorganizational model for developing and spreading quality improvement innovations

    PubMed Central

    Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.

    2017-01-01

    Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788

  18. Xenon Formal Security Policy Model

    DTIC Science & Technology

    2007-08-14

    communication primitives such as locks or semaphores, machine instruction results, hypercall results, traps, and interrupts. For an informal example...communicated on the corresponding side of the parallel operator. Events that are in X ∪ Y are synchronized over the two processes. So if we define

  19. Modeling Violent Non-State Actors: A Summary of Concepts and Methods

    DTIC Science & Technology

    2004-11-01

    charts, leadership, rules, formal communications and process efficiency to name a few. While a useful aspect of organizational diagnosis, this...and Arie Shirom, Organizational Diagnosis and Assessment: Bridging Theory and Practice (Thousand Oaks, CA: Sage Publications, 1999), 44. 9 Katz and

  20. Design of high reliability organizations in health care.

    PubMed

    Carroll, J S; Rudolph, J W

    2006-12-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.

  1. An Approach to Verification and Validation of a Reliable Multicasting Protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1994-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
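
    The model-based testing loop described here can be sketched as follows: the formal state model and the implementation consume identical randomly generated event sequences, and any divergence in the resulting states is reported as an inconsistency. The three-state protocol and the seeded bug below are invented for illustration; RMP itself is far larger.

        import random

        # The V&V team's state model: (state, event) -> next state.
        MODEL = {
            ("idle", "join"): "member",
            ("member", "data"): "member",
            ("member", "leave"): "idle",
        }

        def implementation_step(state, event):
            # Seeded bug: the implementation ignores "leave" events.
            if (state, event) == ("member", "leave"):
                return "member"
            return MODEL.get((state, event), state)

        random.seed(0)
        found = False
        for trial in range(100):
            m_state = i_state = "idle"
            for _ in range(8):
                event = random.choice(["join", "data", "leave"])
                m_state = MODEL.get((m_state, event), m_state)
                i_state = implementation_step(i_state, event)
                if m_state != i_state:
                    print(f"divergence in trial {trial}: after '{event}', "
                          f"model={m_state} impl={i_state}")
                    found = True
                    break
            if found:
                break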

  2. An approach to verification and validation of a reliable multicasting protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.

  3. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  4. Prescriptive models to support decision making in genetics.

    PubMed

    Pauker, S G; Pauker, S P

    1987-01-01

    Formal prescriptive models can help patients and clinicians better understand the risks and uncertainties they face and better formulate well-reasoned decisions. Using Bayes rule, the clinician can interpret pedigrees, historical data, physical findings and laboratory data, providing individualized probabilities of various diagnoses and outcomes of pregnancy. With the advent of screening programs for genetic disease, it becomes increasingly important to consider the prior probabilities of disease when interpreting an abnormal screening test result. Decision trees provide a convenient formalism for structuring diagnostic, therapeutic and reproductive decisions; such trees can also enhance communication between clinicians and patients. Utility theory provides a mechanism for patients to understand the choices they face and to communicate their attitudes about potential reproductive outcomes in a manner which encourages the integration of those attitudes into appropriate decisions. Using a decision tree, the relevant probabilities and the patients' utilities, physicians can estimate the relative worth of various medical and reproductive options by calculating the expected utility of each. By performing relevant sensitivity analyses, clinicians and patients can understand the impact of various soft data, including the patients' attitudes toward various health outcomes, on the decision making process. Formal clinical decision analytic models can provide deeper understanding and improved decision making in clinical genetics.
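
    A worked example of the Bayes-rule interpretation of an abnormal screening result shows why the prior probability of disease matters so much when counseling patients. The sensitivity, specificity, and prevalence values below are illustrative, not drawn from the paper.

        # Posterior probability of disease given a positive screening test:
        # P(D | +) = sens * prior / (sens * prior + (1 - spec) * (1 - prior)).
        def posterior_given_positive(prior, sensitivity, specificity):
            p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
            return sensitivity * prior / p_pos

        for prior in (0.001, 0.01, 0.10):        # population vs high-risk priors
            post = posterior_given_positive(prior, sensitivity=0.95,
                                            specificity=0.98)
            print(f"prior {prior:.3f} -> posterior after abnormal test {post:.3f}")

    With a 0.1% prior, even a sensitive and specific test leaves the posterior below 5%, while a 10% prior pushes it above 80%: the same test result means very different things for different pedigrees.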

  5. Anti-gravity with present technology - Implementation and theoretical foundation

    NASA Astrophysics Data System (ADS)

    Alzofon, F. E.

    1981-07-01

    This paper proposes a semi-empirical model of the processes leading to the gravitational field based on accepted features of subatomic processes. Through an analogy with methods of cryogenics, a method of decreasing (or increasing) the gravitational force on a vehicle, using presently-known technology, is suggested. Various ways of utilizing this effect in vehicle propulsion are described. A unified field theory is then detailed which provides a more formal foundation for the gravitational field model first introduced. In distinction to the general theory of relativity, it features physical processes which generate the gravitational field.

  6. Research accomplished at the Knowledge Based Systems Lab: IDEF3, version 1.0

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Menzel, Christopher P.; Mayer, Paula S. D.

    1991-01-01

    An overview is presented of the foundations and content of the evolving IDEF3 process flow and object state description capture method. This method is currently in beta test. Ongoing efforts in the formulation of formal semantics models for descriptions captured in the outlined form and in the actual application of this method can be expected to cause an evolution in the method language. A language is described for the representation of process and object state centered system description. IDEF3 is a scenario driven process flow modeling methodology created specifically for these types of descriptive activities.

  7. Radiative transfer calculated from a Markov chain formalism

    NASA Technical Reports Server (NTRS)

    Esposito, L. W.; House, L. L.

    1978-01-01

    The theory of Markov chains is used to formulate the radiative transport problem in a general way by modeling the successive interactions of a photon as a stochastic process. Under the minimal requirement that the stochastic process is a Markov chain, the determination of the diffuse reflection or transmission from a scattering atmosphere is equivalent to the solution of a system of linear equations. This treatment is mathematically equivalent to, and thus has many of the advantages of, Monte Carlo methods, but can be considerably more rapid than Monte Carlo algorithms for numerical calculations in particular applications. We have verified the speed and accuracy of this formalism for the standard problem of finding the intensity of scattered light from a homogeneous plane-parallel atmosphere with an arbitrary phase function for scattering. Accurate results over a wide range of parameters were obtained with computation times comparable to those of a standard 'doubling' routine. The generality of this formalism thus allows fast, direct solutions to problems that were previously soluble only by Monte Carlo methods. Some comparisons are made with respect to integral equation methods.
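
    The core computational claim, that diffuse reflection reduces to solving a system of linear equations, can be shown with a toy absorbing Markov chain: transient states are scattering events in each of two layers, absorbing states are "reflected" and "transmitted", and absorption probabilities follow from the fundamental matrix. All transition probabilities below are invented for illustration.

        import numpy as np

        # Toy absorbing Markov chain for photon transport. Each row of [Q | R]
        # sums to 1: a photon in a layer either scatters into a layer again (Q)
        # or is absorbed into "reflected"/"transmitted" (R).
        Q = np.array([[0.30, 0.25],     # layer -> layer scattering transitions
                      [0.25, 0.30]])
        R = np.array([[0.35, 0.10],     # layer -> (reflected, transmitted)
                      [0.10, 0.35]])

        N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix: expected visits
        B = N @ R                          # absorption probs per starting state

        print("start in layer 1: P(reflect) = %.3f, P(transmit) = %.3f" % tuple(B[0]))

    Solving (I - Q)x = r once replaces tracing many Monte Carlo photon histories, which is the speed advantage the abstract describes.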

  8. Differential Interactions between Identity and Emotional Expression in Own and Other-Race Faces: Effects of Familiarity Revealed through Redundancy Gains

    ERIC Educational Resources Information Center

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…

  9. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  10. Colloquium: Modeling the dynamics of multicellular systems: Application to tissue engineering

    NASA Astrophysics Data System (ADS)

    Kosztin, Ioan; Vunjak-Novakovic, Gordana; Forgacs, Gabor

    2012-10-01

    Tissue engineering is a rapidly evolving discipline that aims at building functional tissues to improve or replace damaged ones. To be successful in such an endeavor, ideally, the engineering of tissues should be based on the principles of developmental biology. Recent progress in developmental biology suggests that the formation of tissues from the composing cells is often guided by physical laws. Here a comprehensive computational-theoretical formalism is presented that is based on experimental input and incorporates biomechanical principles of developmental biology. The formalism is described and it is shown that it correctly reproduces and predicts the quantitative characteristics of the fundamental early developmental process of tissue fusion. Based on this finding, the formalism is then used toward the optimization of the fabrication of tubular multicellular constructs, such as a vascular graft, by bioprinting, a novel tissue engineering technology.

  11. What is the right formalism to search for resonances?

    NASA Astrophysics Data System (ADS)

    Mikhasenko, M.; Pilloni, A.; Nys, J.; Albaladejo, M.; Fernández-Ramírez, C.; Jackura, A.; Mathieu, V.; Sherrill, N.; Skwarnicki, T.; Szczepaniak, A. P.

    2018-03-01

    Hadron decay chains constitute one of the main sources of information on the QCD spectrum. We discuss the differences between several partial wave analysis formalisms used in the literature to build the amplitudes. We match the helicity amplitudes to the covariant tensor basis. Hereby, we pay attention to the analytical properties of the amplitudes and separate singularities of kinematical and dynamical nature. We study the analytical properties of the spin-orbit (LS) formalism, and some of the covariant tensor approaches. In particular, we explicitly build the amplitudes for the B → ψπK and B → D̄ππ decays, and show that the energy dependence of the covariant approach is model dependent. We also show that the usual recursive construction of covariant tensors explicitly violates crossing symmetry, which would lead to different resonance parameters extracted from scattering and decay processes.

  12. Enhancing Formal E-Learning with Edutainment on Social Networks

    ERIC Educational Resources Information Center

    Labus, A.; Despotovic-Zrakic, M.; Radenkovic, B.; Bogdanovic, Z.; Radenkovic, M.

    2015-01-01

    This paper reports on the investigation of the possibilities of enhancing the formal e-learning process by harnessing the potential of informal game-based learning on social networks. The goal of the research is to improve the outcomes of the formal learning process through the design and implementation of an educational game on a social network…

  13. A Hilbert Space Representation of Generalized Observables and Measurement Processes in the ESR Model

    NASA Astrophysics Data System (ADS)

    Sozzo, Sandro; Garola, Claudio

    2010-12-01

    The extended semantic realism (ESR) model recently worked out by one of the authors embodies the mathematical formalism of standard (Hilbert space) quantum mechanics in a noncontextual framework, reinterpreting quantum probabilities as conditional instead of absolute. We provide here a Hilbert space representation of the generalized observables introduced by the ESR model that satisfy a simple physical condition, propose a generalization of the projection postulate, and suggest a possible mathematical description of the measurement process in terms of evolution of the compound system made up of the measured system and the measuring apparatus.

  14. Structural and process factors affecting the implementation of antimicrobial resistance prevention and control strategies in U.S. hospitals.

    PubMed

    Chou, Ann F; Yano, Elizabeth M; McCoy, Kimberly D; Willis, Deanna R; Doebbeling, Bradley N

    2008-01-01

    To address increases in the incidence of infection with antimicrobial-resistant pathogens, the National Foundation for Infectious Diseases and Centers for Disease Control and Prevention proposed two sets of strategies to (a) optimize antibiotic use and (b) prevent the spread of antimicrobial resistance and control transmission. However, little is known about the implementation of these strategies. Our objective is to explore organizational structural and process factors that facilitate the implementation of National Foundation for Infectious Diseases/Centers for Disease Control and Prevention strategies in U.S. hospitals. We surveyed 448 infection control professionals from a national sample of hospitals. Clinically anchored in the Donabedian model that defines quality in terms of structural and process factors, with the structural domain further informed by a contingency approach, we modeled the degree to which National Foundation for Infectious Diseases and Centers for Disease Control and Prevention strategies were implemented as a function of formalization and standardization of protocols, centralization of decision-making hierarchy, information technology capabilities, culture, communication mechanisms, and interdepartmental coordination, controlling for hospital characteristics. Formalization, standardization, centralization, institutional culture, provider-management communication, and information technology use were associated with optimal antibiotic use and enhanced implementation of strategies that prevent and control antimicrobial resistance spread (all p < .001). However, interdepartmental coordination for patient care was inversely related with antibiotic use in contrast to antimicrobial resistance spread prevention and control (p < .0001). Formalization and standardization may eliminate staff role conflict, whereas centralized authority may minimize ambiguity. Culture and communication likely promote internal trust, whereas information technology use helps integrate and support these organizational processes. These findings suggest concrete strategies for evaluating current capabilities to implement effective practices and foster and sustain a culture of patient safety.

  15. Indigenous Knowledge and Education from the Quechua Community to School: Beyond the Formal/Non-Formal Dichotomy

    ERIC Educational Resources Information Center

    Sumida Huaman, Elizabeth; Valdiviezo, Laura Alicia

    2014-01-01

    In this article, we propose to approach Indigenous education beyond the formal/non-formal dichotomy. We argue that there is a critical need to conscientiously include Indigenous knowledge in education processes from the school to the community; particularly, when formal systems exclude Indigenous cultures and languages. Based on ethnographic…

  16. An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems

    NASA Astrophysics Data System (ADS)

    Hieb, Jeffrey; Graham, James; Guan, Jian

    This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.
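
    A heavily simplified sketch of the diagnostic idea, with invented symptom and fault-source tables: candidate sources are ranked across observed symptoms, and control-system elements such as the IP network or SCADA protocol appearing among the candidates flag a possible cyber-induced fault. This stands in for, and is much cruder than, the paper's ontological model and algorithm.

        # Symptom -> candidate fault sources, including control-system elements.
        SYMPTOM_SOURCES = {
            "column_temp_high":  ["reboiler", "temp_sensor", "scada_protocol"],
            "setpoint_mismatch": ["controller", "ip_network", "scada_protocol"],
            "sensor_flatline":   ["temp_sensor", "ip_network"],
        }
        CYBER_ELEMENTS = {"ip_network", "scada_protocol"}

        def diagnose(observed_symptoms):
            scores = {}
            for s in observed_symptoms:
                for src in SYMPTOM_SOURCES.get(s, []):
                    scores[src] = scores.get(src, 0) + 1
            ranked = sorted(scores.items(), key=lambda kv: -kv[1])
            cyber = [src for src, _ in ranked if src in CYBER_ELEMENTS]
            return ranked, bool(cyber)

        ranked, maybe_intrusion = diagnose(["setpoint_mismatch", "sensor_flatline"])
        print(ranked)
        print("possible cyber-induced fault:", maybe_intrusion)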

  17. Stress Process Model for Individuals with Dementia

    ERIC Educational Resources Information Center

    Judge, Katherine S.; Menne, Heather L.; Whitlatch, Carol J.

    2010-01-01

    Purpose: Individuals with dementia (IWDs) face particular challenges in managing and coping with their illness. The experience of dementia may be affected by the etiology, stage, and severity of symptoms, preexisting and related chronic conditions, and available informal and formal supportive services. Although several studies have examined…

  18. Tacit Models, Treasured Intuitions and the Discrete--Continuous Interplay

    ERIC Educational Resources Information Center

    Kidron, Ivy

    2011-01-01

    We explore conditions for productive synthesis between formal reasoning and intuitive representations through analysis of college students' understanding of the limit concept in the definition of the derivative. In particular, we compare and contrast cognitive processes that accompany different manifestations of persistence of intuitions and tacit…

  19. Waste management of printed wiring boards: a life cycle assessment of the metals recycling chain from liberation through refining.

    PubMed

    Xue, Mianqiang; Kendall, Alissa; Xu, Zhenming; Schoenung, Julie M

    2015-01-20

    Due to economic and societal reasons, informal activities including open burning, backyard recycling, and landfill are still the prevailing methods used for electronic waste treatment in developing countries. Great efforts have been made, especially in China, to promote formal approaches for electronic waste management by enacting laws, developing green recycling technologies, initiating pilot programs, etc. The formal recycling process can, however, engender environmental impact and resource consumption, although information on the environmental loads and resource consumption is currently limited. To quantitatively assess the environmental impact of the processes in a formal printed wiring board (PWB) recycling chain, life cycle assessment (LCA) was applied to a formal recycling chain that includes the steps from waste liberation through materials refining. The metal leaching in the refining stage was identified as a critical process, posing most of the environmental impact in the recycling chain. Global warming potential was the most significant environmental impact category after normalization and weighting, followed by fossil abiotic depletion potential, and marine aquatic eco-toxicity potential. Scenario modeling results showed that variations in the power source and chemical reagents consumption had the greatest influence on the environmental performance. The environmental impact from transportation used for PWB collection was also evaluated. The results were further compared to conventional primary metals production processes, highlighting the environmental benefit of metal recycling from waste PWBs. Optimizing the collection mode, increasing the precious metals recovery efficiency in the beneficiation stage and decreasing the chemical reagents consumption in the refining stage by effective materials liberation and separation are proposed as potential improvement strategies to make the recycling chain more environmentally friendly. The LCA results provide environmental information for the improvement of future integrated technologies and electronic waste management.

  20. Transactions in domain-specific information systems

    NASA Astrophysics Data System (ADS)

    Zacek, Jaroslav

    2017-07-01

    A substantial number of current information system (IS) implementations are based on a transaction approach. In addition, most of the implementations are domain-specific (e.g., accounting IS, resource planning IS). Therefore, we need a generic transaction model to build and verify domain-specific IS. The paper proposes a new transaction model for domain-specific ontologies. This model is based on a value-oriented business process modelling technique. The transaction model is formalized by Petri net theory. The first part of the paper presents common business processes and analyses related to business process modeling. The second part defines the transactional model delimited by the REA enterprise ontology paradigm and introduces the states of the generic transaction model. The generic model proposal is defined and visualized with a Petri net modelling tool. The third part shows an application of the generic transaction model. The last part of the paper concludes the results and discusses the practical usability of the generic transaction model.

  1. Modeling treatment of ischemic heart disease with partially observable Markov decision processes.

    PubMed

    Hauskrecht, M; Fraser, H

    1998-01-01

    Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead they are very often dependent and interleaved over time, mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment and varying cost of different diagnostic (investigative) and treatment procedures. The framework of Partially observable Markov decision processes (POMDPs) developed and used in operations research, control theory and artificial intelligence communities is particularly suitable for modeling such a complex decision process. In the paper, we show how the POMDP framework could be used to model and solve the problem of the management of patients with ischemic heart disease, and point out modeling advantages of the framework over standard decision formalisms.
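
    The core POMDP machinery the paper relies on, maintaining a belief over hidden disease states and updating it after each observation, can be illustrated independently of the clinical model. A minimal sketch; the two-state model, transition matrix, and test likelihoods below are invented for illustration, not drawn from Hauskrecht and Fraser:

        import numpy as np

        # Hypothetical two-state model: 0 = no ischemia, 1 = ischemia.
        T = np.array([[0.9, 0.1],    # transition probabilities P(s' | s) under a treatment
                      [0.2, 0.8]])
        O = np.array([[0.7, 0.3],    # observation likelihoods P(obs | s'):
                      [0.2, 0.8]])   # obs 0 = negative test, obs 1 = positive test

        def belief_update(b, obs):
            """Bayes filter: predict through T, then weight by the observation."""
            predicted = b @ T                    # prior over the next state
            posterior = predicted * O[:, obs]    # multiply by P(obs | s')
            return posterior / posterior.sum()   # renormalize

        b = np.array([0.5, 0.5])                 # initially uncertain
        for obs in [1, 1, 0]:                    # a sequence of test results
            b = belief_update(b, obs)
            print(f"obs={obs} -> belief(ischemia) = {b[1]:.3f}")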

  2. Inductive reasoning about causally transmitted properties.

    PubMed

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B

    2008-11-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.

  3. Enhancing the Dependability of Complex Missions Through Automated Analysis

    DTIC Science & Technology

    2013-09-01

    triangular or self-referential relationships. The Semantic Web Rule Language (SWRL)—a W3C-approved OWL extension—addresses some of these limitations by... SWRL extends OWL with Horn-like rules that can model complex relational structures and self-referential relationships; Prolog extends OWL+SWRL with the... [8]. Additionally, multi-agent model checking has been used to verify OWL-S process models [9]. OWL is a powerful knowledge representation formalism

  4. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.

  5. On formally integrating science and policy: walking the walk

    USGS Publications Warehouse

    Nichols, James D.; Johnson, Fred A.; Williams, Byron K.; Boomer, G. Scott

    2015-01-01

    The contribution of science to the development and implementation of policy is typically neither direct nor transparent. In 1995, the U.S. Fish and Wildlife Service (FWS) made a decision that was unprecedented in natural resource management, turning to an unused and unproven decision process to carry out trust responsibilities mandated by an international treaty. The decision process was adopted for the establishment of annual sport hunting regulations for the most economically important duck population in North America, the 6 to 11 million mallards Anas platyrhynchos breeding in the mid-continent region of the north-central United States and central Canada. The key idea underlying the adopted decision process was to formally embed within it a scientific process designed to reduce uncertainty (learn) and thus make better decisions in the future. The scientific process entails the use of models to develop predictions of competing hypotheses about system response to the selected action at each decision point. These predictions not only are used to select the optimal management action, but also are compared with the subsequent estimates of system state variables, providing evidence for modifying degrees of confidence in, and hence the relative influence of, these models at the next decision point. Science and learning in one step are formally and directly incorporated into the next decision, contrasting with the usual ad hoc and indirect use of scientific results in policy development and decision-making. Application of this approach over the last 20 years has led to a substantial reduction in uncertainty, as well as to an increase in the transparency and defensibility of annual decisions and a decrease in the contentiousness of the decision process. As resource managers are faced with increased uncertainty associated with various components of global change, this approach provides a roadmap for the future scientific management of natural resources.
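
    The "modifying degrees of confidence" step described here is, at heart, Bayesian reweighting of competing models against the observed system state. A hedged sketch with invented predictions and an assumed Gaussian monitoring error, not the actual FWS mallard models:

        import math

        def normal_pdf(x, mu, sigma):
            """Density of a normal distribution, used as the observation likelihood."""
            return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

        # Hypothetical competing models predicting next year's population (millions).
        predictions = {"additive_mortality": 7.2, "compensatory_mortality": 8.1}
        weights = {m: 0.5 for m in predictions}       # equal confidence to start
        observed, sigma = 7.9, 0.4                    # monitored state, assumed survey error

        # Reweight each model by how likely the observation is under its prediction.
        likelihood = {m: normal_pdf(observed, p, sigma) for m, p in predictions.items()}
        total = sum(weights[m] * likelihood[m] for m in predictions)
        weights = {m: weights[m] * likelihood[m] / total for m in predictions}
        print(weights)   # confidence shifts toward the better-predicting model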

  6. Formal Modeling of Multi-Agent Systems using the Pi-Calculus and Epistemic Logic

    NASA Technical Reports Server (NTRS)

    Rorie, Toinette; Esterline, Albert

    1998-01-01

    Multi-agent systems have become important recently in computer science, especially in artificial intelligence (AI). We allow a broad sense of agent, but require at least that an agent has some measure of autonomy and interacts with other agents via some kind of agent communication language. We are concerned in this paper with formal modeling of multi-agent systems, with emphasis on communication. We propose for this purpose to use the pi-calculus, an extension of the process algebra CCS. Although the literature on the pi-calculus refers to agents, the term is used there in the sense of a process in general. It is our contention, however, that viewing agents in the AI sense as agents in the pi-calculus sense affords significant formal insight. One formalism that has been applied to agents in the AI sense is epistemic logic, the logic of knowledge. The success of epistemic logic in computer science in general has come in large part from its ability to handle concepts of knowledge that apply to groups. We maintain that the pi-calculus affords a natural yet rigorous means by which groups that are significant to epistemic logic may be identified, encapsulated, structured into hierarchies, and restructured in a principled way. This paper is organized as follows: Section 2 introduces the pi-calculus; Section 3 takes a scenario from the classical paper on agent-oriented programming [Sh93] and translates it into a very simple subset of the pi-calculus; Section 4 then shows how more sophisticated features of the pi-calculus may be brought into play; Section 5 discusses how the pi-calculus may be used to define groups for epistemic logic; and Section 6 is the conclusion.
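
    For reference, the process grammar of the monadic pi-calculus that the paper builds on is, in standard notation:

        P ::= 0 \;\mid\; \bar{x}\langle y \rangle.P \;\mid\; x(y).P \;\mid\; P_1 \mid P_2 \;\mid\; (\nu x)P \;\mid\; !P

    read as: the inert process, output of name y on channel x, input on channel x binding y, parallel composition, restriction of a fresh name x, and replication. The ability to pass channel names themselves as messages is what lets groups of agents be formed, encapsulated, and restructured dynamically.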

  7. Endorsement of formal leaders: an integrative model.

    PubMed

    Michener, H A; Lawler, E J

    1975-02-01

    This experiment develops an integrative, path-analytic model for the endorsement accorded formal leaders. The model contains four independent variables reflecting aspects of group structure (i.e., group success-failure, the payoff distribution, the degree of support by other members for the leader, and the vulnerability of the leader). Also included are two intervening variables reflecting perceptual processes (attributed competence and attributed fairness), and one dependent variable (endorsement). The results indicate that endorsement is greater when the group's success is high, when the payoff distribution is flat rather than hierarchical, and when the leader is not vulnerable to removal from office. Support from other members had no significant impact on endorsement. Analyses further demonstrate that the effect of success-failure on endorsement is mediated by attributed competence, while the effect of the payoff distribution is mediated by attributed fairness. These results suggest that moral and task evaluations are distinct bases of endorsement.

  8. How to reach linguistic consensus: a proof of convergence for the naming game.

    PubMed

    De Vylder, Bart; Tuyls, Karl

    2006-10-21

    In this paper we introduce a mathematical model of naming games. Naming games have been widely used within research on the origins and evolution of language. Despite the many interesting empirical results these studies have produced, most of this research lacks a formal elucidating theory. In this paper we show how a population of agents can reach linguistic consensus, i.e. learn to use one common language to communicate with one another. Our approach differs from existing formal work in two important ways: one, we relax the too strong assumption that an agent samples infinitely often during each time interval. This assumption is usually made to guarantee convergence of an empirical learning process to a deterministic dynamical system. Two, we provide a proof that under these new realistic conditions, our model converges to a common language for the entire population of agents. Finally the model is experimentally validated.
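
    A minimal naming-game simulation conveys the consensus dynamics the proof addresses; this sketch uses the common inventory-collapse rule and is not the authors' exact finite-sampling model:

        import random

        random.seed(1)
        N_AGENTS, N_STEPS = 50, 20000
        agents = [set() for _ in range(N_AGENTS)]   # each agent's name inventory for one object

        for step in range(N_STEPS):
            speaker, hearer = random.sample(range(N_AGENTS), 2)
            if not agents[speaker]:
                agents[speaker].add(f"word{step}")      # speaker invents a name if it has none
            word = random.choice(sorted(agents[speaker]))
            if word in agents[hearer]:                  # success: both collapse to the shared word
                agents[speaker] = {word}
                agents[hearer] = {word}
            else:                                       # failure: hearer learns the word
                agents[hearer].add(word)

        # Typically converges to a single shared inventory, i.e. linguistic consensus.
        print("distinct inventories left:", len({frozenset(a) for a in agents}))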

  9. A Software Technology Transition Entropy Based Engineering Model

    DTIC Science & Technology

    2002-03-01

    Systems Basics, p273). (Prigogine 1997 p81). It is not the place of this research to provide a mathematical formalism with theorems and lemmas. Rather... science). The ancient philosophers, Pythagoras, Protagoras, Socrates, and Plato start the first discourse (the message) that has continued... unpacking of the technology "message" from Pythagoras. This process is characterized by accumulation learning, modeled by learning curves in

  10. Relationship between individual characteristics, neighbourhood contexts and help-seeking intentions for mental illness

    PubMed Central

    Suka, Machi; Yamauchi, Takashi; Sugimori, Hiroki

    2015-01-01

    Objective Encouraging help-seeking for mental illness is essential for prevention of suicide. This study examined the relationship between individual characteristics, neighbourhood contexts and help-seeking intentions for mental illness for the purpose of elucidating the role of neighbourhood in the help-seeking process. Design, setting and participants A cross-sectional web-based survey was conducted among Japanese adults aged 20–59 years in June 2014. Eligible respondents who did not have a serious health condition were included in this study (n=3308). Main outcome measures Participants were asked how likely they would be to seek help from someone close to them (informal help) and medical professionals (formal help), respectively, if they were suffering from serious mental illness. Path analysis with structural equation modelling was performed to represent plausible connections between individual characteristics, neighbourhood contexts, and informal and formal help-seeking intentions. Results The acceptable fitting model indicated that those who had a tendency to consult about everyday affairs were significantly more likely to express an informal help-seeking intention that was directly associated with a formal help-seeking intention. Those living in a communicative neighbourhood, where neighbours say hello whenever they pass each other, were significantly more likely to express informal and formal help-seeking intentions. Those living in a supportive neighbourhood, where neighbours work together to solve neighbourhood problems, were significantly more likely to express an informal help-seeking intention. Adequate health literacy was directly associated with informal and formal help-seeking intentions, along with having an indirect effect on the formal help-seeking intention through developed positive perception of professional help. Conclusions The results of this study bear out the hypothesis that neighbourhood context contributes to help-seeking intentions for mental illness. Living in a neighbourhood with a communicative atmosphere and having adequate health literacy were acknowledged as possible facilitating factors for informal and formal help-seeking for mental illness. PMID:26264273

  11. Formal logic rewrite system bachelor in teaching mathematical informatics

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). Bachelor enables a constructivist approach to teaching and may therefore enhance the learning process in essential disciplines of theoretical informatics. It provides not only a detailed description of the formal rewrite process but can also demonstrate algorithmic principles for manipulating logic formulae.

  12. Chemical vapor deposition modeling for high temperature materials

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.

    1992-01-01

    The formalism for the accurate modeling of chemical vapor deposition (CVD) processes has matured based on the well established principles of transport phenomena and chemical kinetics in the gas phase and on surfaces. The utility and limitations of such models are discussed in practical applications for high temperature structural materials. Attention is drawn to the complexities and uncertainties in chemical kinetics. Traditional approaches based on only equilibrium thermochemistry and/or transport phenomena are defended as useful tools, within their validity, for engineering purposes. The role of modeling is discussed within the context of establishing the link between CVD process parameters and material microstructures/properties. It is argued that CVD modeling is an essential part of designing CVD equipment and controlling/optimizing CVD processes for the production and/or coating of high performance structural materials.

  13. Elaboration and formalization of current scientific knowledge of risks and preventive measures illustrated by colorectal cancer.

    PubMed

    Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M

    2001-01-01

    To present the method used to elaborate and formalize current scientific knowledge, in order to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized recommendations for prevention or early screening. The approach suggested in this article is in line with medical procedures based on levels of evidence (Evidence-based Medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. The cyclical process for incorporating the most current scientific evidence includes several steps: critical analysis according to the Evidence-based Medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal cancer prevention illustrate our approach. Integration of current scientific knowledge is an important process. The delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.

  14. Multi-Attribute Tradespace Exploration in Space System Design

    NASA Astrophysics Data System (ADS)

    Ross, A. M.; Hastings, D. E.

    2002-01-01

    The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model- and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher-fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g., scientists, engineers, managers, etc.). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.

  15. Electrodiffusive Model for Astrocytic and Neuronal Ion Concentration Dynamics

    PubMed Central

    Halnes, Geir; Østby, Ivar; Pettersen, Klas H.; Omholt, Stig W.; Einevoll, Gaute T.

    2013-01-01

    The cable equation is a proper framework for modeling electrical neural signalling that takes place at a timescale at which the ionic concentrations vary little. However, in neural tissue there are also key dynamic processes that occur at longer timescales. For example, endured periods of intense neural signaling may cause the local extracellular K+ concentration to increase by several millimolars. The clearance of this excess K+ depends partly on diffusion in the extracellular space, partly on local uptake by astrocytes, and partly on intracellular transport (spatial buffering) within astrocytes. These processes, which take place on the time scale of seconds, demand a mathematical description able to account for the spatiotemporal variations in ion concentrations as well as the subsequent effects of these variations on the membrane potential. Here, we present a general electrodiffusive formalism for modeling ion concentration dynamics in a one-dimensional geometry, including both the intra- and extracellular domains. Based on the Nernst-Planck equations, this formalism ensures that the membrane potential and ion concentrations are consistent, ensures global particle/charge conservation, and accounts for diffusion and concentration-dependent variations in resistivity. We apply the formalism to a model of astrocytes exchanging ions with the extracellular space. The simulations show that K+ removal from high-concentration regions is driven by a local depolarization of the astrocyte membrane, which concertedly (i) increases the local astrocytic uptake of K+, (ii) suppresses extracellular transport of K+, (iii) increases axial transport of K+ within astrocytes, and (iv) facilitates astrocytic release of K+ in regions where the extracellular concentration is low. Together, these mechanisms seem to provide a robust regulatory scheme for shielding the extracellular space from excess K+. PMID:24367247
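
    For reference, the per-species flux underlying any Nernst-Planck-based formalism of this kind combines a diffusive term and an electric drift term (a standard result, not specific to this paper's geometry):

        j_k = -D_k \left( \frac{\partial c_k}{\partial x} + \frac{z_k F}{R T} \, c_k \, \frac{\partial \phi}{\partial x} \right)

    where D_k is the diffusion constant, c_k the concentration, and z_k the valence of ion species k, \phi the electric potential, F Faraday's constant, R the gas constant, and T the temperature.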

  16. Main factors in E-Learning for the Equivalency Education Program (E-LEEP)

    NASA Astrophysics Data System (ADS)

    Yel, M. B.; Sfenrianto

    2018-03-01

    There is a tremendous learning gap between formal education and non-formal education. E-learning can help learners in non-formal education improve the learning process. In this study, we present the main factors behind the E-Learning for the Equivalency Education Program (E-LEEP) initiative in Indonesia. Four main factors are proposed, namely: standardization, learning materials, learning process, and learners' characteristics. These factors support one another in achieving the learning process of E-LEEP in Indonesia. Although not yet proven empirically, e-learning for non-formal education should be developed following these main factors, because they can improve the quality of the Equivalency Education Program.

  17. A stylistic classification of Russian-language texts based on the random walk model

    NASA Astrophysics Data System (ADS)

    Kramarenko, A. A.; Nekrasov, K. A.; Filimonov, V. V.; Zhivoderov, A. A.; Amieva, A. A.

    2017-09-01

    A formal approach to text analysis is suggested that is based on the random walk model. The frequencies and reciprocal positions of the vowel letters are matched up by a process of quasi-particle migration. Statistically significant difference in the migration parameters for the texts of different functional styles is found. Thus, a possibility of classification of texts using the suggested method is demonstrated. Five groups of the texts are singled out that can be distinguished from one another by the parameters of the quasi-particle migration process.
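
    The migration statistic itself is not specified in the abstract, but the flavor of the approach can be sketched: treat the gaps between consecutive vowel positions as step lengths of a one-dimensional walk and compare their statistics across texts. The feature below is illustrative only:

        import statistics

        # Russian plus Latin vowels, lowercase; treating "y" as a vowel for illustration.
        VOWELS = set("аеёиоуыэюяaeiouy")

        def vowel_gap_stats(text):
            """Distances between consecutive vowels, read as step lengths of a walk."""
            positions = [i for i, ch in enumerate(text.lower()) if ch in VOWELS]
            gaps = [b - a for a, b in zip(positions, positions[1:])]
            return statistics.mean(gaps), statistics.pstdev(gaps)

        print(vowel_gap_stats("A formal approach to text analysis is suggested."))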

  18. African Cultural Traditions and Modernization: A Reaffirmation.

    ERIC Educational Resources Information Center

    Boateng, Felix A.

    1978-01-01

    The viability of African cultural traditions and their role in modernization and nation-building in Africa are examined. Social and political organization and formal education are discussed in relation to the process of modernization. Although Africa may utilize Western models of development, Westernization and modernization are not synonymous.…

  19. A Study of Hierarchical Classification in Concrete and Formal Thought.

    ERIC Educational Resources Information Center

    Lowell, Walter E.

    This researcher investigated the relationship of hierarchical classification processes in subjects categorized as to developmental level as defined by Piaget's theory, and explored the validity of the hierarchical model and test used in the study. A hierarchical classification test and a battery of four Piaget-type tasks were administered…

  20. Adaptive Neurons For Artificial Neural Networks

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul

    1990-01-01

    Training time decreases dramatically. In improved mathematical model of neural-network processor, temperature of neurons (in addition to connection strengths, also called weights, of synapses) varied during supervised-learning phase of operation according to mathematical formalism and not heuristic rule. Evidence that biological neural networks also process information at neuronal level.
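
    One common reading of neuron "temperature" is as a divisor inside the activation nonlinearity, annealed during training; a hedged sketch of that interpretation (the memo's exact formalism may differ):

        import numpy as np

        def neuron(x, w, T):
            """Sigmoid neuron with temperature T; lower T sharpens the response."""
            return 1.0 / (1.0 + np.exp(-np.dot(w, x) / T))

        x, w = np.array([0.4, -0.2]), np.array([1.5, 0.8])
        for T in [2.0, 1.0, 0.25]:          # annealing the temperature over training
            print(T, neuron(x, w, T))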

  1. Evaluation of Formal Training Programmes in Greek Organisations

    ERIC Educational Resources Information Center

    Diamantidis, Anastasios D.; Chatzoglou, Prodromos D.

    2012-01-01

    Purpose: The purpose of the paper is to highlight the training factors that mostly affect trainees' perception of learning and training usefulness. Design/methodology/approach: A new research model is proposed exploring the relationships between a trainer's performance, training programme components, outcomes of the learning process and training…

  2. Comparing OECD Educational Models through the Prism of PISA

    ERIC Educational Resources Information Center

    Bulle, Nathalie

    2011-01-01

    The PISA survey influences educational policies through an international competitive process which is not wholly rationally-oriented. Firstly, PISA league tables act normatively upon the definition of formal educational aims while the survey tests cannot evaluate the educational systems' relative strengths with regards to such aims. We argue that…

  3. New methods for modeling stream temperature using high resolution LiDAR, solar radiation analysis and flow accumulated values

    EPA Science Inventory

    In-stream temperature directly affects a variety of biotic organisms, communities and processes. Changes in stream temperature can render formerly suitable habitat unsuitable for aquatic organisms, particularly native cold-water species that are not able to adjust. In order to an...

  4. Overview of a Linguistic Theory of Design. AI Memo 383A.

    ERIC Educational Resources Information Center

    Miller, Mark L.; Goldstein, Ira P.

    The SPADE theory, which uses linguistic formalisms to model the planning and debugging processes of computer programming, was simultaneously developed and tested in three separate contexts--computer uses in education, automatic programming (a traditional artificial intelligence arena), and protocol analysis (the domain of information processing…

  5. Improving the All-Hazards Homeland Security Enterprise Through the Use of an Emergency Management Intelligence Model

    DTIC Science & Technology

    2013-09-01

    Office of the Inspector General; OSINT: Open Source Intelligence; PPD: Presidential Policy Directive; SIGINT: Signals Intelligence; SLFC: State/Local Fusion... Geospatial Intelligence (GEOINT) from Geographic Information Systems (GIS), and Open Source Intelligence (OSINT) from Social Media. GIS is widely... and monitor make it a feasible tool to capitalize on for OSINT. A formalized EM intelligence process would help expedite the processing of such

  6. A patient workflow management system built on guidelines.

    PubMed Central

    Dazzi, L.; Fassino, C.; Saracco, R.; Quaglini, S.; Stefanelli, M.

    1997-01-01

    To provide high-quality, shared, and distributed medical care, clinical and organizational issues need to be integrated. This work describes a methodology for developing a Patient Workflow Management System, based on a detailed model of both the medical work process and the organizational structure. We assume that the medical work process is represented through clinical practice guidelines, and that an ontological description of the organization is available. Thus, we developed tools 1) to acquire the medical knowledge contained in a guideline, 2) to translate the derived formalized guideline into a computational formalism, specifically a Petri net, and 3) to maintain different representation levels. The high-level representation guarantees that the patient workflow follows the guideline prescriptions, while the low level takes into account the characteristics of the specific organization and allows allocating resources for managing a specific patient in daily practice. PMID:9357606

  7. Ontology for assessment studies of human-computer-interaction in surgery.

    PubMed

    Machno, Andrej; Jannin, Pierre; Dameron, Olivier; Korb, Werner; Scheuermann, Gerik; Meixensberger, Jürgen

    2015-02-01

    New technologies improve modern medicine, but may result in unwanted consequences. Some occur due to inadequate human-computer interactions (HCI). To assess these consequences, an investigation model was developed to facilitate the planning, implementation and documentation of studies of HCI in surgery. The investigation model was formalized in the Unified Modeling Language and implemented as an ontology. Four different top-level ontologies were compared: Object-Centered High-level Reference, Basic Formal Ontology, General Formal Ontology (GFO) and Descriptive Ontology for Linguistic and Cognitive Engineering, according to the three major requirements of the investigation model: the domain-specific view, the experimental scenario and the representation of fundamental relations. Furthermore, this article emphasizes the distinction between an "information model" and a "model of meaning" and shows the advantages of implementing the model in an ontology rather than in a database. The results of the comparison show that GFO fits the defined requirements adequately: the domain-specific view and the fundamental relations can be implemented directly; only the representation of the experimental scenario requires minor extensions. The other candidates require wide-ranging extensions, concerning at least one of the major implementation requirements. Therefore, GFO was selected to realize an appropriate implementation of the developed investigation model. The ensuing development considered the concrete implementation of further model aspects and entities: sub-domains, space and time, processes, properties, relations and functions. The investigation model and its ontological implementation provide a modular guideline for study planning, implementation and documentation within the area of HCI research in surgery. This guideline helps to navigate through the whole study process as a kind of standard or good clinical practice, based on the foundational frameworks involved. Furthermore, it allows researchers to acquire a structured description of the assessment methods applied within a certain surgical domain, and to use this information in their own study designs or to compare different studies. The investigation model and the corresponding ontology can be used further to create new knowledge bases of HCI assessment in surgery. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Initiating Formal Requirements Specifications with Object-Oriented Models

    NASA Technical Reports Server (NTRS)

    Ampo, Yoko; Lutz, Robyn R.

    1994-01-01

    This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.

  9. Understanding Informality and Formality in Learning.

    ERIC Educational Resources Information Center

    Colley, Helen; Hodkinson, Phil; Malcolm, Janice

    2003-01-01

    Reviews definitions of and debates over distinctions among formal, informal, and nonformal learning. Outlines questions about four aspects of formality/informality with which to analyze learning situations: process, location/setting, purposes, and content. (SK)

  10. Ontology development for provenance tracing in National Climate Assessment of the US Global Change Research Program

    NASA Astrophysics Data System (ADS)

    Fu, Linyun; Ma, Xiaogang; Zheng, Jin; Goldstein, Justin; Duggan, Brian; West, Patrick; Aulenbach, Steve; Tilmes, Curt; Fox, Peter

    2014-05-01

    This poster will show how we used a case-driven iterative methodology to develop an ontology to represent the content structure and the associated provenance information in a National Climate Assessment (NCA) report of the US Global Change Research Program (USGCRP). We applied the W3C PROV-O ontology to implement a formal representation of provenance. We argue that the use case-driven, iterative development process and the application of a formal provenance ontology help efficiently incorporate domain knowledge from earth and environmental scientists in a well-structured model interoperable in the context of the Web of Data.
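
    As a flavor of what a PROV-O assertion looks like in practice, a minimal sketch using the rdflib Python library; the NCA resource URIs are hypothetical:

        from rdflib import Graph, Namespace, URIRef

        PROV = Namespace("http://www.w3.org/ns/prov#")
        g = Graph()
        g.bind("prov", PROV)

        # Hypothetical NCA resources: a report figure derived from a dataset.
        figure = URIRef("urn:example:nca:figure-2-1")
        dataset = URIRef("urn:example:nca:dataset-temperature")
        g.add((figure, PROV.wasDerivedFrom, dataset))

        print(g.serialize(format="turtle"))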

  11. Design of high reliability organizations in health care

    PubMed Central

    Carroll, J S; Rudolph, J W

    2006-01-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self‐understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self‐design for safety and reliability. PMID:17142607

  12. What is the right formalism to search for resonances?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mikhasenko, M.; Pilloni, A.; Nys, J.

    Hadron decay chains constitute one of the main sources of information on the QCD spectrum. We discuss the differences between several partial wave analysis formalisms used in the literature to build the amplitudes. We match the helicity amplitudes to the covariant tensor basis. Thereby, we pay attention to the analytical properties of the amplitudes and separate singularities of kinematical and dynamical nature. We study the analytical properties of the spin-orbit (LS) formalism, and some of the covariant tensor approaches. In particular, we explicitly build the amplitudes for the B → ψπK and B → D̄ππ decays, and show that the energy dependence of the covariant approach is model dependent. We also show that the usual recursive construction of covariant tensors explicitly violates crossing symmetry, which would lead to different resonance parameters extracted from scattering and decay processes.

  13. What is the right formalism to search for resonances?

    DOE PAGES

    Mikhasenko, M.; Pilloni, A.; Nys, J.; ...

    2018-03-17

    Hadron decay chains constitute one of the main sources of information on the QCD spectrum. We discuss the differences between several partial wave analysis formalisms used in the literature to build the amplitudes. We match the helicity amplitudes to the covariant tensor basis. Thereby, we pay attention to the analytical properties of the amplitudes and separate singularities of kinematical and dynamical nature. We study the analytical properties of the spin-orbit (LS) formalism, and some of the covariant tensor approaches. In particular, we explicitly build the amplitudes for the B → ψπK and B → D̄ππ decays, and show that the energy dependence of the covariant approach is model dependent. We also show that the usual recursive construction of covariant tensors explicitly violates crossing symmetry, which would lead to different resonance parameters extracted from scattering and decay processes.

  14. Unity and diversity in human language

    PubMed Central

    Fitch, W. Tecumseh

    2011-01-01

    Human language is both highly diverse—different languages have different ways of achieving the same functional goals—and easily learnable. Any language allows its users to express virtually any thought they can conceptualize. These traits render human language unique in the biological world. Understanding the biological basis of language is thus both extremely challenging and fundamentally interesting. I review the literature on linguistic diversity and language universals, suggesting that an adequate notion of ‘formal universals’ provides a promising way to understand the facts of language acquisition, offering order in the face of the diversity of human languages. Formal universals are cross-linguistic generalizations, often of an abstract or implicational nature. They derive from cognitive capacities to perceive and process particular types of structures and biological constraints upon integration of the multiple systems involved in language. Such formal universals can be understood on the model of a general solution to a set of differential equations; each language is one particular solution. An explicit formal conception of human language that embraces both considerable diversity and underlying biological unity is possible, and fully compatible with modern evolutionary theory. PMID:21199842

  15. A UML approach to process modelling of clinical practice guidelines for enactment.

    PubMed

    Knape, T; Hederman, L; Wade, V P; Gargan, M; Harris, C; Rahman, Y

    2003-01-01

    Although clinical practice guidelines (CPGs) have been suggested as a means of encapsulating best practice in evidence-based medical treatment, their usage in clinical environments has been disappointing. Criticisms of guideline representations have been that they are predominantly narrative and are difficult to incorporate into clinical information systems. This paper analyses the use of UML process modelling techniques for guideline representation and proposes the automated generation of executable guidelines using XMI. This hybrid UML-XMI approach provides flexible authoring of guideline decision and control structures whilst integrating appropriate data flow. It also uses an open XMI standard interface to allow the use of authoring tools and process control systems from multiple vendors. The paper first surveys CPG modelling formalisms, followed by a brief introduction to process modelling in UML. Furthermore, the modelling of CPGs in UML is presented, leading to a case study of encoding a diabetes mellitus CPG using UML.

  16. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport and transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and is available as open-source from http://www.narrator-tool.org.

  17. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background: Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport and transformation as well as biological information processing. Results: Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion: Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and is available as open-source from http://www.narrator-tool.org. PMID:17389034

  18. Equilibrium relations and bipolar cognitive mapping for online analytical processing with applications in international relations and strategic decision support.

    PubMed

    Zhang, Wen-Ran

    2003-01-01

    Bipolar logic, bipolar sets, and equilibrium relations are proposed for bipolar cognitive mapping and visualization in online analytical processing (OLAP) and online analytical mining (OLAM). As cognitive models, cognitive maps (CMs) hold great potential for clustering and visualization. Due to the lack of a formal mathematical basis, however, CM-based OLAP and OLAM have not gained popularity. Compared with existing approaches, bipolar cognitive mapping has a number of advantages. First, bipolar CMs are formal logical models as well as cognitive models. Second, equilibrium relations (with polarized reflexivity, symmetry, and transitivity), as bipolar generalizations and fusions of equivalence relations, provide a theoretical basis for bipolar visualization and coordination. Third, an equilibrium relation or CM induces bipolar partitions that distinguish disjoint coalition subsets not involved in any conflict, disjoint coalition subsets involved in a conflict, disjoint conflict subsets, and disjoint harmony subsets. Finally, equilibrium energy analysis leads to harmony and stability measures for strategic decision and multiagent coordination. Thus, this work bridges a gap for CM-based clustering and visualization in OLAP and OLAM. Basic ideas are illustrated with example CMs in international relations.

  19. Possible Effects of Synaptic Imbalances on Oligodendrocyte–Axonic Interactions in Schizophrenia: A Hypothetical Model

    PubMed Central

    Mitterauer, Bernhard J.; Kofler-Westergren, Birgitta

    2011-01-01

    A model of glial–neuronal interactions is proposed that could explain the demyelination identified in brains with schizophrenia. It is based on two hypotheses: (1) that glia–neuron systems are functionally viable and important for normal brain function, and (2) that disruption of this postulated function disturbs the glial categorization function, as shown by formal analysis. According to this model, in schizophrenia receptors on astrocytes in glial–neuronal synaptic units are not functional, losing their modulatory influence on synaptic neurotransmission. Hence, an unconstrained neurotransmission flux occurs that hyperactivates the axon and floods the cognate neurotransmitter receptors on oligodendrocytes. The excess of neurotransmitters may have a toxic effect on oligodendrocytes and myelin, causing demyelination. In parallel, an increasing impairment of axons may disconnect neuronal networks. It is formally shown how oligodendrocytes normally categorize axonic information processing via their processes. Demyelination decomposes the oligodendrocyte–axonic system, making it incapable of generating categories of information. This incoherence may be responsible for symptoms of disorganization in schizophrenia, such as thought disorder, inappropriate affect and incommunicable motor behavior. In parallel, the loss of oligodendrocytes affects gap junctions in the panglial syncytium, presumably responsible for memory impairment in schizophrenia. PMID:21647404

  20. Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE

    NASA Astrophysics Data System (ADS)

    Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.

    2016-08-01

    Introduction - Business processes are gradually becoming a tool that allows organizations to manage employees and documents more efficiently, and most current work and publications focus on these directions. However, business processes are still poorly implemented in public institutions, where it is very difficult to formalize the main existing processes. We attempt to build a system of business processes for a state agency, the Russian Social Insurance Fund (SIF), where virtually all processes yield the same output for different inputs: a public service. The parameters of state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the SIF, formulates requirements for its business processes, justifies the choice of software for business process modeling, creates working models in the Runa WFE system, and optimizes the model of one of the main business processes of the SIF. The result of the work in Runa WFE is an optimized model of an SIF business process.

  1. Systems, methods and apparatus for pattern matching in procedure development and verification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  2. New insights into the human body iron metabolism analyzed by a Petri net based approach.

    PubMed

    Sackmann, Andrea; Formanowicz, Dorota; Formanowicz, Piotr; Blazewicz, Jacek

    2009-04-01

    Iron homeostasis is one of the most important biochemical processes in the human body. Despite this fact, the process is not fully understood and until recently only rough descriptions of parts of the process could be found in the literature. Here, an extension of the recently published formal model of the main part of the process is presented. This extension consists in including all known mechanisms of hepcidin regulation. Hepcidin is a hormone synthesized in the liver which is mainly responsible for an inhibition of iron absorption in the small intestine during an inflammatory process. The model is expressed in the language of Petri net theory which allows for its relatively easy analysis and simulation.
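
    As a flavor of the Petri-net token game underlying such models, a minimal sketch; the places and the single transition are invented for illustration and are far simpler than the published iron-homeostasis model:

        # Minimal Petri net: a marking maps each place to its token count.
        marking = {"dietary_Fe": 2, "plasma_Fe": 0, "hepcidin": 1}

        # Each transition consumes tokens from input places, produces to outputs.
        transitions = {
            "absorb": ({"dietary_Fe": 1}, {"plasma_Fe": 1}),
        }

        def enabled(marking, t):
            inputs, _ = transitions[t]
            return all(marking[p] >= n for p, n in inputs.items())

        def fire(marking, t):
            inputs, outputs = transitions[t]
            for p, n in inputs.items():
                marking[p] -= n
            for p, n in outputs.items():
                marking[p] += n

        while enabled(marking, "absorb"):
            fire(marking, "absorb")
        print(marking)   # {'dietary_Fe': 0, 'plasma_Fe': 2, 'hepcidin': 1}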

  3. A Model of Workflow Composition for Emergency Management

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    Commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. A software system for constructing and composing business process resources is implemented and integrated into an Emergency Plan Management Application System.

  4. Theory of earthquakes interevent times applied to financial markets

    NASA Astrophysics Data System (ADS)

    Jagielski, Maciej; Kutner, Ryszard; Sornette, Didier

    2017-10-01

    We analyze the probability density function (PDF) of waiting times between financial loss exceedances. The empirical PDFs are fitted with the self-excited Hawkes conditional Poisson process with a long power law memory kernel. The Hawkes process is the simplest extension of the Poisson process that takes into account how past events influence the occurrence of future events. By analyzing the empirical data for 15 different financial assets, we show that the formalism of the Hawkes process used for earthquakes can successfully model the PDF of interevent times between successive market losses.
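
    The self-excited conditional intensity at the heart of this approach has the generic form below, written here with one common power-law (Omori-like) kernel parameterization; the paper's exact kernel may differ in detail:

        \lambda(t \mid \mathcal{H}_t) = \mu + \sum_{t_i < t} \frac{K}{(t - t_i + c)^{1 + \theta}}

    where \mu is the background rate of loss exceedances and each past event at time t_i raises the rate of future ones with a slowly decaying, long-memory influence.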

  5. Strategic ambiguities in the process of consent: role of the family in decisions to forgo life-sustaining treatment for incompetent elderly patients.

    PubMed

    Tse, Chun-yan; Tao, Julia

    2004-04-01

    This paper evaluates the Hong Kong approach to consent regarding the forgoing of life-sustaining treatment for incompetent elderly patients. It analyzes the contextualized approach in the Hong Kong process-based, consensus-building model, in contrast to other role-based models which emphasize the establishment of a system of formal laws and a clear locus of decisional authority. Without embracing relativism, the paper argues that the Hong Kong model offers an instructive example of how strategic ambiguities can both make good sense within particular cultural context and serve important moral goals.

  6. Model of a programmable quantum processing unit based on a quantum transistor effect

    NASA Astrophysics Data System (ADS)

    Ablayev, Farid; Andrianov, Sergey; Fetisov, Danila; Moiseev, Sergey; Terentyev, Alexandr; Urmanchev, Andrey; Vasiliev, Alexander

    2018-02-01

    In this paper we propose a model of a programmable quantum processing device realizable with existing nano-photonic technologies. It can be viewed as a basis for new high-performance hardware architectures. Protocols for the physical implementation of the device, based on controlled photon transfer and atomic transitions, are presented. These protocols are designed for executing basic single-qubit and multi-qubit gates forming a universal set. We analyze the possible operation of this quantum computer scheme. We then formalize the physical architecture by a mathematical model of a Quantum Processing Unit (QPU), which we use as a basis for the Quantum Programming Framework. This framework makes it possible to perform universal quantum computations in a multitasking environment.
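
    Single-qubit gates of the kind such a QPU executes are 2x2 unitaries acting on the state vector; a small numpy illustration of the mathematics (not of the photonic protocol itself):

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

        state = np.array([1.0, 0.0])     # qubit initialized to |0>
        state = H @ state                # equal superposition of |0> and |1>
        probs = np.abs(state) ** 2
        print(probs)                     # [0.5 0.5]: measurement outcomes equiprobable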

  7. Discrete-Slots Models of Visual Working-Memory Response Times

    PubMed Central

    Donkin, Christopher; Nosofsky, Robert M.; Gold, Jason M.; Shiffrin, Richard M.

    2014-01-01

    Much recent research has aimed to establish whether visual working memory (WM) is better characterized by a limited number of discrete all-or-none slots or by a continuous sharing of memory resources. To date, however, researchers have not considered the response-time (RT) predictions of discrete-slots versus shared-resources models. To complement the past research in this field, we formalize a family of mixed-state, discrete-slots models for explaining choice and RTs in tasks of visual WM change detection. In the tasks under investigation, a small set of visual items is presented, followed by a test item in 1 of the studied positions for which a change judgment must be made. According to the models, if the studied item in that position is retained in 1 of the discrete slots, then a memory-based evidence-accumulation process determines the choice and the RT; if the studied item in that position is missing, then a guessing-based accumulation process operates. Observed RT distributions are therefore theorized to arise as probabilistic mixtures of the memory-based and guessing distributions. We formalize an analogous set of continuous shared-resources models. The model classes are tested on individual subjects with both qualitative contrasts and quantitative fits to RT-distribution data. The discrete-slots models provide much better qualitative and quantitative accounts of the RT and choice data than do the shared-resources models, although there is some evidence for “slots plus resources” when memory set size is very small. PMID:24015956
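
    The mixture structure at the heart of the discrete-slots account can be stated compactly. With K slots and N studied items, the probed item occupies a slot with probability p, so the predicted response-time density is a schematic mixture (the paper's specific accumulator parameterizations are richer than this):

        f(t) = p \, f_{\text{mem}}(t) + (1 - p) \, f_{\text{guess}}(t), \qquad p = \min(K / N, \, 1)

    where f_mem and f_guess are the RT densities of the memory-based and guessing-based accumulation processes, respectively.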

  8. Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.

    PubMed

    Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk

    2012-02-01

    Therapy decision making and support in medicine deals with uncertainty and needs to take into account the patient's clinical parameters, the context of illness, and the medical knowledge of the physician and guidelines to recommend a treatment therapy. This research study is focused on the formalization of medical knowledge using a cognitive process, called Fuzzy Cognitive Maps (FCMs), and a semantic web approach. The FCM technique is capable of dealing with situations that include uncertain descriptions, using a procedure similar to human reasoning. Thus, it was selected for modeling and knowledge integration of clinical practice guidelines. Semantic web tools were used to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in the adult community was examined with the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings; 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and physicians' opinions. The results have shown that the suggested approach formalizes medical knowledge efficiently and provides a front-end decision on antibiotic suggestions for cystitis. In conclusion, modeling medical knowledge/therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful. Copyright © 2011 Elsevier Inc. All rights reserved.
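
    To make the FCM mechanics concrete: concept activations are iterated as a squashed weighted sum of the activations of causally connected concepts. The sketch below uses toy weights and a sigmoid squashing function; it is not the UTI guideline model itself.

    ```python
    import numpy as np

    def fcm_step(a, W):
        """One FCM iteration: a_i(t+1) = f(a_i(t) + sum_j w_ji * a_j(t))."""
        return 1.0 / (1.0 + np.exp(-(a + W.T @ a)))   # sigmoid squashing

    # Toy map: 3 clinical-factor concepts feeding 1 therapy concept.
    # W[i, j] = causal weight from concept i to concept j (values invented).
    W = np.array([[0.0, 0.0, 0.0,  0.7],
                  [0.0, 0.0, 0.0,  0.4],
                  [0.0, 0.0, 0.0, -0.6],
                  [0.0, 0.0, 0.0,  0.0]])

    a = np.array([1.0, 0.5, 0.8, 0.0])    # initial activations
    for _ in range(20):                    # iterate toward a fixed point
        a = fcm_step(a, W)
    print(np.round(a, 3))                  # last entry: therapy-concept activation
    ```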

  9. A Process Algebraic Approach to Software Architecture Design

    NASA Astrophysics Data System (ADS)

    Aldini, Alessandro; Bernardo, Marco; Corradini, Flavio

    Process algebra is a formal tool for the specification and the verification of concurrent and distributed systems. It supports compositional modeling through a set of operators able to express concepts like sequential composition, alternative composition, and parallel composition of action-based descriptions. It also supports mathematical reasoning via a two-level semantics, which formalizes the behavior of a description by means of an abstract machine obtained from the application of structural operational rules and then introduces behavioral equivalences able to relate descriptions that are syntactically different. In this chapter, we present the typical behavioral operators and operational semantic rules for a process calculus in which no notion of time, probability, or priority is associated with actions. Then, we discuss the three most studied approaches to the definition of behavioral equivalences - bisimulation, testing, and trace - and we illustrate their congruence properties, sound and complete axiomatizations, modal logic characterizations, and verification algorithms. Finally, we show how these behavioral equivalences and some of their variants are related to each other on the basis of their discriminating power.
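
    A minimal executable sketch of the ideas in this chapter is given below: a few behavioral operators (inaction, action prefix, choice, interleaving parallel composition) and a function that derives labeled transitions from structural operational rules. Synchronization, time, probability, and priority are deliberately omitted, and the encoding is an illustrative assumption rather than an implementation of any standard calculus.

    ```python
    from dataclasses import dataclass

    class Proc: pass

    @dataclass(frozen=True)
    class Nil(Proc): pass                       # inactive process, no transitions

    @dataclass(frozen=True)
    class Prefix(Proc):                         # a.P : perform a, then behave as P
        action: str
        cont: Proc

    @dataclass(frozen=True)
    class Choice(Proc):                         # P + Q : alternative composition
        left: Proc
        right: Proc

    @dataclass(frozen=True)
    class Par(Proc):                            # P | Q : interleaving parallelism
        left: Proc
        right: Proc

    def transitions(p):
        """All (action, successor) pairs derivable from the SOS rules."""
        if isinstance(p, Prefix):
            yield p.action, p.cont
        elif isinstance(p, Choice):
            yield from transitions(p.left)
            yield from transitions(p.right)
        elif isinstance(p, Par):
            for a, q in transitions(p.left):
                yield a, Par(q, p.right)
            for a, q in transitions(p.right):
                yield a, Par(p.left, q)

    # (a.0 + b.0) | c.0 has three initial transitions: a, b, and c.
    p = Par(Choice(Prefix("a", Nil()), Prefix("b", Nil())), Prefix("c", Nil()))
    for action, succ in transitions(p):
        print(action, "->", succ)
    ```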

  10. The Influence of Informal Power Structures on School Board-Teacher Union Contract Negotiations.

    ERIC Educational Resources Information Center

    Miller-Whitehead, Marie

    A study examined behaviors of participants trained in a nonadversarial model of contract negotiation, focusing on possible influences of formal and informal power structures, written and unwritten rules, and firmly entrenched adversarial behavior on the bargaining process. Participants were representatives of a district's teacher union and board…

  11. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  12. Towards Model-Driven End-User Development in CALL

    ERIC Educational Resources Information Center

    Farmer, Rod; Gruba, Paul

    2006-01-01

    The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…

  13. New methods for modeling stream temperature using high resolution LiDAR, solar radiation analysis and flow accumulated values to predict stream temperature

    EPA Science Inventory

    In-stream temperature directly affects a variety of biotic organisms, communities and processes. Changes in stream temperature can render formerly suitable habitat unsuitable for aquatic organisms, particularly native cold-water species that are not able to adjust. In order to...

  14. Drawing the Line: The Cultural Cartography of Utilization Recommendations for Mental Health Problems

    ERIC Educational Resources Information Center

    Olafsdottir, Sigrun; Pescosolido, Bernice A.

    2009-01-01

    In the 1990s, sociologists began to rethink the failure of utilization models to explain whether and why individuals accessed formal treatment systems. This effort focused on reconceptualizing the underlying assumptions and processes that shaped utilization patterns. While we have built a better understanding of how social networks structure…

  15. From ePortfolios to iPortfolios: The Find, Refine, Design, and Bind Model

    ERIC Educational Resources Information Center

    Foti, Sebastian; Ring, Gail L.

    2008-01-01

    During the past two decades, educational institutions around the world began formalizing the process of collecting student work as a means of showcasing student accomplishments and ultimately providing students a forum for reflecting on their accomplishments. In this article, the authors propose a redefinition of the electronic portfolio…

  16. On Synchronization Primitive Systems.

    DTIC Science & Technology

    The report studies the question: what synchronization primitive should be used to handle inter-process communication. A formal model is presented... between these synchronization primitives. Although only four synchronization primitives are compared, the general methods can be used to compare other... synchronization primitives. Moreover, in the definitions of these synchronization primitives, conditional branches are explicitly allowed. In addition…

  17. Addressing Pediatric Health Concerns through School-Based Consultation

    ERIC Educational Resources Information Center

    Truscott, Stephen D.; Albritton, Kizzy

    2011-01-01

    In schools, the term "consultation" has multiple meanings. Often it is used to describe a quick, informal process of advice giving between teachers and/or school specialists. As a formal discipline, School-Based Consultation (SBC) is an indirect service delivery model that involves two or more parties working together to benefit students. Most…

  18. Transactional, Cooperative, and Communal: Relating the Structure of Engineering Engagement Programs with the Nature of Partnerships

    ERIC Educational Resources Information Center

    Thompson, Julia D.; Jesiek, Brent K.

    2017-01-01

    This paper examines how the structural features of engineering engagement programs (EEPs) are related to the nature of their service-learning partnerships. "Structure" refers to formal and informal models, processes, and operations adopted or used to describe engagement programs, while "nature" signifies the quality of…

  19. Need for Formal Specialization in Pharmacy in Canada: A Survey of Hospital Pharmacists

    PubMed Central

    Penm, Jonathan; MacKinnon, Neil J; Jorgenson, Derek; Ying, Jun; Smith, Jennifer

    2016-01-01

    Background The Blueprint for Pharmacy was a collaborative initiative involving all of the major pharmacy associations in Canada. It aimed to coordinate, facilitate, and be a catalyst for changes required to align pharmacy practice with the health care needs of Canadians. In partial fulfilment of this mandate, a needs assessment for specialist certification for pharmacists was conducted. Objective To conduct a secondary analysis of data from the needs assessment to determine the perceptions of hospital pharmacists regarding a formal certification process for pharmacist specialties in Canada. Methods A survey was developed in consultation with the Blueprint for Pharmacy Specialization Project Advisory Group and other key stakeholders. It was distributed electronically, in English and French, to Canadian pharmacists identified through national and provincial pharmacy organizations (survey period January 15 to February 12, 2015). Data for hospital pharmacists were extracted for this secondary analysis. Multivariable logistic regression analyses were conducted to characterize those respondents who supported the certification process and those intending to become certified if a Canadian process were introduced. Results A total of 640 responses were received from hospital pharmacists. Nearly 85% of the respondents (543/640 [84.8%]) supported a formal certification process for pharmacist specialization, and more than 70% (249/349 [71.3%]) indicated their intention to obtain specialty certification if a Canadian process were introduced. Respondents believed that the main barriers to developing such a system were lack of reimbursement models, the time required, and lack of public awareness of pharmacist specialties. They felt that the most important factors for an optimal certification process were a consistent definition of pharmacist specialty practice and consistent recognition of pharmacist specialty practice across Canada. Multiple regression analysis showed that female respondents were more likely to support a formal certification process (odds ratio [OR] 2.6, 95% confidence interval [CI] 1.2–5.7). Also, those who already specialized in pharmacotherapy were more likely to support mandatory certification (OR 2.6, 95% CI 1.1–6.1). Conclusions Hospital pharmacists who responded to this survey overwhelmingly supported certification for pharmacist specialization in Canada. Questions remain about the feasibility of establishing a pharmacist specialization system in Canada. PMID:27826153

  20. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses is utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  1. The Domain-Specific Software Architecture Program

    DTIC Science & Technology

    1992-06-01

    Kang, K.C.; Cohen, S.C.; Jess, J.A.; Novak, W.E.; Peterson, A.S. Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21, ADA235785). ...perspective of a controls engineer solving a problem using an iterative process of simulation and analysis... for schedulability analysis and Markov processes for the determination of reliability. Software architectures are derived from these formal models.

  2. A general U-block model-based design procedure for nonlinear polynomial control systems

    NASA Astrophysics Data System (ADS)

    Zhu, Q. M.; Zhao, D. Y.; Zhang, Jianhua

    2016-10-01

    The U-model concept (in terms of 'providing concise and applicable solutions for complex problems') and a corresponding basic U-control design algorithm originated in the first author's PhD thesis. The term U-model first appeared (not rigorously defined) in another journal paper by the first author, which established a framework for using linear polynomial control system design approaches to design nonlinear polynomial control systems (in brief, linear polynomial approaches → nonlinear polynomial plants). This paper represents the next milestone work - using linear state-space approaches to design nonlinear polynomial control systems (in brief, linear state-space approaches → nonlinear polynomial plants). The overall aim of the study is to establish a framework, defined as the U-block model, which provides a generic prototype for using linear state-space-based approaches to design control systems with smooth nonlinear plants/processes described by polynomial models. To analyse feasibility and effectiveness, the sliding mode control design approach is selected as an exemplary case study. Numerical simulation studies provide a user-friendly step-by-step procedure for readers/users interested in their ad hoc applications. This is the first paper to present U-model-oriented control system design in a formal way and to study the associated properties and theorems; the previous publications, in the main, have been algorithm-based studies and simulation demonstrations. In some sense, this paper can be treated as a landmark for U-model-based research, moving from the intuitive/heuristic stage to rigorous/formal/comprehensive studies.
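
    The essence of the U-model approach, as described in the abstract, is to express the plant output as a polynomial in the current control input so that controller synthesis reduces to root solving. The sketch below shows only that inversion step on a toy polynomial plant; the coefficients and the root-selection rule are illustrative assumptions, not the paper's design procedure.

    ```python
    import numpy as np

    def u_model_control(lambdas, y_desired):
        """Solve lambda_0 + lambda_1*u + ... + lambda_J*u^J = y_desired for u.
        Assumes at least one real root exists; picks the smallest-magnitude one."""
        coeffs = list(lambdas)
        coeffs[0] -= y_desired                  # move the target to the left side
        roots = np.roots(coeffs[::-1])          # np.roots expects highest power first
        real = roots[np.abs(roots.imag) < 1e-9].real
        return real[np.argmin(np.abs(real))]

    # Toy plant: y = 1 + 2u + 0.5u^2; drive the output toward y* = 3.
    u = u_model_control([1.0, 2.0, 0.5], y_desired=3.0)
    print(f"u = {u:.4f}, check y(u) = {1 + 2*u + 0.5*u**2:.4f}")
    ```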

  3. Validity of Assessment and Recognition of Non-Formal and Informal Learning Achievements in Higher Education

    ERIC Educational Resources Information Center

    Kaminskiene, Lina; Stasiunaitiene, Egle

    2013-01-01

    The article identifies the validity of assessment of non-formal and informal learning achievements (NILA) as one of the key factors for encouraging further development of the process of assessing and recognising non-formal and informal learning achievements in higher education. The authors analyse why the recognition of non-formal and informal…

  4. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  5. On Design Mining: Coevolution and Surrogate Models.

    PubMed

    Preen, Richard J; Bull, Larry

    2017-01-01

    Design mining is the use of computational intelligence techniques to iteratively search and model the attribute space of physical objects evaluated directly through rapid prototyping to meet given objectives. It enables the exploitation of novel materials and processes without formal models or complex simulation. In this article, we focus upon the coevolutionary nature of the design process when it is decomposed into concurrent sub-design-threads due to the overall complexity of the task. Using an abstract, tunable model of coevolution, we consider strategies to sample subthread designs for whole-system testing and how best to construct and use surrogate models within the coevolutionary scenario. Drawing on our findings, we then describe the effective design of an array of six heterogeneous vertical-axis wind turbines.
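
    One way to picture the surrogate-model component is as a cheap predictor, fit to previously (expensively) evaluated designs, that pre-screens candidates so only the most promising one is prototyped. The sketch below uses a k-nearest-neighbour surrogate inside a simple mutation loop; the objective function stands in for a physical prototype test, and everything here is an illustrative assumption rather than the authors' wind-turbine pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def expensive_fitness(x):                 # stand-in for a rapid-prototyping test
        return -np.sum((x - 0.3) ** 2)

    def surrogate(x, archive_X, archive_y, k=3):
        """k-NN prediction of fitness from the archive of evaluated designs."""
        d = np.linalg.norm(archive_X - x, axis=1)
        return archive_y[np.argsort(d)[:k]].mean()

    X = rng.random((10, 2))                   # initial evaluated designs
    y = np.array([expensive_fitness(x) for x in X])
    for _ in range(30):
        parent = X[np.argmax(y)]
        cands = parent + rng.normal(0, 0.1, size=(20, 2))      # mutated candidates
        best = cands[np.argmax([surrogate(c, X, y) for c in cands])]
        X = np.vstack([X, best])                               # evaluate only the
        y = np.append(y, expensive_fitness(best))              # surrogate's pick
    print("best design:", X[np.argmax(y)].round(3), "fitness:", y.max().round(4))
    ```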

  6. A model of the human observer and decision maker

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1981-01-01

    The decision process is described in terms of classical sequential decision theory, by considering the hypothesis that an abnormal condition has occurred by means of a generalized likelihood ratio test. For this, a sufficient statistic is provided by the innovation sequence, which is the result of the perception and information-processing submodel of the human observer. On the basis of only two model parameters, the model predicts the decision speed/accuracy trade-off and various attentional characteristics. A preliminary test of the model for single-variable failure detection tasks resulted in a very good fit of the experimental data. In a formal validation program, a variety of multivariable failure detection tasks was investigated and the predictive capability of the model was demonstrated.
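
    The decision stage can be sketched as a sequential likelihood-ratio test on the innovation sequence: innovations are zero-mean under normal conditions, and a failure introduces a bias. The Gaussian log-likelihood-ratio accumulator below is a hedged illustration of that idea; the bias, noise, and threshold values are assumptions, not the paper's two fitted parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def sequential_detector(innovations, mu1=1.0, sigma=1.0, threshold=5.0):
        """Accumulate the log-likelihood ratio of 'failure' (mean mu1)
        versus 'normal' (mean 0) over the innovation sequence."""
        llr = 0.0
        for t, v in enumerate(innovations):
            llr += (mu1 * v - 0.5 * mu1**2) / sigma**2   # Gaussian LLR increment
            if llr >= threshold:
                return t                                  # failure declared at step t
        return None                                       # no detection

    normal = rng.normal(0, 1, 200)                        # no failure present
    failed = np.concatenate([rng.normal(0, 1, 100),       # failure begins at t = 100
                             rng.normal(1.0, 1, 100)])
    print("false alarm at:", sequential_detector(normal))
    print("detection at:  ", sequential_detector(failed))
    ```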

  7. What constitutes a good hand offs in the emergency department: a patient's perspective.

    PubMed

    Downey, La Vonne; Zun, Leslie; Burke, Trena

    2013-01-01

    The aim is to determine, from the patient's perspective, what constitutes a good hand-off procedure in the emergency department (ED). The secondary purpose is to evaluate what impact a formalized hand-off had on patient knowledge, throughput and customer service. This study used a randomized controlled clinical trial involving two unique hand-off approaches and a convenience sample. The study alternated between the current hand-off process, which documented the process but not specific elements (referred to as the informal process), and one using the IPASS the BATON process (considered the formal process). Consenting patients completed a 12-question validated questionnaire on how the process was perceived by patients and on their understanding of why they waited in the ED. Statistical analysis using SPSS calculated descriptive frequencies and t-tests. In total 107 patients were enrolled: 50 in the informal and 57 in the formal group. Most patients had positive answers to the customer survey. There were significant differences between the formal and informal groups: recalling the oncoming and outgoing physician coming to the patient's bed (p = 0.000), with more formal-group than informal-group patients recalling this; the oncoming physician introducing him/herself (p = 0.01), with more from the formal group answering yes; and the physician discussing tests and their implications (p = 0.02), again favoring the formal group. This study was done at an urban inner-city ED, a fact that may have skewed its results; a comparison with suburban and rural EDs would make the results stronger. The study also reflected a very high level of customer satisfaction within the ED. This lack of variance may have meant that the correlation between customer service and hand-offs was missed or underrepresented. There was no codified observation of either those using the IPASS the BATON script or those using informal procedures, so no comparison of the level and types of information given between the two groups was done. There could also have been a bias among attending physicians who had internalized the IPASS the BATON procedures and used them even when they were assigned to the informal group. A hand-off from one physician to the next in the emergency department is best done using a formalized process. IPASS the BATON is a useful tool for hand-off in the ED, in part because it involves the patient in the process. The formal hand-off increased communication between patient and doctor, as its use increased the patient's opportunity to ask and respond to questions. The researchers evaluated an ED physician-specific hand-off process and illustrate the value and impact of involving patients in the hand-off process.

  8. Intellectual technologies in the problems of thermal power engineering control: formalization of fuzzy information processing results using the artificial intelligence methodology

    NASA Astrophysics Data System (ADS)

    Krokhin, G.; Pestunov, A.

    2017-11-01

    Operation of power stations in variable modes, and the related changes in their technical state, have made it urgent to create models for decision-making and state recognition based on diagnostics that use fuzzy logic to identify the equipment state and to manage recovery processes. There is no unified methodological approach for obtaining the relevant information in the case of fuzziness and inhomogeneity of the raw information about the equipment state. The existing methods for extracting knowledge are usually unable to ensure correspondence between the aggregate model parameters and the actual object state. The switchover of power engineering from preventive repair to repair according to the actual technical state has increased the responsibility of those who estimate the volume and the duration of the work. This may lead to inadequate diagnostics and decision-making models if the corresponding methodological preparations do not take fuzziness into account, because the state information is inherently of this kind. In this paper, we introduce a new model which formalizes the equipment state using not only exact information but fuzzy information as well. This model is more adequate to the actual state than traditional analogs, and may be used to increase the efficiency and the service period of power installations.

  9. FORMAL MODELING, MONITORING, AND CONTROL OF EMERGENCE IN DISTRIBUTED CYBER PHYSICAL SYSTEMS

    DTIC Science & Technology

    2018-02-23

    Formal Modeling, Monitoring, and Control of Emergence in Distributed Cyber-Physical Systems. University of Texas at Arlington, final report, February 2018 (reporting period April 2015 - April 2017). This project studied emergent behavior in distributed cyber-physical systems (DCPS). Emergent…

  10. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., costs to gear up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, along with a prototype tool that supports it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  11. Improving Learner Outcomes in Lifelong Education: Formal Pedagogies in Non-Formal Learning Contexts?

    ERIC Educational Resources Information Center

    Zepke, Nick; Leach, Linda

    2006-01-01

    This article explores how far research findings about successful pedagogies in formal post-school education might be used in non-formal learning contexts--settings where learning may not lead to formal qualifications. It does this by examining a learner outcomes model adapted from a synthesis of research into retention. The article first…

  12. Generating Phenotypical Erroneous Human Behavior to Evaluate Human-automation Interaction Using Model Checking

    PubMed Central

    Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.

    2012-01-01

    Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel’s zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the statespace and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
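
    The zero-order phenotypes have a simple flavor when shown on a flat action sequence. The sketch below enumerates omission, repetition, jump, and intrusion variants of a hypothetical radiation-therapy task; the paper's method operates on task-analytic models inside a model checker, so this is only an illustration of the generated error classes, with made-up action names.

    ```python
    import random

    random.seed(0)

    # Normative plan for a hypothetical radiation-therapy task (invented names).
    NORMATIVE = ["select_patient", "enter_dose", "confirm_dose", "start_beam"]
    INTRUSIONS = ["open_drawer"]                 # actions foreign to the task

    def omission(seq, i):   return seq[:i] + seq[i+1:]           # step i dropped
    def repetition(seq, i): return seq[:i+1] + [seq[i]] + seq[i+1:]  # step i doubled
    def jump(seq, i, j):    return seq[:i] + seq[j:]             # skip from i to j
    def intrusion(seq, i):  return seq[:i] + [random.choice(INTRUSIONS)] + seq[i:]

    variants = [omission(NORMATIVE, 2),          # confirm_dose omitted
                repetition(NORMATIVE, 1),        # dose entered twice
                jump(NORMATIVE, 1, 3),           # straight to start_beam
                intrusion(NORMATIVE, 3)]         # foreign act before start_beam
    for v in variants:
        print(v)
    ```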

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilat, Joseph F

    The application of the methodology developed by the GenIV International Forum's (GIF's) Proliferation Resistance and Physical Protection (PR&PP) Working Group is an expert elicitation. Although the framework of the methodology is structured and systematic, it does not by itself constitute or require a formal elicitation. However, formal elicitation can be utilized in the PR&PP context to provide a systematic, credible and transparent qualitative analysis and develop input for quantitative analyses. This section provides an overview of expert elicitations, a discussion of the role formal expert elicitations can play in the PR&PP methodology, an outline of the formal expert elicitation process, and a brief practical guide to conducting formal expert elicitations. Expert elicitation is a process utilizing knowledgeable people in cases, for example, when an assessment is needed but physically based data is absent or open to interpretation. More specifically, it can be used to: (1) predict future events; (2) provide estimates on new, rare, complex or poorly understood phenomena; (3) integrate or interpret existing information; or (4) determine what is currently known, how well it is known or what is worth learning in a field. Expert elicitation can be informal or formal. The informal application of expert judgment is frequently used. Although it can produce good results, it often provides demonstrably biased or otherwise flawed answers to problems. This, along with the absence of transparency, can result in a loss of confidence when experts speak on issues. More formal expert elicitation is a structured process that makes use of people knowledgeable in certain areas to make assessments. The reason for advocating formal use is that the quality and accuracy of expert judgment comes from the completeness of the expert's understanding of the phenomena and the process used to elicit and analyze the data. The use of a more formal process to obtain, understand and analyze expert judgment has led to an improved acceptance of expert judgment because of the rigor and transparency of the results.

  14. Decisional Conflict: Relationships Between and Among Family Context Variables in Cancer Survivors.

    PubMed

    Lim, Jung-Won; Shon, En-Jung

    2016-07-01

    Purpose/Objectives: To investigate the relationships among life stress, family functioning, family coping, reliance on formal and informal resources, and decisional conflict in cancer survivors.
    Design: Cross-sectional.
    Setting: Participants were recruited from the California Cancer Surveillance Program, hospital registries, and community agencies in southern California and Cleveland, Ohio.
    Sample: 243 European American, African American, Chinese American, and Korean American cancer survivors diagnosed with breast, colorectal, or prostate cancer.
    Methods: The merged data from an ethnically diverse cohort of cancer survivors participating in the two survey studies were used. Standardized measures were used to identify family context variables and decisional conflict.
    Main Research Variables: Life stress, family functioning, family coping, reliance on formal and informal resources, and decisional conflict.
    Findings: Structural equation modeling demonstrated that life stress was significantly associated with decisional conflict. Family functioning significantly mediated the impact of life stress on decisional conflict through family coping. Reliance on formal and informal resources moderated the relationships among the study variables.
    Conclusions: The role of the family context, which includes family functioning and coping, on decisional conflict is important in the adjustment process to make high-quality decisions in cancer survivorship care.
    Implications for Nursing: Findings present nursing practice and research implications that highlight the need for efforts to encourage and support family involvement in the decision-making process and to enhance cancer survivors' adjustment process.

  15. Comparing single- and dual-process models of memory development.

    PubMed

    Hayes, Brett K; Dunn, John C; Joubert, Amy; Taylor, Robert

    2017-11-01

    This experiment examined single-process and dual-process accounts of the development of visual recognition memory. The participants, 6-7-year-olds, 9-10-year-olds and adults, were presented with a list of pictures which they encoded under shallow or deep conditions. They then made recognition and confidence judgments about a list containing old and new items. We replicated the main trends reported by Ghetti and Angelini in that recognition hit rates increased from 6 to 9 years of age, with larger age changes following deep than shallow encoding. Formal versions of the dual-process high-threshold signal detection model and several single-process models (equal-variance signal detection, unequal-variance signal detection, mixture signal detection) were fit to the developmental data. The unequal-variance and mixture signal detection models gave a better account of the data than either of the other models. A state-trace analysis found evidence for only one underlying memory process across the age range tested. These results suggest that single-process memory models based on memory strength are a viable alternative to dual-process models for explaining memory development. © 2016 John Wiley & Sons Ltd.
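
    The contrast between these model classes can be sketched directly. Under equal-variance signal detection, old-item strengths are normally distributed with mean d'; under a mixture model, only a proportion p of old items carry memory strength while the rest behave like new items. The illustration below (toy parameters, not fits to the developmental data) shows how the two classes predict different hit rates for the same false-alarm rate.

    ```python
    from scipy.stats import norm

    # New-item strengths ~ N(0, 1) in both models; c is the response criterion.

    def hit_rate_evsd(d_prime, c):
        """Equal-variance SDT: old strengths ~ N(d', 1)."""
        return norm.sf(c - d_prime)                 # P(strength > c | old)

    def hit_rate_mixture(d_prime, c, p):
        """Mixture SDT: proportion p of old items carry strength, rest guess."""
        return p * norm.sf(c - d_prime) + (1 - p) * norm.sf(c)

    c = 0.5
    print("EVSD    hit rate:", round(hit_rate_evsd(1.5, c), 3))
    print("Mixture hit rate:", round(hit_rate_mixture(1.5, c, p=0.7), 3))
    print("False alarm rate:", round(norm.sf(c), 3))   # identical in both models
    ```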

  16. A sharp interface model for void growth in irradiated materials

    NASA Astrophysics Data System (ADS)

    Hochrainer, Thomas; El-Azab, Anter

    2015-03-01

    A thermodynamic formalism for the interaction of point defects with free surfaces in single-component solids has been developed and applied to the problem of void growth by absorption of point defects in irradiated metals. This formalism consists of two parts, a detailed description of the dynamics of defects within the non-equilibrium thermodynamic frame, and the application of the second law of thermodynamics to provide closure relations for all kinetic equations. Enforcing the principle of non-negative entropy production showed that the description of the problem of void evolution under irradiation must include a relationship between the normal fluxes of defects into the void surface and the driving thermodynamic forces for the void surface motion; these thermodynamic forces are identified for both vacancies and interstitials and the relationships between these forces and the normal point defect fluxes are established using the concepts of transition state theory. The latter theory implies that the defect accommodation into the surface is a thermally activated process. Numerical examples are given to illustrate void growth dynamics in this new formalism and to investigate the effect of the surface energy barriers on void growth. Consequences for phase field models of void growth are discussed.

  17. Adolescent Identity: Rational vs. Experiential Processing, Formal Operations, and Critical Thinking Beliefs.

    ERIC Educational Resources Information Center

    Klaczynski, Paul A.; Fauth, James M.; Swanger, Amy

    1998-01-01

    The extent to which adolescents rely on rational versus experiential information processing was studied with 49 adolescents administered multiple measures of formal operations, two critical thinking questionnaires, a measure of rational processing, and a measure of ego identity status. Implications for studies of development are discussed in terms…

  18. On the formalization and reuse of scientific research.

    PubMed

    King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N

    2011-10-07

    The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.

  19. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.

  20. Communication: GAIMS—Generalized Ab Initio Multiple Spawning for both internal conversion and intersystem crossing processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curchod, Basile F. E.; Martínez, Todd J.

    2016-03-14

    Full multiple spawning is a formally exact method to describe the excited-state dynamics of molecular systems beyond the Born-Oppenheimer approximation. However, it has been limited until now to the description of radiationless transitions taking place between electronic states with the same spin multiplicity. This Communication presents a generalization of the full and ab initio multiple spawning methods to both internal conversion (mediated by nonadiabatic coupling terms) and intersystem crossing events (triggered by spin-orbit coupling matrix elements) based on a spin-diabatic representation. The results of two numerical applications, a model system and the deactivation of thioformaldehyde, validate the presented formalism and its implementation.

  1. On the adequacy of current empirical evaluations of formal models of categorization.

    PubMed

    Wills, Andy J; Pothos, Emmanuel M

    2012-01-01

    Categorization is one of the fundamental building blocks of cognition, and the study of categorization is notable for the extent to which formal modeling has been a central and influential component of research. However, the field has seen a proliferation of noncomplementary models with little consensus on the relative adequacy of these accounts. Progress in assessing the relative adequacy of formal categorization models has, to date, been limited because (a) formal model comparisons are narrow in the number of models and phenomena considered and (b) models do not often clearly define their explanatory scope. Progress is further hampered by the practice of fitting models with arbitrarily variable parameters to each data set independently. Reviewing examples of good practice in the literature, we conclude that model comparisons are most fruitful when relative adequacy is assessed by comparing well-defined models on the basis of the number and proportion of irreversible, ordinal, penetrable successes (principles of minimal flexibility, breadth, good-enough precision, maximal simplicity, and psychological focus).

  2. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  3. Fluent, fast, and frugal? A formal model evaluation of the interplay between memory, fluency, and comparative judgments.

    PubMed

    Hilbig, Benjamin E; Erdfelder, Edgar; Pohl, Rüdiger F

    2011-07-01

    A new process model of the interplay between memory and judgment processes was recently suggested, assuming that retrieval fluency-that is, the speed with which objects are recognized-will determine inferences concerning such objects in a single-cue fashion. This aspect of the fluency heuristic, an extension of the recognition heuristic, has remained largely untested due to methodological difficulties. To overcome the latter, we propose a measurement model from the class of multinomial processing tree models that can estimate true single-cue reliance on recognition and retrieval fluency. We applied this model to aggregate and individual data from a probabilistic inference experiment and considered both goodness of fit and model complexity to evaluate different hypotheses. The results were relatively clear-cut, revealing that the fluency heuristic is an unlikely candidate for describing comparative judgments concerning recognized objects. These findings are discussed in light of a broader theoretical view on the interplay of memory and judgment processes.

  4. On the Equivalence of Formal Grammars and Machines.

    ERIC Educational Resources Information Center

    Lund, Bruce

    1991-01-01

    Explores concepts of formal language and automata theory underlying computational linguistics. A computational formalism is described known as a "logic grammar," with which computational systems process linguistic data, with examples in declarative and procedural semantics and definite clause grammars. (13 references) (CB)

  5. Toward the Characterization of Non-Formal Pedagogy.

    ERIC Educational Resources Information Center

    Silberman-Keller, Diana

    This study examined characteristic attributes of non-formal education and the non-formal pedagogy directing its teaching and learning processes. Data were collected on organizational and pedagogical characteristics in several out-of-school organizations (youth movements, youth organizations, community centers, bypass educational systems, local…

  6. Demographics of reintroduced populations: estimation, modeling, and decision analysis

    USGS Publications Warehouse

    Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.

    2013-01-01

    Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.

  7. The making of the modern airport executive: Causal connections among key attributes in career development, compromise, and satisfaction in airport management

    NASA Astrophysics Data System (ADS)

    Byers, David Alan

    The purpose of this study was to identify specific career development attributes of contemporary senior-level airport executives and to evaluate the relationship of these attributes to the level of satisfaction airport executives have in their career choice. Attribute sets that were examined included early aviation interests, health factors, psychological factors, demographic factors, formal education, and other aviation-related experiences. A hypothesized causal model that expressed direct and indirect effects among these attributes relative to airport executives' career satisfaction was tested using sample data collected from 708 airport executives from general aviation and commercial service airports throughout the United States. Applying a multiple regression analysis strategy to the model, the overall results revealed that 16% of the variability in airport executives' career satisfaction scores was due to the collective influence of the six research attribute sets; this effect was significant. The results of the path analysis also indicated that four attribute sets (early aviation interests, health factors, formal education, and other aviation-related experiences) had respective direct significant effects on participants' career satisfaction. Early aviation interests, health factors, and demographic factors had additional indirect effects on career satisfaction; all were mediated by formal education attitude. These results were inconsistent with the hypothesized path model, and a revised model was developed to reflect the sample data. The findings suggest that airport executives, as a group, are satisfied with their career choice. Early aviation interests appear to play an important role in influencing the career-field selection phase of career development. The study also suggests that health factors, formal education, and other aviation-related experiences such as flight training or military experience influence the compromise phase of career development. Each of these four factors had significant effects on career satisfaction. In addition to its applicability to airport executives, the study provides a generalized path model for investigating factors influencing the career development, compromise, and satisfaction process in other vocations.

  8. Integrating public risk perception into formal natural hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Plattner, Th.; Plapp, T.; Hebel, B.

    2006-06-01

    An urgent need to take perception into account in risk assessment has been pointed out in the relevant literature; its impact on individuals' risk-related behaviour is obvious. This study represents an effort to overcome the broadly discussed question of whether risk perception is quantifiable or not by proposing a simple yet applicable methodology. A novel approach is elaborated to obtain a more accurate and comprehensive quantification of risk in comparison to present formal risk evaluation practice. A consideration of relevant factors enables an explicit quantification of individual risk perception and evaluation. The model integrates the effective individual risk r_eff and a weighted mean of relevant perception-affecting factors (PAF). The relevant PAF cover voluntariness of risk-taking, individual reducibility of risk, knowledge and experience, endangerment, subjective damage rating and subjective recurrence-frequency perception. The approach assigns an individual weight to each PAF to represent its impact magnitude. The quantification of these weights is target-group-dependent (e.g. experts, laypersons) and may be carried out using psychometric methods. The novel approach is subjected to a plausibility check using data from an expert workshop. A first model application is conducted using data from an empirical risk perception study in Western Germany to deduce PAF and weight quantification as well as to confirm and evaluate model applicability and flexibility. The main field of application will be the quantification of risk perception by individual persons in a formal and technical way, e.g. for risk communication purposes, illustrating the differing perspectives of experts and non-experts. For decision-making processes this model will have to be applied with caution, since it is by definition not designed to quantify risk acceptance or risk evaluation. The approach may well explain how risk perception differs, but not why it differs. The formal model generates only "snapshots" and considers neither the socio-cultural nor the historical context of risk perception, since it is a highly individualistic and non-contextual approach.
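
    The model structure described above combines the effective individual risk r_eff with a weighted mean of the perception-affecting factors (PAF). The sketch below shows one way to code that combination; the factor scores, weights, and scaling rule are illustrative assumptions, since the paper leaves weight quantification to target-group-specific psychometric studies.

    ```python
    # Factor scores on an assumed normalized [0, 1] scale (invented values).
    PAF = {
        "voluntariness":        0.2,
        "reducibility":         0.4,
        "knowledge_experience": 0.6,
        "endangerment":         0.7,
        "subjective_damage":    0.8,
        "subjective_frequency": 0.5,
    }
    # Target-group-dependent weights (invented; would come from psychometrics).
    WEIGHTS = {k: w for k, w in zip(PAF, [1.5, 1.0, 0.5, 2.0, 1.5, 1.0])}

    def perceived_risk(r_eff):
        """Combine r_eff with the weighted mean of the PAF (assumed scaling rule)."""
        wmean = sum(WEIGHTS[k] * PAF[k] for k in PAF) / sum(WEIGHTS.values())
        return r_eff * (1 + wmean)

    print(f"perceived risk = {perceived_risk(0.01):.4f}")
    ```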

  9. Mapping the work-based learning of novice teachers: charting some rich terrain.

    PubMed

    Cook, Vivien

    2009-12-01

    Work-based non-formal learning plays a key role in faculty development, yet these processes have not been described in detail in medical education. This study sets out to illuminate these processes so that potential benefits for new and inexperienced medical educators and their mentors can be realised. The non-formal learning processes of 12 novice teachers were investigated across hospital, general practice and medical school settings. The research sought to describe 'what' and 'how' non-formal learning takes place, and whether these processes differ across teaching sites. Both clinical and non-clinical teachers of medical undergraduates from one inner-city medical school were recruited for the study. Through semi-structured interviews and a 'concept map', participants were asked to identify the people and tasks which they considered central to helping them become more expert as educators. Results identified non-formal learning across a number of key dimensions, including personal development, task and role performance, and optimising clinical teaching. This learning takes place as an outcome of experience, observation, reflection and student feedback. Non-formal learning is a significant aspect of the development of novice teachers and as such needs to be placed more firmly on the agenda of faculty development.

  10. How Framing Statistical Statements Affects Subjective Veracity: Validation and Application of a Multinomial Model for Judgments of Truth

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.

    2012-01-01

    Extending the well-established negativity bias in human cognition to truth judgments, it was recently shown that negatively framed statistical statements are more likely to be considered true than formally equivalent statements framed positively. However, the underlying processes responsible for this effect are insufficiently understood.…

  11. Extremely Selective Attention: Eye-Tracking Studies of the Dynamic Allocation of Attention to Stimulus Features in Categorization

    ERIC Educational Resources Information Center

    Blair, Mark R.; Watson, Marcus R.; Walshe, R. Calen; Maj, Fillip

    2009-01-01

    Humans have an extremely flexible ability to categorize regularities in their environment, in part because of attentional systems that allow them to focus on important perceptual information. In formal theories of categorization, attention is typically modeled with weights that selectively bias the processing of stimulus features. These theories…

  12. The Pendulum: A Paradigm for the Linear Oscillator

    ERIC Educational Resources Information Center

    Newburgh, Ronald

    2004-01-01

    The simple pendulum is a model for the linear oscillator. The usual mathematical treatment of the problem begins with a differential equation that one solves with the techniques of the differential calculus, a formal process that tends to obscure the physics. In this paper we begin with a kinematic description of the motion obtained by experiment…
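
    For reference, the differential-equation treatment the entry alludes to is the standard textbook one (stated here for completeness, not taken from the article itself):

    ```latex
    % Standard simple-pendulum equations (textbook material, for reference).
    \ddot{\theta} + \frac{g}{L}\sin\theta = 0
    \qquad\xrightarrow{\;\sin\theta \,\approx\, \theta\;}\qquad
    \ddot{\theta} + \omega^{2}\theta = 0, \quad \omega = \sqrt{g/L},
    ```
    with solution \(\theta(t) = \theta_{0}\cos(\omega t + \varphi)\) and period \(T = 2\pi\sqrt{L/g}\); the kinematic approach described above recovers this behavior without starting from the differential equation.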

  13. Using an "Open Approach" to Create a New, Innovative Higher Education Model

    ERIC Educational Resources Information Center

    Huggins, Susan; Smith, Peter

    2015-01-01

    Navigating learning, formal or informal, can be overwhelming, confusing, and impersonal. With more options than ever, the process of deciding what, where, and when can be overwhelming to a learner. The concept of Open College at Kaplan University (OC@KU) was to bring organization, purpose, and personalization of learning caused by vast resources…

  14. A Novel Conceptual Model of Environmental Communal Education: Content Analysis Based on Distance Education Approach

    ERIC Educational Resources Information Center

    Hafezi, Soheila; Shobeiri, Seyed Mohammad; Sarmadi, Mohammad Reza; Ebadi, Abbas

    2013-01-01

    Environmental education as a learning process increases people's knowledge and awareness about the environment. Although in some countries Environmental Communal Education (ECE) is at the core of environmental education delivered by formal and informal organizations and groups, the meaning of the ECE concept has not been clarified. Therefore the…

  15. The challenge of staying happier: testing the Hedonic Adaptation Prevention model.

    PubMed

    Sheldon, Kennon M; Lyubomirsky, Sonja

    2012-05-01

    The happiness that comes from a particular success or change in fortune abates with time. The Hedonic Adaptation Prevention (HAP) model specifies two routes by which the well-being gains derived from a positive life change are eroded--the first involving bottom-up processes (i.e., declining positive emotions generated by the positive change) and the second involving top-down processes (i.e., increased aspirations for even more positivity). The model also specifies two moderators that can forestall these processes--continued appreciation of the original life change and continued variety in change-related experiences. The authors formally tested the predictions of the HAP model in a 3-month three-wave longitudinal study of 481 students. Temporal path analyses and moderated regression analyses provided good support for the model. Implications for the stability of well-being, the feasibility of "the pursuit of happiness," and the appeal of overconsumption are discussed.

  16. Archetypal dynamics, emergent situations, and the reality game.

    PubMed

    Sulis, William

    2010-07-01

    The classical approach to the modeling of reality is founded upon its objectification. Although successful in dealing with inanimate matter, objectification has proven to be much less successful elsewhere, sometimes to the point of paradox. This paper discusses an approach to the modeling of reality based upon the concept of process as formulated within the framework of archetypal dynamics. Reality is conceptualized as an intermingling of information-transducing systems, together with the semantic frames that effectively describe and ascribe meaning to each system, along with particular formal representations of same which constitute the archetypes. Archetypal dynamics is the study of the relationships between systems, frames and their representations and the flow of information among these different entities. In this paper a specific formal representation of archetypal dynamics using tapestries is given, and a dynamics is founded upon this representation in the form of a combinatorial game called a reality game. Some simple examples are presented.

  17. ecco: An error correcting comparator theory.

    PubMed

    Ghirlanda, Stefano

    2018-03-08

    Building on the work of Ralph Miller and coworkers (Miller and Matzel, 1988; Denniston et al., 2001; Stout and Miller, 2007), I propose a new formalization of the comparator hypothesis that seeks to overcome some shortcomings of existing formalizations. The new model, dubbed ecco for "Error-Correcting COmparisons," retains the comparator process and the learning of CS-CS associations based on contingency. ecco assumes, however, that learning of CS-US associations is driven by total error correction, as first introduced by Rescorla and Wagner (1972). I explore ecco's behavior in acquisition, compound conditioning, blocking, backward blocking, and unovershadowing. In these paradigms, ecco appears capable of avoiding the problems of current comparator models, such as the inability to solve some discriminations and some paradoxical effects of stimulus salience. At the same time, ecco exhibits the retrospective revaluation phenomena that are characteristic of comparator theory. Copyright © 2018 Elsevier B.V. All rights reserved.
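
    A minimal sketch of the total-error-correction rule that ecco adopts from Rescorla and Wagner (1972) may help make the abstract concrete; the parameter values and the compound-conditioning setup below are illustrative assumptions, not the published ecco equations:

      # Total error correction: all cues present on a trial share one
      # prediction error (Rescorla & Wagner, 1972). Illustrative sketch only.
      def rw_trial(V, present, lam, alphas, beta=0.1):
          total = sum(V[cs] for cs in present)      # summed US prediction
          error = lam - total                       # single, shared error term
          for cs in present:
              V[cs] += alphas[cs] * beta * error    # competitive update

      V = {"A": 0.0, "B": 0.0}
      alphas = {"A": 0.3, "B": 0.3}
      for _ in range(50):                           # compound AB -> US training
          rw_trial(V, ["A", "B"], lam=1.0, alphas=alphas)
      print(V)   # A and B end up sharing the available associative strength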

  18. An engineering approach to automatic programming

    NASA Technical Reports Server (NTRS)

    Rubin, Stuart H.

    1990-01-01

    An exploratory study of the automatic generation and optimization of symbolic programs was undertaken using DECOM, a prototypical requirement specification model implemented in pure LISP. It was concluded, on the basis of this study, that symbolic processing languages such as LISP can support a style of programming based upon formal transformation and dependent upon the expression of constraints in an object-oriented environment. Such languages can represent all aspects of the software generation process (including heuristic algorithms for effecting parallel search) as dynamic processes, since data and program are represented in a uniform format.

  19. Planning treatment of ischemic heart disease with partially observable Markov decision processes.

    PubMed

    Hauskrecht, M; Fraser, H

    2000-03-01

    Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead, they are very often dependent and interleaved over time. This is mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment and varying cost of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs) developed and used in the operations research, control theory and artificial intelligence communities is particularly suitable for modeling such a complex decision process. In this paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease (IHD), and demonstrate the modeling advantages of the framework over standard decision formalisms.
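
    The core computational step of a POMDP-based planner is the Bayesian belief update over the hidden disease state. The sketch below uses a made-up two-state, one-action example loosely in the spirit of the paper, not the authors' actual IHD model:

      import numpy as np

      # s0 = "no ischemia", s1 = "ischemia"; the "stress-test" action is purely
      # investigative. All probabilities are invented for illustration.
      T = {"test": np.array([[1.0, 0.0],
                             [0.0, 1.0]])}        # testing leaves the disease unchanged
      O = {"test": np.array([[0.9, 0.1],          # P(o | s', a): rows = states,
                             [0.2, 0.8]])}        # cols = (negative, positive) result

      def belief_update(b, a, o, T, O):
          """b'(s') is proportional to O[a][s', o] * sum_s T[a][s, s'] * b(s)."""
          unnorm = (b @ T[a]) * O[a][:, o]
          return unnorm / unnorm.sum()

      b = np.array([0.8, 0.2])                    # prior belief over states
      print(belief_update(b, "test", 1, T, O))    # belief after a positive test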

  20. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  1. Family Literacy and the New Canadian: Formal, Non-Formal and Informal Learning: The Case of Literacy, Essential Skills and Language Learning in Canada

    ERIC Educational Resources Information Center

    Eaton, Sarah Elaine

    2011-01-01

    This paper examines literacy and language learning across the lifespan within the context of immigrants in the Canadian context. It explores the process of improving literacy skills and acquiring second or third language skills through the systems of formal, non-formal and informal learning, as defined by the OECD [Organisation for Economic…

  2. A new formalism for modelling parameters α and β of the linear-quadratic model of cell survival for hadron therapy

    NASA Astrophysics Data System (ADS)

    Vassiliev, Oleg N.; Grosshans, David R.; Mohan, Radhe

    2017-10-01

    We propose a new formalism for calculating parameters α and β of the linear-quadratic model of cell survival. This formalism, primarily intended for calculating relative biological effectiveness (RBE) for treatment planning in hadron therapy, is based on a recently proposed microdosimetric revision of the single-target multi-hit model. The main advantage of our formalism is that it reliably produces α and β that have correct general properties with respect to their dependence on physical properties of the beam, including the asymptotic behavior for very low and high linear energy transfer (LET) beams. For example, in the case of monoenergetic beams, our formalism predicts that, as a function of LET, (a) α has a maximum and (b) the α/β ratio increases monotonically with increasing LET. No prior models reviewed in this study predict both properties (a) and (b) correctly, and therefore, these prior models are valid only within a limited LET range. We first present our formalism in a general form, for polyenergetic beams. A significant new result in this general case is that parameter β is represented as an average over the joint distribution of energies E1 and E2 of two particles in the beam. This result is consistent with the role of the quadratic term in the linear-quadratic model. It accounts for the two-track mechanism of cell kill, in which two particles, one after another, damage the same site in the cell nucleus. We then present simplified versions of the formalism, and discuss predicted properties of α and β. Finally, to demonstrate consistency of our formalism with experimental data, we apply it to fit two sets of experimental data: (1) α for heavy ions, covering a broad range of LETs, and (2) β for protons. In both cases, good agreement is achieved.
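
    For reference, the linear-quadratic relation that the formalism parameterizes gives the surviving fraction after dose D as S(D) = exp(-(αD + βD²)). The sketch below evaluates it with illustrative α and β values, not the paper's fitted hadron-beam parameters:

      import math

      def surviving_fraction(D, alpha, beta):
          """Linear-quadratic cell survival, S(D) = exp(-(alpha*D + beta*D^2))."""
          return math.exp(-(alpha * D + beta * D ** 2))

      # Illustrative values only; in the paper's formalism alpha and beta are
      # computed from the beam's LET spectrum rather than fixed by hand.
      for D in (1.0, 2.0, 4.0):                   # dose in Gy
          print(D, surviving_fraction(D, alpha=0.35, beta=0.035))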

  3. Formal Analysis of the Remote Agent Before and After Flight

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.

    2000-01-01

    This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool which directly translates from Java to PROMELA was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one which was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.

  4. Formal Specification and Validation of a Hybrid Connectivity Restoration Algorithm for Wireless Sensor and Actor Networks †

    PubMed Central

    Imran, Muhammad; Zafar, Nazir Ahmad

    2012-01-01

    Maintaining inter-actor connectivity is extremely crucial in mission-critical applications of Wireless Sensor and Actor Networks (WSANs), as actors have to quickly plan optimal coordinated responses to detected events. Failure of a critical actor partitions the inter-actor network into disjoint segments besides leaving a coverage hole, and thus hinders the network operation. This paper presents a Partitioning detection and Connectivity Restoration (PCR) algorithm to tolerate critical actor failure. As part of pre-failure planning, PCR determines critical/non-critical actors based on localized information and designates each critical node with an appropriate backup (preferably non-critical). The pre-designated backup detects the failure of its primary actor and initiates a post-failure recovery process that may involve coordinated multi-actor relocation. To prove the correctness, we construct a formal specification of PCR using Z notation. We model WSAN topology as a dynamic graph and transform PCR to corresponding formal specification using Z notation. The formal specification is analyzed and validated using the Z Eves tool. Moreover, we simulate the specification to quantitatively analyze the efficiency of PCR. Simulation results confirm the effectiveness of PCR and show that it outperforms contemporary schemes found in the literature.
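
    In graph terms, a critical actor is a cut vertex: a node whose removal disconnects the inter-actor network. The brute-force check below only illustrates that definition; PCR itself decides criticality from localized information, which this global sketch does not attempt to reproduce:

      def is_connected(adj, nodes):
          """Depth-first reachability over the induced subgraph on `nodes`."""
          if not nodes:
              return True
          seen, stack = set(), [next(iter(nodes))]
          while stack:
              n = stack.pop()
              if n not in seen:
                  seen.add(n)
                  stack.extend(m for m in adj[n] if m in nodes)
          return seen == nodes

      def critical_actors(adj):
          """Actors whose failure partitions the network (cut vertices)."""
          nodes = set(adj)
          return sorted(v for v in nodes
                        if not is_connected(adj, nodes - {v}))

      adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}   # a chain of four actors
      print(critical_actors(adj))                     # -> [2, 3]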

  5. OFFl Models: Novel Schema for Dynamical Modeling of Biological Systems

    PubMed Central

    2016-01-01

    Flow diagrams are a common tool used to help build and interpret models of dynamical systems, often in biological contexts such as consumer-resource models and similar compartmental models. Typically, their usage is intuitive and informal. Here, we present a formalized version of flow diagrams as a kind of weighted directed graph which follow a strict grammar, which translate into a system of ordinary differential equations (ODEs) by a single unambiguous rule, and which have an equivalent representation as a relational database. (We abbreviate this schema of “ODEs and formalized flow diagrams” as OFFL.) Drawing a diagram within this strict grammar encourages a mental discipline on the part of the modeler in which all dynamical processes of a system are thought of as interactions between dynamical species that draw parcels from one or more source species and deposit them into target species according to a set of transformation rules. From these rules, the net rate of change for each species can be derived. The modeling schema can therefore be understood as both an epistemic and practical heuristic for modeling, serving both as an organizational framework for the model building process and as a mechanism for deriving ODEs. All steps of the schema beyond the initial scientific (intuitive, creative) abstraction of natural observations into model variables are algorithmic and easily carried out by a computer, thus enabling the future development of a dedicated software implementation. Such tools would empower the modeler to consider significantly more complex models than practical limitations might have otherwise proscribed, since the modeling framework itself manages that complexity on the modeler’s behalf. In this report, we describe the chief motivations for OFFL, carefully outline its implementation, and utilize a range of classic examples from ecology and epidemiology to showcase its features. PMID:27270918
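
    The "single unambiguous rule" for turning a flow diagram into ODEs can be sketched in a few lines: each flow subtracts its rate from every source species and adds it to every target. The SIR flows below are a classic example of the kind the paper showcases; the real OFFL grammar and its relational-database representation are richer:

      # Each flow: (rate function of the state, source species, target species).
      flows = [
          (lambda y: 0.3 * y["S"] * y["I"], ["S"], ["I"]),   # infection
          (lambda y: 0.1 * y["I"],          ["I"], ["R"]),   # recovery
      ]

      def rhs(y):
          """Assemble dY/dt from the flow list by the one uniform rule."""
          dy = {k: 0.0 for k in y}
          for rate, sources, targets in flows:
              r = rate(y)
              for s in sources:
                  dy[s] -= r
              for t in targets:
                  dy[t] += r
          return dy

      y = {"S": 0.99, "I": 0.01, "R": 0.0}
      for _ in range(1000):                     # crude Euler integration, dt = 0.1
          dy = rhs(y)
          y = {k: y[k] + 0.1 * dy[k] for k in y}
      print(y)                                  # epidemic has largely passed through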

  6. Abstracting event-based control models for high autonomy systems

    NASA Technical Reports Server (NTRS)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  7. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
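
    Geo-containment ultimately rests on a point-in-polygon test. The ray-casting sketch below is an illustrative, unverified rendition of such a test, not the formally verified algorithm that the MINERVA process checks against its specification:

      def inside(point, polygon):
          """Ray casting: count edge crossings of a rightward ray from `point`."""
          x, y = point
          hit = False
          n = len(polygon)
          for i in range(n):
              (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
              if (y1 > y) != (y2 > y):                        # edge straddles the ray
                  x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                  if x_cross > x:
                      hit = not hit
          return hit

      square = [(0, 0), (4, 0), (4, 4), (0, 4)]
      print(inside((2, 2), square), inside((5, 1), square))   # True False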

  8. Conditional Random Fields for Activity Recognition

    DTIC Science & Technology

    2008-04-01

    final match. The final is never used as a training or hold out set. Table 4.1 lists the roles of the CMDragons’07 robot soccer team. The role of Goalie is not included because the goalie never changes roles. The classification task, which we formalize below, is to recognize robot roles from the avail… …process and pull out the key information from the sensor data. Furthermore, as conditional models, CRFs do not waste modeling effort on the observations

  9. Data accuracy assessment using enterprise architecture

    NASA Astrophysics Data System (ADS)

    Närman, Per; Holm, Hannes; Johnson, Pontus; König, Johan; Chenine, Moustafa; Ekstedt, Mathias

    2011-02-01

    Errors in business processes result in poor data accuracy. This article proposes an architecture analysis method which utilises ArchiMate and the Probabilistic Relational Model formalism to model and analyse data accuracy. Since the resources available for architecture analysis are usually quite scarce, the method advocates interviews as the primary data collection technique. A case study demonstrates that the method yields correct data accuracy estimates and is more resource-efficient than a competing sampling-based data accuracy estimation method.

  10. Perceiving while producing: Modeling the dynamics of phonological planning

    PubMed Central

    Roon, Kevin D.; Gafos, Adamantios I.

    2016-01-01

    We offer a dynamical model of phonological planning that provides a formal instantiation of how the speech production and perception systems interact during online processing. The model is developed on the basis of evidence from an experimental task that requires concurrent use of both systems, the so-called response-distractor task in which speakers hear distractor syllables while they are preparing to produce required responses. The model formalizes how ongoing response planning is affected by perception and accounts for a range of results reported across previous studies. It does so by explicitly addressing the setting of parameter values in representations. The key unit of the model is that of the dynamic field, a distribution of activation over the range of values associated with each representational parameter. The setting of parameter values takes place by the attainment of a stable distribution of activation over the entire field, stable in the sense that it persists even after the response cue in the above experiments has been removed. This and other properties of representations that have been taken as axiomatic in previous work are derived by the dynamics of the proposed model. PMID:27440947
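
    A dynamic field of the kind described is commonly formalized as an Amari-style activation field over the parameter dimension, evolving under input, lateral interaction, and decay. The toy sketch below uses an invented kernel and constants to show how a stable activation peak (a set parameter value) forms; it is not the paper's fitted model:

      import numpy as np

      tau, h = 10.0, -2.0                        # time constant and resting level
      x = np.linspace(-1, 1, 101)                # the representational dimension
      u = np.full_like(x, h)                     # field activation, at rest
      kernel = 1.5 * np.exp(-0.5 * (x / 0.2) ** 2) - 0.5   # local excitation, broad inhibition
      stim = 3.0 * np.exp(-0.5 * ((x - 0.3) / 0.1) ** 2)   # input favoring one value

      def f(u):                                  # sigmoidal output nonlinearity
          return 1 / (1 + np.exp(-4 * u))

      dx = x[1] - x[0]
      for _ in range(500):                       # Euler steps of the field equation
          interaction = np.convolve(f(u), kernel, mode="same") * dx
          u += (-u + h + stim + interaction) / tau
      print(x[np.argmax(u)])                     # a self-stabilized peak near 0.3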

  11. Computational models of music perception and cognition I: The perceptual and cognitive processing chain

    NASA Astrophysics Data System (ADS)

    Purwins, Hendrik; Herrera, Perfecto; Grachten, Maarten; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    We present a review on perception and cognition models designed for or applicable to music. An emphasis is put on computational implementations. We include findings from different disciplines: neuroscience, psychology, cognitive science, artificial intelligence, and musicology. The article summarizes the methodology that these disciplines use to approach the phenomena of music understanding, the localization of musical processes in the brain, and the flow of cognitive operations involved in turning physical signals into musical symbols, going from the transducers to the memory systems of the brain. We discuss formal models developed to emulate, explain and predict phenomena involved in early auditory processing, pitch processing, grouping, source separation, and music structure computation. We cover generic computational architectures of attention, memory, and expectation that can be instantiated and tuned to deal with specific musical phenomena. Criteria for the evaluation of such models are presented and discussed. Thereby, we lay out the general framework that provides the basis for the discussion of domain-specific music models in Part II.

  12. Intuition, deliberation, and the evolution of cooperation

    PubMed Central

    Bear, Adam; Rand, David G.

    2016-01-01

    Humans often cooperate with strangers, despite the costs involved. A long tradition of theoretical modeling has sought ultimate evolutionary explanations for this seemingly altruistic behavior. More recently, an entirely separate body of experimental work has begun to investigate cooperation’s proximate cognitive underpinnings using a dual-process framework: Is deliberative self-control necessary to rein in selfish impulses, or does self-interested deliberation restrain an intuitive desire to cooperate? Integrating these ultimate and proximate approaches, we introduce dual-process cognition into a formal game-theoretic model of the evolution of cooperation. Agents play prisoner’s dilemma games, some of which are one-shot and others of which involve reciprocity. They can either respond by using a generalized intuition, which is not sensitive to whether the game is one-shot or reciprocal, or pay a (stochastically varying) cost to deliberate and tailor their strategy to the type of game they are facing. We find that, depending on the level of reciprocity and assortment, selection favors one of two strategies: intuitive defectors who never deliberate, or dual-process agents who intuitively cooperate but sometimes use deliberation to defect in one-shot games. Critically, selection never favors agents who use deliberation to override selfish impulses: Deliberation only serves to undermine cooperation with strangers. Thus, by introducing a formal theoretical framework for exploring cooperation through a dual-process lens, we provide a clear answer regarding the role of deliberation in cooperation based on evolutionary modeling, help to organize a growing body of sometimes-conflicting empirical results, and shed light on the nature of human cognition and social decision making. PMID:26755603

  13. Intuition, deliberation, and the evolution of cooperation.

    PubMed

    Bear, Adam; Rand, David G

    2016-01-26

    Humans often cooperate with strangers, despite the costs involved. A long tradition of theoretical modeling has sought ultimate evolutionary explanations for this seemingly altruistic behavior. More recently, an entirely separate body of experimental work has begun to investigate cooperation's proximate cognitive underpinnings using a dual-process framework: Is deliberative self-control necessary to rein in selfish impulses, or does self-interested deliberation restrain an intuitive desire to cooperate? Integrating these ultimate and proximate approaches, we introduce dual-process cognition into a formal game-theoretic model of the evolution of cooperation. Agents play prisoner's dilemma games, some of which are one-shot and others of which involve reciprocity. They can either respond by using a generalized intuition, which is not sensitive to whether the game is one-shot or reciprocal, or pay a (stochastically varying) cost to deliberate and tailor their strategy to the type of game they are facing. We find that, depending on the level of reciprocity and assortment, selection favors one of two strategies: intuitive defectors who never deliberate, or dual-process agents who intuitively cooperate but sometimes use deliberation to defect in one-shot games. Critically, selection never favors agents who use deliberation to override selfish impulses: Deliberation only serves to undermine cooperation with strangers. Thus, by introducing a formal theoretical framework for exploring cooperation through a dual-process lens, we provide a clear answer regarding the role of deliberation in cooperation based on evolutionary modeling, help to organize a growing body of sometimes-conflicting empirical results, and shed light on the nature of human cognition and social decision making.

  14. The Volcanism Ontology (VO): a model of the volcanic system

    NASA Astrophysics Data System (ADS)

    Myer, J.; Babaie, H. A.

    2017-12-01

    We have modeled a part of the complex material and process entities and properties of the volcanic system in the Volcanism Ontology (VO) applying several top-level ontologies such as Basic Formal Ontology (BFO), SWEET, and Ontology of Physics for Biology (OPB) within a single framework. The continuant concepts in BFO describe features with instances that persist as wholes through time and have qualities (attributes) that may change (e.g., state, composition, and location). In VO, the continuants include lava, volcanic rock, and volcano. The occurrent concepts in BFO include processes, their temporal boundaries, and the spatio-temporal regions within which they occur. In VO, these include eruption (process), the onset of pyroclastic flow (temporal boundary), and the space and time span of the crystallization of lava in a lava tube (spatio-temporal region). These processes can be of physical (e.g., debris flow, crystallization, injection), atmospheric (e.g., vapor emission, ash particles blocking solar radiation), hydrological (e.g., diffusion of water vapor, hot spring), thermal (e.g., cooling of lava) and other types. The properties (predicates) relate continuants to other continuants, occurrents to continuants, and occurrents to occurrents. The ontology also models other concepts such as laboratory and field procedures by volcanologists, sampling by sensors, and the type of instruments applied in monitoring volcanic activity. When deployed on the web, VO will be used to explicitly and formally annotate data and information collected by volcanologists based on domain knowledge. This will enable the integration of global volcanic data and improve the interoperability of software that deals with such data.

  15. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all thermodynamics from it. It is a general formalism and consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for energy density and develops thermodynamics from it. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called the thermodynamic consistency relation, we recover another formalism for quasi-particle systems, like that in M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.

  16. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  17. Defining Uniform Processes for Remediation, Probation and Termination in Residency Training.

    PubMed

    Smith, Jessica L; Lypson, Monica; Silverberg, Mark; Weizberg, Moshe; Murano, Tiffany; Lukela, Michael; Santen, Sally A

    2017-01-01

    It is important that residency programs identify trainees who progress appropriately, as well as identify residents who fail to achieve educational milestones as expected so they may be remediated. The process of remediation varies greatly across training programs, due in part to the lack of standardized definitions for good standing, remediation, probation, and termination. The purpose of this educational advancement is to propose a clear remediation framework including definitions, management processes, documentation expectations and appropriate notifications. Informal remediation is initiated when a resident's performance is deficient in one or more of the outcomes-based milestones established by the Accreditation Council for Graduate Medical Education, but not significant enough to trigger formal remediation. Formal remediation occurs when deficiencies are significant enough to warrant formal documentation because informal remediation failed or because issues are substantial. The process includes documentation in the resident's file and notification of the graduate medical education office; however, the documentation is not disclosed if the resident successfully remediates. Probation is initiated when a resident is unsuccessful in meeting the terms of formal remediation or if initial problems are significant enough to warrant immediate probation. The process is similar to formal remediation but also includes documentation extending to the final verification of training and employment letters. Termination involves other stakeholders and occurs when a resident is unsuccessful in meeting the terms of probation or if initial problems are significant enough to warrant immediate termination.

  18. Pedagogical Basis of DAS Formalism in Engineering Education

    ERIC Educational Resources Information Center

    Hiltunen, J.; Heikkinen, E.-P.; Jaako, J.; Ahola, J.

    2011-01-01

    The paper presents a new approach for a bachelor-level curriculum structure in engineering. The approach is called DAS formalism according to its three phases: description, analysis and synthesis. Although developed specifically for process and environmental engineering, DAS formalism has a generic nature and it could also be used in other…

  19. A system approach to aircraft optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1991-01-01

    Mutual couplings among the mathematical models of physical phenomena and parts of a system such as an aircraft complicate the design process because each contemplated design change may have a far reaching consequence throughout the system. Techniques are outlined for computing these influences as system design derivatives useful for both judgemental and formal optimization purposes. The techniques facilitate decomposition of the design process into smaller, more manageable tasks and they form a methodology that can easily fit into existing engineering organizations and incorporate their design tools.

  20. Models of the Learner in Computer-Assisted Instruction

    DTIC Science & Technology

    1975-12-01

    …and computer programming. Additional efforts are being made to extend this approach to less formal subject matter such as South American geography… Laubsch's call for a time-dependent forgetting process. Despite the inclusion of a forgetting process, presentation strategies based on the family of…

  1. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    PubMed

    Astakhov, Vadim

    2009-01-01

    Interest in simulation of large-scale metabolic networks, species development, and genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as the transition of system functionality due to modification in the system architecture, system environment, and system components. The dynamic core model is developed. The term dynamic core is used to define a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism to analyze the migration of specific functions in biosystems which undergo structural transitions induced by the environment. The term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poietic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight into processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement estimated constraints will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution. Any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.

  2. A method for interactive satellite failure diagnosis: Towards a connectionist solution

    NASA Technical Reports Server (NTRS)

    Bourret, P.; Reggia, James A.

    1989-01-01

    Various kinds of processes which allow one to make a diagnosis are analyzed. The analysis then focuses on one of these processes used for satellite failure diagnosis. This process consists of sending the satellite instructions about system status alterations: to mask the effects of one possible component failure or to look for additional abnormal measures. A formal model of this process is given. This model is an extension of a previously defined connectionist model which allows computation of ratios between the likelihoods of observed manifestations according to various diagnostic hypotheses. The expected mean value of these likelihood measures for each possible status of the satellite can be computed in a similar way. Therefore, it is possible to select the most appropriate status according to three different purposes: to confirm a hypothesis, to eliminate a hypothesis, or to choose between two hypotheses. Finally, a first connectionist scheme for computing these expected mean values is given.

  3. Incorporating interfacial phenomena in solidification models

    NASA Technical Reports Server (NTRS)

    Beckermann, Christoph; Wang, Chao Yang

    1994-01-01

    A general methodology is available for the incorporation of microscopic interfacial phenomena in macroscopic solidification models that include diffusion and convection. The method is derived from a formal averaging procedure and a multiphase approach, and relies on the presence of interfacial integrals in the macroscopic transport equations. In a wider engineering context, these techniques are not new, but their application in the analysis and modeling of solidification processes has largely been overlooked. This article describes the techniques and demonstrates their utility in two examples in which microscopic interfacial phenomena are of great importance.

  4. Defining the paramedic process.

    PubMed

    Carter, Holly; Thompson, James

    2015-01-01

    The use of a 'process of care' is well established in several health professions, most evidently within the field of nursing. Now ingrained within methods of care delivery, it offers a logical approach to problem solving and ensures an appropriate delivery of interventions that are specifically suited to the individual patient. Paramedicine is a rapidly advancing profession despite a wide acknowledgement of limited research provisions. This frequently results in the borrowing of evidence from other disciplines. While this has often been useful, there are many concerns relating to the acceptable limit of evidence transcription between professions. To date, there is no formally recognised 'process of care' defining activity within the pre-hospital arena. With much current focus on the professional classification of paramedic work, it is considered timely to formally define a formula that underpins other professional roles such as nursing. It is hypothesised that defined processes of care, particularly the nursing process, may have features that would readily translate to pre-hospital practice. The literature analysed was obtained through systematic searches of a range of databases, including Ovid MEDLINE and the Cumulative Index to Nursing and Allied Health Literature (CINAHL). The results demonstrated that the defined process of care provides nursing with more than just a structure for practice, but also has implications for education, clinical governance and professional standing. The current nursing process does not directly articulate to the complex and often unstructured role of the paramedic; however, it has many principles that offer value to the paramedic in their practice. Expanding the nursing process model to include the stages of Dispatch Considerations, Scene Assessment, First Impressions, Patient History, Physical Examination, Clinical Decision-Making, Interventions, Re-evaluation, Transport Decisions, Handover and Reflection would provide an appropriate model for pre-hospital practices.

  5. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  6. Interacting hadron resonance gas model in the K -matrix formalism

    NASA Astrophysics Data System (ADS)

    Dash, Ashutosh; Samanta, Subhasis; Mohanty, Bedangadas

    2018-05-01

    An extension of the hadron resonance gas (HRG) model is constructed to include interactions using the relativistic virial expansion of the partition function. The noninteracting part of the expansion contains all the stable baryons and mesons and the interacting part contains all the higher mass resonances which decay into two stable hadrons. The virial coefficients are related to the phase shifts, which are calculated using the K-matrix formalism in the present work. We have calculated various thermodynamic quantities like pressure, energy density, and entropy density of the system. A comparison of thermodynamic quantities with the noninteracting HRG model, calculated using the same number of hadrons, shows that the results of the above formalism are larger. A good agreement between the equation of state calculated in the K-matrix formalism and lattice QCD simulations is observed. Specifically, the interaction measure calculated in lattice QCD is well described in our formalism. We have also calculated second-order fluctuations and correlations of conserved charges in the K-matrix formalism. We observe a good agreement of second-order fluctuations and the baryon-strangeness correlation with lattice data below the crossover temperature.

  7. Positive Character Development in School Sport Programs. ERIC Digest.

    ERIC Educational Resources Information Center

    Beller, Jennifer

    This digest discusses the formal and informal processes of moral character development through sport in light of the types of programs that have shown to improve moral character, sportsmanship, and fair play, noting that such efforts involve combined lifelong formal and informal educational processes with three interrelated dimensions: knowing,…

  8. Deep first formal concept search.

    PubMed

    Zhang, Tao; Li, Hui; Hong, Wenxue; Yuan, Xiamei; Wei, Xinyu

    2014-01-01

    The calculation of formal concepts is a very important part of the theory of formal concept analysis (FCA); however, within the framework of FCA, computing all formal concepts is the main challenge because of its exponential complexity and the difficulty of visualizing the calculation process. Based on the idea of depth-first search, this paper presents a visualization algorithm built on the attribute topology of the formal context. Constrained by the calculation rules, all concepts are obtained, without repetition or omission, by a visual global search of formal concepts over the topology, degenerated with fixed start and end points. This method makes the calculation of formal concepts precise and easy to operate and reflects the integrity of the algorithm, making it suitable for visualization analysis.
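
    As a baseline for comparison with the attribute-topology search, the brute-force sketch below enumerates all formal concepts of a toy context by closing every attribute subset; it is exponential, as the abstract notes, and is meant only to make the definition concrete:

      from itertools import combinations

      # Toy formal context: objects and the attributes they have.
      objects = {"g1": {"a", "b"}, "g2": {"b", "c"}, "g3": {"a", "b", "c"}}
      attributes = {"a", "b", "c"}

      def extent(B):   # all objects having every attribute in B
          return {g for g, atts in objects.items() if B <= atts}

      def intent(A):   # all attributes shared by every object in A
          return set.intersection(*(objects[g] for g in A)) if A else set(attributes)

      # A formal concept is a pair (A, B) with A = extent(B) and B = intent(A);
      # closing every attribute subset finds them all (duplicates removed).
      concepts = set()
      for r in range(len(attributes) + 1):
          for B in combinations(sorted(attributes), r):
              A = extent(set(B))
              concepts.add((frozenset(A), frozenset(intent(A))))
      for A, B in sorted(concepts, key=lambda c: -len(c[0])):
          print(sorted(A), sorted(B))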

  9. Institutional barriers and opportunities in application of the limits of acceptable change

    Treesearch

    George H. Stankey

    1997-01-01

    Although the Limits of Acceptable Change (LAC) process has been in use since the mid-1980’s and has contributed to improved wilderness management, significant barriers and challenges remain. Formal and informal institutional barriers are the principal constraint to more effective implementation. Although grounded in a traditional management-by-objectives model, the LAC...

  10. Quantum-Like Models for Decision Making in Psychology and Cognitive Science

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2009-02-01

    We show that (in contrast to rather common opinion) the domain of applications of the mathematical formalism of quantum mechanics is not restricted to physics. This formalism can be applied to the description of various quantum-like (QL) information processing. In particular, the calculus of quantum (and more general QL) probabilities can be used to explain some paradoxical statistical data which was collected in psychology and cognitive science. The main lesson of our study is that one should sharply distinguish the mathematical apparatus of QM from QM as a physical theory. The domain of application of the mathematical apparatus is essentially wider than quantum physics. Keywords: quantum-like representation algorithm, formula of total probability, interference of probabilities, psychology, cognition, decision making.
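
    The centerpiece of this calculus, the formula of total probability with an interference term, can be written for a dichotomous condition A as follows; setting λ = 0 recovers the classical law. (This is the standard trigonometric form; the paper's own derivation may differ in detail.)

      % Quantum-like formula of total probability with an interference term
      P(B) = \sum_{i=1,2} P(A_i)\,P(B \mid A_i)
             + 2\lambda\,\sqrt{P(A_1)\,P(B \mid A_1)\,P(A_2)\,P(B \mid A_2)},
      \qquad \lambda = \cos\theta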

  11. Mathematical formalisms based on approximated kinetic representations for modeling genetic and metabolic pathways.

    PubMed

    Alves, Rui; Vilaprinyo, Ester; Hernández-Bermejo, Benito; Sorribas, Albert

    2008-01-01

    There is a renewed interest in obtaining a systemic understanding of metabolism, gene expression and signal transduction processes, driven by the recent research focus on Systems Biology. From a biotechnological point of view, such a systemic understanding of how a biological system is designed to work can facilitate the rational manipulation of specific pathways in different cell types to achieve specific goals. Due to the intrinsic complexity of biological systems, mathematical models are a central tool for understanding and predicting the integrative behavior of those systems. Particularly, models are essential for a rational development of biotechnological applications and in understanding system's design from an evolutionary point of view. Mathematical models can be obtained using many different strategies. In each case, their utility will depend upon the properties of the mathematical representation and on the possibility of obtaining meaningful parameters from available data. In practice, there are several issues at stake when one has to decide which mathematical model is more appropriate for the study of a given problem. First, one needs a model that can represent the aspects of the system one wishes to study. Second, one must choose a mathematical representation that allows an accurate analysis of the system with respect to different aspects of interest (for example, robustness of the system, dynamical behavior, optimization of the system with respect to some production goal, parameter value determination, etc). Third, before choosing between alternative and equally appropriate mathematical representations for the system, one should compare representations with respect to easiness of automation for model set-up, simulation, and analysis of results. Fourth, one should also consider how to facilitate model transference and re-usability by other researchers and for distinct purposes. Finally, one factor that is important for all four aspects is the regularity in the mathematical structure of the equations because it facilitates computational manipulation. This regularity is a mark of kinetic representations based on approximation theory. The use of approximation theory to derive mathematical representations with regular structure for modeling purposes has a long tradition in science. In most applied fields, such as engineering and physics, those approximations are often required to obtain practical solutions to complex problems. In this paper we review some of the more popular mathematical representations that have been derived using approximation theory and are used for modeling in molecular systems biology. We will focus on formalisms that are theoretically supported by the Taylor Theorem. These include the Power-law formalism, the recently proposed (log)linear and Lin-log formalisms as well as some closely related alternatives. We will analyze the similarities and differences between these formalisms, discuss the advantages and limitations of each representation, and provide a tentative "road map" for their potential utilization for different problems.
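
    Of the Taylor-theorem-based representations reviewed, the power-law (S-system) form is the most compact to illustrate: every aggregate rate is a product of variables raised to real-valued kinetic orders. The one-variable sketch below uses invented parameters:

      import math

      # S-system: dXi/dt = alpha_i * prod_j Xj^g_ij - beta_i * prod_j Xj^h_ij
      def s_system_rhs(X, alpha, G, beta, H):
          def term(rate, orders):
              return rate * math.prod(x ** g for x, g in zip(X, orders))
          return [term(a, g) - term(b, h)
                  for a, g, b, h in zip(alpha, G, beta, H)]

      # Production inhibited by X itself (negative kinetic order), linear decay.
      X = [0.5]
      for _ in range(400):                       # Euler steps, dt = 0.05
          dX = s_system_rhs(X, alpha=[1.0], G=[[-0.5]], beta=[0.8], H=[[1.0]])
          X = [x + 0.05 * d for x, d in zip(X, dX)]
      print(X)   # approaches the steady state where alpha*X^-0.5 equals beta*X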

  12. Modeling Cyber Conflicts Using an Extended Petri Net Formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakrzewska, Anita N; Ferragut, Erik M

    2011-01-01

    When threatened by automated attacks, critical systems that require human-controlled responses have difficulty making optimal responses and adapting protections in real-time and may therefore be overwhelmed. Consequently, experts have called for the development of automatic real-time reaction capabilities. However, a technical gap exists in the modeling and analysis of cyber conflicts to automatically understand the repercussions of responses. There is a need for modeling cyber assets that accounts for concurrent behavior, incomplete information, and payoff functions. We address this need by extending the Petri net formalism to allow real-time cyber conflicts to be modeled in a way that is expressive and concise. This formalism includes transitions controlled by players as well as firing rates attached to transitions. This allows us to model both player actions and factors that are beyond the control of players in real-time. We show that our formalism is able to represent situational awareness, concurrent actions, incomplete information and objective functions. These factors make it well-suited to modeling cyber conflicts in a way that allows for useful analysis. MITRE has compiled the Common Attack Pattern Enumeration and Classification (CAPEC), an extensive list of cyber attacks at various levels of abstraction. CAPEC includes factors such as attack prerequisites, possible countermeasures, and attack goals. These elements are vital to understanding cyber attacks and to generating the corresponding real-time responses. We demonstrate that the formalism can be used to extract precise models of cyber attacks from CAPEC. Several case studies show that our Petri net formalism is more expressive than other models, such as attack graphs, for modeling cyber conflicts and that it is amenable to exploring cyber strategies.
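
    A minimal sketch of the extension's two ingredients, player-controlled transitions and firing rates, on an invented two-transition net (not one of the CAPEC-derived models):

      import random

      marking = {"foothold": 1, "patched": 0, "compromised": 0}
      transitions = {
          # name: (input places, output places, controlling player, rate)
          "exploit": ({"foothold": 1}, {"compromised": 1}, "attacker", 0.7),
          "patch":   ({"foothold": 1}, {"patched": 1},     "defender", 0.4),
      }

      def enabled(t):
          ins, _, _, _ = transitions[t]
          return all(marking[p] >= n for p, n in ins.items())

      def fire(t):
          ins, outs, _, _ = transitions[t]
          for p, n in ins.items():
              marking[p] -= n
          for p, n in outs.items():
              marking[p] += n

      # The enabled transitions race with exponential clocks set by their rates,
      # modeling timing factors beyond the players' direct control.
      ready = [t for t in transitions if enabled(t)]
      winner = min(ready, key=lambda t: random.expovariate(transitions[t][3]))
      fire(winner)
      print(winner, marking)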

  13. Traffic Games: Modeling Freeway Traffic with Game Theory

    PubMed Central

    Cortés-Berrueco, Luis E.; Gershenson, Carlos; Stephens, Christopher R.

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers’ interactions. PMID:27855176
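
    The driver-driver interaction can be pictured as a Chicken-style lane-changing game. The payoff numbers below are invented for illustration and are not the paper's calibrated model:

      # Payoffs (row driver, column driver) for a merge conflict.
      payoff = {
          ("yield",  "yield"):  (2, 2),
          ("yield",  "insist"): (1, 3),
          ("insist", "yield"):  (3, 1),
          ("insist", "insist"): (0, 0),   # both insist: near-collision, flow drops
      }

      def best_response(opponent):
          """Row driver's best reply to the column driver's action."""
          return max(("yield", "insist"),
                     key=lambda a: payoff[(a, opponent)][0])

      print(best_response("insist"))   # -> yield   (back off against an insister)
      print(best_response("yield"))    # -> insist  (the classic Chicken pattern)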

  14. Traffic Games: Modeling Freeway Traffic with Game Theory.

    PubMed

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.

  15. Weaving a Formal Methods Education with Problem-Based Learning

    NASA Astrophysics Data System (ADS)

    Gibson, J. Paul

    The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation — how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.

  16. Photon wave function formalism for analysis of Mach–Zehnder interferometer and sum-frequency generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritboon, Atirach, E-mail: atirach.3.14@gmail.com; Department of Physics, Faculty of Science, Prince of Songkla University, Hat Yai 90112; Daengngam, Chalongrat, E-mail: chalongrat.d@psu.ac.th

    2016-08-15

    Bialynicki-Birula introduced a photon wave function similar to the matter wave function that satisfies the Schrödinger equation. Its second quantization form can be applied to investigate nonlinear optics at nearly the full quantum level. In this paper, we applied the photon wave function formalism to analyze both linear optical processes in the well-known Mach–Zehnder interferometer and nonlinear optical processes for sum-frequency generation in a dispersive and lossless medium. Results from the photon wave function formalism agree with well-established Maxwell treatments and existing experimental verifications.

  17. Exploring the Processes of Generating LOD (0-2) Citygml Models in Greater Municipality of Istanbul

    NASA Astrophysics Data System (ADS)

    Buyuksalih, I.; Isikdag, U.; Zlatanova, S.

    2013-08-01

    3D models of cities, visualised and explored in 3D virtual environments, have been available for several years. Currently a large number of impressive, realistic 3D models are regularly presented at scientific, professional and commercial events. One of the most promising developments is the OGC standard CityGML. CityGML is an object-oriented model that supports 3D geometry and thematic semantics, attributes and relationships, and offers advanced options for realistic visualization. One of the very attractive characteristics of the model is the support of 5 levels of detail (LOD), starting from a less accurate 2.5D model (LOD0) and ending with a very detailed indoor model (LOD4). Different local government offices and municipalities have different needs when utilizing CityGML models, and the process of model generation depends on local and domain-specific needs. Although the processes (i.e. the tasks and activities) for generating the models differ depending on the utilization purpose, there are also some common tasks (i.e. common denominator processes) in the generation of CityGML models. This paper focuses on defining the common tasks in the generation of LOD (0-2) CityGML models and representing them in a formal way with process modeling diagrams.

  18. Determinants of formal care use and expenses among in-home elderly in Jing’an district, Shanghai, China

    PubMed Central

    Ding, Hansheng; Wang, Changying; Xie, Chunyan; Yang, Yitong; Jin, Chunlin

    2017-01-01

    The need for formal care among the elderly population has been increasing due to their greater longevity and the evolution of family structure. We examined the determinants of the use and expenses of formal care among in-home elderly adults in Shanghai. A two-part model based on the data from the Shanghai Long-Term Care Needs Assessment Questionnaire was applied. A total of 8428 participants responded in 2014 and 7100 were followed up in 2015. The determinants of the probability of using formal care were analyzed in the first part of the model and the determinants of formal care expenses were analyzed in the second part. Demographic indicators, living arrangements, physical health status, and care type in 2014 were selected as independent variables. We found that individuals of older age; women; those with higher Activities of Daily Living (ADL) scores; those without spouse; those with higher income; those suffering from stroke, dementia, lower limb fracture, or advanced tumor; and those with previous experience of formal and informal care were more likely to receive formal care in 2015. Furthermore, age, income and formal care fee in 2014 were significant predictors of formal care expenses in 2015. Taken together, the results showed that formal care provision in Shanghai was not determined by ADL scores, but was instead more related to income. This implied an inappropriate distribution of formal care among the elderly population in Shanghai. Additionally, it appeared difficult for the elderly to quit formal care once they began to use it. These results highlighted the importance of assessing the need for formal care, and suggested that the government offer guidance on formal care use for the elderly. PMID:28448628
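
    A two-part model of this kind is conventionally fit as a logistic part for any use of formal care plus a linear part for (log) expenses among users. The sketch below runs on synthetic data with invented coefficients, purely to show the structure; the paper's covariates include age, sex, ADL score, income, diagnoses, and prior care type:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 500
      X = sm.add_constant(np.column_stack([rng.normal(75, 8, n),    # age
                                           rng.normal(0, 1, n)]))   # income (z-score)
      p_use = 1 / (1 + np.exp(-(X @ np.array([-8.0, 0.1, 0.5]))))
      use = rng.binomial(1, p_use)                                  # part 1 outcome
      log_cost = X @ np.array([2.0, 0.02, 0.3]) + rng.normal(0, 0.5, n)

      part1 = sm.Logit(use, X).fit(disp=0)                   # P(any formal care)
      part2 = sm.OLS(log_cost[use == 1], X[use == 1]).fit()  # expenses given use
      print(part1.params, part2.params)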

  19. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  20. Computational Nosology and Precision Psychiatry

    PubMed Central

    Redish, A. David; Gordon, Joshua A.

    2017-01-01

    This article provides an illustrative treatment of psychiatric morbidity that offers an alternative to the standard nosological model in psychiatry. It considers what would happen if we treated diagnostic categories not as causes of signs and symptoms, but as diagnostic consequences of psychopathology and pathophysiology. This reformulation (of the standard nosological model) opens the door to a more natural description of how patients present—and of their likely responses to therapeutic interventions. In brief, we describe a model that generates symptoms, signs, and diagnostic outcomes from latent psychopathological states. In turn, psychopathology is caused by pathophysiological processes that are perturbed by (etiological) causes such as predisposing factors, life events, and therapeutic interventions. The key advantages of this nosological formulation include (i) the formal integration of diagnostic (e.g., DSM) categories and latent psychopathological constructs (e.g., the dimensions of the Research Domain Criteria); (ii) the provision of a hypothesis or model space that accommodates formal, evidence-based hypothesis testing (using Bayesian model comparison); and (iii) the ability to predict therapeutic responses (using a posterior predictive density), as in precision medicine. These and other advantages are largely promissory at present: The purpose of this article is to show what might be possible, through the use of idealized simulations. PMID:29400354
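
    A toy generative simulation can make the proposed causal chain concrete. The sketch below is entirely invented for illustration (dynamics, loadings, and the diagnostic threshold are arbitrary): a latent pathophysiological state drifts and is perturbed by a life event, drives latent psychopathology, and generates symptoms from which a categorical diagnosis follows as a consequence.

    ```python
    # Invented toy simulation of the proposed causal chain: pathophysiology ->
    # psychopathology -> symptoms -> diagnosis. All dynamics are arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)
    T = 100
    patho = np.zeros(T)
    for t in range(1, T):
        # slow drift, random perturbations, one discrete "life event" at t = 30
        patho[t] = 0.95 * patho[t - 1] + rng.normal(0, 0.1) + (0.5 if t == 30 else 0.0)
    psycho = np.tanh(patho)                                 # latent psychopathology
    symptoms = psycho[:, None] * np.array([1.0, 0.6]) + rng.normal(0, 0.2, (T, 2))
    diagnosis = symptoms.mean(axis=1) > 0.5                 # label as a consequence
    print("time points meeting the diagnostic threshold:", int(diagnosis.sum()))
    ```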

  1. Toward fidelity between specification and implementation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing

    1994-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
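
    The central mechanic described above, deriving test cases from state-transition paths of the formal model, can be sketched as follows. The three-state protocol FSM is invented and far simpler than RMP's model; each enumerated event sequence would be executed against both the model and the implementation to check that they agree.

    ```python
    # Invented three-state protocol FSM; test cases are event sequences
    # enumerated from state-transition paths up to a bounded depth.
    from collections import deque

    fsm = {  # state -> {event: next_state}
        "closed": {"join": "joining"},
        "joining": {"ack": "member", "timeout": "closed"},
        "member": {"data": "member", "leave": "closed"},
    }

    def transition_paths(start, depth):
        tests, queue = [], deque([(start, [])])
        while queue:
            state, path = queue.popleft()
            if path:
                tests.append(path)
            if len(path) < depth:
                for event, nxt in fsm[state].items():
                    queue.append((nxt, path + [event]))
        return tests

    for case in transition_paths("closed", 3):
        print(" -> ".join(case))
    ```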

  2. Verification and validation of a reliable multicast protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.

  3. Transitions of Care in Medical Education: A Compilation of Effective Teaching Methods.

    PubMed

    McBryde, Meagan; Vandiver, Jeremy W; Onysko, Mary

    2016-04-01

    Transitioning patients safely from the inpatient environment back to an outpatient environment is an important component of health care, and multidisciplinary cooperation and formal processes are necessary to accomplish this task. This Transitions of Care (TOC) process is constantly being shaped in health care systems to improve patient safety, outcomes, and satisfaction. While there are many models that have been published on methods to improve the TOC process systematically, there is no clear roadmap for educators to teach TOC concepts to providers in training. This article reviews published data to highlight specific methods shown to effectively instill these concepts and values into medical students and residents. Formal, evidence-based TOC curricula should be developed within medical schools and residency programs. TOC education should ideally begin early in the education process, and its importance should be reiterated throughout the curriculum longitudinally. Curricula should have a specific focus on recognition of common causes of hospital readmissions, such as medication errors, lack of adequate follow-up visits, and social/economic barriers. Use of didactic lectures, case-based workshops, role-playing activities, home visits, interprofessional activities, and resident-led quality improvement projects have all been shown to be effective ways to teach TOC concepts.

  4. Structured population dynamics: continuous size and discontinuous stage structures.

    PubMed

    Buffoni, Giuseppe; Pasquali, Sara

    2007-04-01

    A nonlinear stochastic model for the dynamics of a population with either a continuous size structure or a discontinuous stage structure is formulated in the Eulerian formalism. It takes into account dispersion effects due to stochastic variability of the development process of the individuals. The discrete equations of the numerical approximation are derived, and an analysis of the existence and stability of the equilibrium states is performed. An application to a copepod population is illustrated; numerical results of Eulerian and Lagrangian models are compared.
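
    A minimal Eulerian discretization in this spirit treats individual growth as advection along the size axis and the stochastic variability of development as diffusion. The sketch below is illustrative only; the grid, rates, and initial cohort are invented rather than taken from the copepod application.

    ```python
    # Illustrative Eulerian update of a size density n(x, t): growth as upwind
    # advection, development variability as diffusion. Parameters are invented.
    import numpy as np

    nx, dx, dt = 200, 0.01, 0.0004
    g, D = 0.5, 0.01                        # growth speed, development dispersion
    x = np.linspace(0.0, 2.0, nx)
    n = np.exp(-((x - 0.2) ** 2) / 0.002)   # initial cohort of small individuals

    for _ in range(1000):
        adv = -g * (n - np.roll(n, 1)) / dx                       # upwind advection
        dif = D * (np.roll(n, 1) - 2 * n + np.roll(n, -1)) / dx**2
        n = n + dt * (adv + dif)
        n[0] = 0.0            # no inflow at size zero (wrap-around suppressed)
    print("mean size after growth:", (x * n).sum() / n.sum())
    ```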

  5. Bias and design in software specifications

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1990-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. Presented here is a model of bias in software specifications. Bias is defined in terms of the specification process and a classification of the attributes of the software product. Our definition of bias provides insight into both the origin and the consequences of bias. It also shows that bias is relative and essentially unavoidable. Finally, we describe current work on defining a measure of bias, formalizing our model, and relating bias to software defects.

  6. Structural model of control system for hydraulic stepper motor complex

    NASA Astrophysics Data System (ADS)

    Obukhov, A. D.; Dedov, D. L.; Kolodin, A. N.

    2018-03-01

    The article considers the problem of developing a structural model of the control system for a hydraulic stepper drive complex. A comparative analysis of stepper drives and an assessment of the applicability of HSM for solving problems requiring accurate displacement in space with subsequent positioning of the object are carried out. The presented structural model of the automated control system of the multi-spindle complex of hydraulic stepper drives reflects the main components of the system, as well as the process of its control based on the transfer of control signals to the solenoid valves by the controller. The models and methods described in the article can be used to formalize the control process in technical systems based on the application of hydraulic stepper drives, and allow switching from mechanical control to automated control.

  7. The Influence of Music Learning Cultures on the Construction of Teaching-Learning Conceptions

    ERIC Educational Resources Information Center

    Casas-Mas, Amalia; Pozo, Juan Ignacio; Montero, Ignacio

    2014-01-01

    Current research in music education tends to put the emphasis on learning processes outside formal academic contexts, both to rethink and to renew academic educational formats. Our aim is to observe and describe three music learning cultures simultaneously, including formal, non-formal and informal settings: Classical, Jazz and Flamenco,…

  8. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  9. Mental health courts and their selection processes: modeling variation for consistency.

    PubMed

    Wolff, Nancy; Fabrikant, Nicole; Belenko, Steven

    2011-10-01

    Admission into mental health courts is based on a complicated and often variable decision-making process that involves multiple parties representing different expertise and interests. To the extent that eligibility criteria of mental health courts are more suggestive than deterministic, selection bias can be expected. Very little research has focused on the selection processes underpinning problem-solving courts even though such processes may dominate the performance of these interventions. This article describes a qualitative study designed to deconstruct the selection and admission processes of mental health courts. In this article, we describe a multi-stage, complex process for screening and admitting clients into mental health courts. The selection filtering model that is described has three eligibility screening stages: initial, assessment, and evaluation. The results of this study suggest that clients selected by mental health courts are shaped by the formal and informal selection criteria, as well as by the local treatment system.

  10. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Integrated Vehicle Health Management (IVHM) program is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and is related to the fundamental theoretical concepts.
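
    The vibrational baseline such methods rely on can be sketched from the standard Euler-Bernoulli cantilever result: natural frequencies follow from the roots of cos(bL)cosh(bL) = -1. The beam parameters below are illustrative; a crack or parameter change would shift these frequencies and mode shapes, which is what a detection method monitors.

    ```python
    # Standard Euler-Bernoulli cantilever frequencies from the first three roots
    # of cos(bL)cosh(bL) = -1; the beam parameters are illustrative only.
    import numpy as np

    E, I = 70e9, 1e-8               # Young's modulus (Pa), second moment (m^4)
    rho, A, L = 2700.0, 1e-4, 1.0   # density (kg/m^3), cross-section (m^2), length (m)

    beta_L = np.array([1.8751, 4.6941, 7.8548])              # mode constants
    omega = (beta_L / L) ** 2 * np.sqrt(E * I / (rho * A))   # rad/s
    print("natural frequencies (Hz):", omega / (2 * np.pi))
    ```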

  11. Construction of non-Markovian coarse-grained models employing the Mori-Zwanzig formalism and iterative Boltzmann inversion

    NASA Astrophysics Data System (ADS)

    Yoshimoto, Yuta; Li, Zhen; Kinefuchi, Ikuya; Karniadakis, George Em

    2017-12-01

    We propose a new coarse-grained (CG) molecular simulation technique based on the Mori-Zwanzig (MZ) formalism along with the iterative Boltzmann inversion (IBI). Non-Markovian dissipative particle dynamics (NMDPD) taking into account memory effects is derived in a pairwise interaction form from the MZ-guided generalized Langevin equation. It is based on the introduction of auxiliary variables that allow for the replacement of a non-Markovian equation with a Markovian one in a higher dimensional space. We demonstrate that the NMDPD model exploiting MZ-guided memory kernels can successfully reproduce the dynamic properties such as the mean square displacement and velocity autocorrelation function of a Lennard-Jones system, as long as the memory kernels are appropriately evaluated based on the Volterra integral equation using the force-velocity and velocity-velocity correlations. Furthermore, we find that the IBI correction of a pair CG potential significantly improves the representation of static properties characterized by a radial distribution function and pressure, while it has little influence on the dynamic processes. Our findings suggest that combining the advantages of both the MZ formalism and IBI leads to an accurate representation of both the static and dynamic properties of microscopic systems that exhibit non-Markovian behavior.
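
    The kernel-extraction step can be illustrated with a generic first-kind Volterra deconvolution. The sketch below assumes the relation F(t) = \int_0^t K(s) C(t-s) ds between a force-velocity correlation F, the velocity autocorrelation C, and the memory kernel K; the paper's exact sign and mass conventions may differ, and the correlation functions here are synthetic.

    ```python
    # Hedged sketch: recover a memory kernel K from correlations by deconvolving
    # F(t) = \int_0^t K(s) C(t-s) ds (first-kind Volterra). C and K are
    # synthetic; the paper's exact sign/mass conventions may differ.
    import numpy as np

    dt, T = 0.01, 400
    t = np.arange(T) * dt
    C = np.exp(-t) * np.cos(2 * t)        # stand-in velocity autocorrelation
    K_true = 5.0 * np.exp(-3 * t)         # stand-in memory kernel

    # forward problem: F_i = dt * sum_{j<=i} K_j C_{i-j}
    F = np.array([dt * np.dot(K_true[:i + 1], C[i::-1]) for i in range(T)])

    # inverse problem: recover K step by step from F and C
    K = np.zeros(T)
    for i in range(T):
        conv = np.dot(K[:i], C[i:0:-1]) if i > 0 else 0.0
        K[i] = (F[i] / dt - conv) / C[0]
    print("max kernel recovery error:", np.abs(K - K_true).max())
    ```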

  12. An Evaluation of the Preceptor Model versus the Formal Teaching Model.

    ERIC Educational Resources Information Center

    Shamian, Judith; Lemieux, Suzanne

    1984-01-01

    This study evaluated two teaching methods to determine which is more effective in enhancing the knowledge base of participating nurses: the preceptor model, which embodies decentralized instruction by a member of the nursing staff, and the formal teaching model, which uses centralized teaching by the inservice education department. (JOW)

  13. Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.

    ERIC Educational Resources Information Center

    Jackson, Robert B.; And Others

    1995-01-01

    Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…

  14. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  15. P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)

    NASA Astrophysics Data System (ADS)

    Kropp, Derek L.

    2009-05-01

    One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.

  16. An approach to verification and validation of a reliable multicasting protocol: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. This initial version did not handle off-nominal cases such as network partitions or site failures. Meanwhile, the V&V team concurrently developed a formal model of the requirements using a variant of SCR-based state tables. Based on these requirements tables, the V&V team developed test cases to exercise the implementation. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test in the model and implementation agreed, then the test either found a potential problem or verified a required behavior. However, if the execution of a test was different in the model and implementation, then the differences helped identify inconsistencies between the model and implementation. In either case, the dialogue between both teams drove the co-evolution of the model and implementation. We have found that this interactive, iterative approach to development allows software designers to focus on delivery of nominal functionality while the V&V team can focus on analysis of off-nominal cases. Testing serves as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP. Although RMP has provided our research effort with a rich set of test cases, it also has practical applications within NASA. For example, RMP is being considered for use in the NASA EOSDIS project due to its significant performance benefits in applications that need to replicate large amounts of data to many network sites.

  17. TUNS/TCIS information model/process model

    NASA Technical Reports Server (NTRS)

    Wilson, James

    1992-01-01

    An Information Model comprises graphical and textual notation suitable for describing and defining the problem domain - in our case, TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provides an easy-to-understand methodology for expressing the entities in the problem space, the relationships between entities and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.

  18. From non-trivial geometries to power spectra and vice versa

    NASA Astrophysics Data System (ADS)

    Brooker, D. J.; Tsamis, N. C.; Woodard, R. P.

    2018-04-01

    We review a recent formalism which derives the functional forms of the primordial—tensor and scalar—power spectra of scalar potential inflationary models. The formalism incorporates the case of geometries with non-constant first slow-roll parameter. Analytic expressions for the power spectra are given that explicitly display the dependence on the geometric properties of the background. Moreover, we present the full algorithm for using our formalism, to reconstruct the model from the observed power spectra. Our techniques are applied to models possessing "features" in their potential with excellent agreement.

  19. An approach for investigation of secure access processes at a combined e-learning environment

    NASA Astrophysics Data System (ADS)

    Romansky, Radi; Noninska, Irina

    2017-12-01

    The article discusses an approach to investigate processes for regulating security and privacy control in a heterogeneous e-learning environment realized as a combination of traditional and cloud means and tools. The authors' proposal for a combined architecture of an e-learning system is presented, and the main subsystems and procedures are discussed. A formalization of the processes for using different types of resources (public, private internal, and private external) is proposed. The apparatus of Markov chains (MC) is used for modeling and analytical investigation of secure access to the resources, and some assessments are presented.
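
    A minimal version of such a Markov-chain investigation is sketched below: a four-state chain over access situations whose long-run behavior is the stationary distribution, obtained here from the eigenvector of the transposed transition matrix. The states and transition probabilities are invented, not the paper's.

    ```python
    # Invented four-state access chain; the stationary distribution comes from
    # the left eigenvector of the transition matrix for eigenvalue 1.
    import numpy as np

    states = ["public", "private internal", "private external", "denied"]
    P = np.array([
        [0.6, 0.2, 0.1, 0.1],
        [0.3, 0.5, 0.1, 0.1],
        [0.2, 0.2, 0.5, 0.1],
        [0.5, 0.2, 0.2, 0.1],
    ])
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    print(dict(zip(states, pi.round(3))))
    ```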

  20. Factors That Effect Interagency Collaborations: Lessons During and Following the 2002 Winter Olympics

    DTIC Science & Technology

    2008-03-01

    [Extraction fragments of an interview protocol survive here: the factors probed include motivation, social capital, trust, leadership, interpersonal communication (people skills), shared problem solving, formal control (decision making), strategic planning (structure or process), and barriers; probe/tickler questions ask whether there were incentives and to what extent interdependence was needed.]

  1. A Formal Construction of Term Classes. Technical Report No. TR73-18.

    ERIC Educational Resources Information Center

    Yu, Clement T.

    The computational complexity of a formal process for the construction of term classes for information retrieval is examined. While the process is proven to be difficult computationally, heuristic methods are applied. Experimental results are obtained to illustrate the maximum possible improvement in system performance of retrieval using the formal…

  2. On the Formal Induction of the Press into the Scientific Process.

    ERIC Educational Resources Information Center

    Bausell, R. Barker

    1990-01-01

    Current policies concerning popular dissemination of research results are critiqued, and the formal induction of the press into the scientific process is discussed. Repercussions of a "New England Journal of Medicine" article on the effects of oat bran and low-fiber wheat on health are used to illustrate the issues. (TJH)

  3. Towards a category theory approach to analogy: Analyzing re-representation and acquisition of numerical knowledge.

    PubMed

    Navarrete, Jairo A; Dartnell, Pablo

    2017-08-01

    Category Theory, a branch of mathematics, has shown promise as a modeling framework for higher-level cognition. We introduce an algebraic model for analogy that uses the language of category theory to explore analogy-related cognitive phenomena. To illustrate the potential of this approach, we use this model to explore three objects of study in cognitive literature. First, (a) we use commutative diagrams to analyze an effect of playing particular educational board games on the learning of numbers. Second, (b) we employ a notion called coequalizer as a formal model of re-representation that explains a property of computational models of analogy called "flexibility" whereby non-similar representational elements are considered matches and placed in structural correspondence. Finally, (c) we build a formal learning model which shows that re-representation, language processing and analogy making can explain the acquisition of knowledge of rational numbers. These objects of study provide a picture of acquisition of numerical knowledge that is compatible with empirical evidence and offers insights on possible connections between notions such as relational knowledge, analogy, learning, conceptual knowledge, re-representation and procedural knowledge. This suggests that the approach presented here facilitates mathematical modeling of cognition and provides novel ways to think about analogy-related cognitive phenomena.
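
    The coequalizer construction used in (b) is computable for finite sets: it is the quotient of the codomain by the equivalence relation generated by f(a) ~ g(a). The sketch below implements this with a union-find; the example, which glues the representations "3+4" and "4+3" while leaving "7" separate, is an invented stand-in for re-representation.

    ```python
    # Coequalizer of f, g : A -> B for finite sets: quotient B by the
    # equivalence generated by f(a) ~ g(a), via union-find. The example is an
    # invented stand-in for re-representation of matched elements.
    def coequalizer(A, B, f, g):
        parent = {b: b for b in B}
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x
        for a in A:                             # glue f(a) with g(a)
            ra, rb = find(f(a)), find(g(a))
            if ra != rb:
                parent[ra] = rb
        classes = {}
        for b in B:
            classes.setdefault(find(b), set()).add(b)
        return list(classes.values())

    print(coequalizer(A=["x"], B=["3+4", "4+3", "7"],
                      f=lambda a: "3+4", g=lambda a: "4+3"))
    ```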

  4. Modeling Of Object- And Scene-Prototypes With Hierarchically Structured Classes

    NASA Astrophysics Data System (ADS)

    Ren, Z.; Jensch, P.; Ameling, W.

    1989-03-01

    The success of knowledge-based image analysis methodology and implementation tools depends largely on an appropriately and efficiently built model wherein the domain-specific context information about, and the inherent structure of, the observed image scene have been encoded. For identifying an object in an application environment, a computer vision system needs to know, firstly, the description of the object to be found in an image or in an image sequence and, secondly, the corresponding relationships between object descriptions within the image sequence. This paper presents models of image objects and scenes by means of hierarchically structured classes. Using the topovisual formalism of graph and higraph, we are currently studying principally the relational aspect and data abstraction of the modeling in order to visualize the structural nature resident in image objects and scenes, and to formalize their descriptions. The goal is to expose the structure of the image scene and the correspondence of image objects in the low-level image interpretation process. The object-based system design approach has been applied to build the model base. We utilize the object-oriented programming language C++ for designing, testing and implementing the abstracted entity classes and the operation structures which have been modeled topovisually. The reference images used for modeling prototypes of objects and scenes are from industrial environments as well as medical applications.

  5. Towards a category theory approach to analogy: Analyzing re-representation and acquisition of numerical knowledge

    PubMed Central

    2017-01-01

    Category Theory, a branch of mathematics, has shown promise as a modeling framework for higher-level cognition. We introduce an algebraic model for analogy that uses the language of category theory to explore analogy-related cognitive phenomena. To illustrate the potential of this approach, we use this model to explore three objects of study in cognitive literature. First, (a) we use commutative diagrams to analyze an effect of playing particular educational board games on the learning of numbers. Second, (b) we employ a notion called coequalizer as a formal model of re-representation that explains a property of computational models of analogy called “flexibility” whereby non-similar representational elements are considered matches and placed in structural correspondence. Finally, (c) we build a formal learning model which shows that re-representation, language processing and analogy making can explain the acquisition of knowledge of rational numbers. These objects of study provide a picture of acquisition of numerical knowledge that is compatible with empirical evidence and offers insights on possible connections between notions such as relational knowledge, analogy, learning, conceptual knowledge, re-representation and procedural knowledge. This suggests that the approach presented here facilitates mathematical modeling of cognition and provides novel ways to think about analogy-related cognitive phenomena. PMID:28841643

  6. Ensuring Cross-Cultural Equivalence in Translation of Research Consents and Clinical Documents

    PubMed Central

    Lee, Cheng-Chih; Li, Denise; Arai, Shoshana; Puntillo, Kathleen

    2010-01-01

    The aim of this article is to describe a formal process used to translate research study materials from English into traditional Chinese characters. This process may be useful for translating documents for use by both research participants and clinical patients. A modified Brislin model was used as the systematic translation process. Four bilingual translators were involved, and a Flaherty 3-point scale was used to evaluate the translated documents. The linguistic discrepancies that arise in the process of ensuring cross-cultural congruency or equivalency between the two languages are presented to promote the development of patient-accessible cross-cultural documents. PMID:18948451

  7. Ten Commandments of Formal Methods...Ten Years Later

    NASA Technical Reports Server (NTRS)

    Bowen, Jonathan P.; Hinchey, Michael G.

    2006-01-01

    More than a decade ago, in "Ten Commandments of Formal Methods," we offered practical guidelines for projects that sought to use formal methods. Over the years, the article, which was based on our knowledge of successful industrial projects, has been widely cited and has generated much positive feedback. However, despite this apparent enthusiasm, formal methods use has not greatly increased, and some of the same attitudes about the infeasibility of adopting them persist. Formal methodists believe that introducing greater rigor will improve the software development process and yield software with better structure, greater maintainability, and fewer errors.

  8. Formal Assurance Certifiable Tooling Strategy Final Report

    NASA Technical Reports Server (NTRS)

    Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael

    2017-01-01

    This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.

  9. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While systems engineering process is a program formal management technique and contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are systems and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, and formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and disciplines analyses and integrations, and illustrates the process application in experienced aerostructural designs.

  10. Robustness of the non-Markovian Alzheimer walk under stochastic perturbation

    NASA Astrophysics Data System (ADS)

    Cressoni, J. C.; da Silva, L. R.; Viswanathan, G. M.; da Silva, M. A. A.

    2012-12-01

    The elephant walk model originally proposed by Schütz and Trimper to investigate non-Markovian processes led to the investigation of a series of other random-walk models. Of these, the best known is the Alzheimer walk model, because it was the first model shown to have amnestically induced persistence, i.e., superdiffusion caused by loss of memory. Here we study the robustness of the Alzheimer walk by adding a memoryless stochastic perturbation. Surprisingly, the solution of the perturbed model can be formally reduced to the solutions of the unperturbed model. Specifically, we give an exact solution of the perturbed model by finding a surjective mapping to the unperturbed model.
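
    A simulation sketch of such a perturbed walk is given below, under common conventions for Alzheimer-type walks: at time t the walker recalls a uniformly chosen step from the first f*t steps and repeats it with probability p (reverses it otherwise), while with probability q it instead takes a memoryless random step. All parameter values are illustrative.

    ```python
    # Illustrative simulation: Alzheimer-type walk (memory restricted to the
    # first f*t steps, repeat with probability p) plus a memoryless perturbation
    # that fires with probability q. All parameter values are invented.
    import numpy as np

    rng = np.random.default_rng(7)
    p, f, q, T = 0.9, 0.3, 0.1, 20000
    steps = np.empty(T, dtype=int)
    steps[0] = 1
    for t in range(1, T):
        if rng.random() < q:                                  # memoryless kick
            steps[t] = rng.choice((-1, 1))
        else:
            recalled = steps[rng.integers(0, max(1, int(f * t)))]
            steps[t] = recalled if rng.random() < p else -recalled
    x = np.cumsum(steps)
    print("final displacement:", x[-1], " x(T)^2 / T:", x[-1] ** 2 / T)
    ```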

  11. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  12. Defining Uniform Processes for Remediation, Probation and Termination in Residency Training

    PubMed Central

    Smith, Jessica L.; Lypson, Monica; Silverberg, Mark; Weizberg, Moshe; Murano, Tiffany; Lukela, Michael; Santen, Sally A.

    2017-01-01

    It is important that residency programs identify trainees who progress appropriately, as well as identify residents who fail to achieve educational milestones as expected so they may be remediated. The process of remediation varies greatly across training programs, due in part to the lack of standardized definitions for good standing, remediation, probation, and termination. The purpose of this educational advancement is to propose a clear remediation framework including definitions, management processes, documentation expectations and appropriate notifications. Informal remediation is initiated when a resident’s performance is deficient in one or more of the outcomes-based milestones established by the Accreditation Council for Graduate Medical Education, but not significant enough to trigger formal remediation. Formal remediation occurs when deficiencies are significant enough to warrant formal documentation because informal remediation failed or because issues are substantial. The process includes documentation in the resident’s file and notification of the graduate medical education office; however, the documentation is not disclosed if the resident successfully remediates. Probation is initiated when a resident is unsuccessful in meeting the terms of formal remediation or if initial problems are significant enough to warrant immediate probation. The process is similar to formal remediation but also includes documentation extending to the final verification of training and employment letters. Termination involves other stakeholders and occurs when a resident is unsuccessful in meeting the terms of probation or if initial problems are significant enough to warrant immediate termination. PMID:28116019

  13. Morphological computation and morphological control: steps toward a formal theory and applications.

    PubMed

    Füchslin, Rudolf M; Dzyakanchuk, Andrej; Flumini, Dandolo; Hauser, Helmut; Hunt, Kenneth J; Luchsinger, Rolf H; Reller, Benedikt; Scheidegger, Stephan; Walker, Richard

    2013-01-01

    Morphological computation can be loosely defined as the exploitation of the shape, material properties, and physical dynamics of a physical system to improve the efficiency of a computation. Morphological control is the application of morphological computing to a control task. In its theoretical part, this article sharpens and extends these definitions by suggesting new formalized definitions and identifying areas in which the definitions we propose are still inadequate. We go on to describe three ongoing studies, in which we are applying morphological control to problems in medicine and in chemistry. The first involves an inflatable support system for patients with impaired movement, and is based on macroscopic physics and concepts already tested in robotics. The two other case studies (self-assembly of chemical microreactors; models of induced cell repair in radio-oncology) describe processes and devices on the micrometer scale, in which the emergent dynamics of the underlying physical system (e.g., phase transitions) are dominated by stochastic processes such as diffusion.

  14. A model for medical decision making and problem solving.

    PubMed

    Werner, M

    1995-08-01

    Clinicians confront the classical problem of decision making under uncertainty, but a universal procedure by which they deal with this situation, both in diagnosis and therapy, can be defined. This consists in the choice of a specific course of action from available alternatives so as to reduce uncertainty. Formal analysis shows that the expected value of this process depends on the a priori probabilities confronted, the discriminatory power of the action chosen, and the values and costs associated with possible outcomes. Clinical problem-solving represents the construction of a systematic strategy from multiple decisional building blocks. Depending on the level of uncertainty the physicians attach to their working hypothesis, they can choose among at least four prototype strategies: pattern recognition, the hypothetico-deductive process, arborization, and exhaustion. However, the resolution of real-life problems can involve a combination of these game plans. Formal analysis of each strategy permits definition of its appropriate a priori probabilities, action characteristics, and cost implications.
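
    The expected-value decomposition described above can be worked through for a single diagnostic decision. The sketch below, with invented priors, test characteristics, and utilities, compares treating everyone, treating no one, and testing first with Bayesian updating; the test is worthwhile only when its expected utility (net of cost) exceeds the better of the two untested policies.

    ```python
    # Invented worked example: expected utility of treat-all, treat-none, and
    # test-then-treat, given a prior, sensitivity/specificity, and utilities.
    prior = 0.30                      # a priori disease probability
    sens, spec = 0.90, 0.85           # discriminatory power of the test
    U = {(True, True): 0.9, (True, False): 0.2,     # (disease, treated) -> utility
         (False, True): 0.7, (False, False): 1.0}
    test_cost = 0.01

    def eu_fixed(treat):
        return prior * U[(True, treat)] + (1 - prior) * U[(False, treat)]

    def eu_after(post):               # best action under a posterior probability
        return max(post * U[(True, a)] + (1 - post) * U[(False, a)]
                   for a in (True, False))

    p_pos = prior * sens + (1 - prior) * (1 - spec)
    post_pos = prior * sens / p_pos                    # Bayes update on "+"
    post_neg = prior * (1 - sens) / (1 - p_pos)        # Bayes update on "-"
    eu_test = p_pos * eu_after(post_pos) + (1 - p_pos) * eu_after(post_neg) - test_cost
    print("treat all:", eu_fixed(True), " treat none:", eu_fixed(False),
          " test first:", round(eu_test, 4))
    ```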

  15. Evaluating a Control System Architecture Based on a Formally Derived AOCS Model

    NASA Astrophysics Data System (ADS)

    Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas

    2010-08-01

    Attitude & Orbit Control System (AOCS) refers to a wider class of control systems which are used to determine and control the attitude of the spacecraft while in orbit, based on the information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development - based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.

  16. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  17. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  18. State Event Models for the Formal Analysis of Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles

    2014-01-01

    The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "full-control" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.
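
    A simplified reading of a full-control-style check can be sketched as a traversal of the synchronous product of the two LTSs, requiring in every reachable state pair that the mental model allows exactly the commands the system accepts and at least its observations. This condition and the toy LTSs below are illustrative assumptions and may differ in detail from the paper's formal definition.

    ```python
    # Simplified, assumed reading of a full-control-style check (may differ from
    # the paper's definition): in each reachable pair, the model must allow
    # exactly the system's commands and at least its observations.
    def check_full_control(sys_lts, model_lts, start):
        seen, stack = set(), [start]
        while stack:
            s, m = stack.pop()
            if (s, m) in seen:
                continue
            seen.add((s, m))
            sc, so = sys_lts[s]["commands"], sys_lts[s]["observations"]
            mc, mo = model_lts[m]["commands"], model_lts[m]["observations"]
            if set(sc) != set(mc) or not set(so) <= set(mo):
                return False, (s, m)
            for action, s2 in {**sc, **so}.items():
                stack.append((s2, {**mc, **mo}[action]))
        return True, None

    sys_lts = {
        "off": {"commands": {"power": "on"}, "observations": {}},
        "on": {"commands": {"power": "off"}, "observations": {"beep": "on"}},
    }
    model_lts = {
        "OFF": {"commands": {"power": "ON"}, "observations": {}},
        "ON": {"commands": {"power": "OFF"}, "observations": {"beep": "ON"}},
    }
    print(check_full_control(sys_lts, model_lts, ("off", "OFF")))
    ```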

  19. A Framework for Modeling Competitive and Cooperative Computation in Retinal Processing

    NASA Astrophysics Data System (ADS)

    Moreno-Díaz, Roberto; de Blasio, Gabriel; Moreno-Díaz, Arminda

    2008-07-01

    The structure of the retina suggests that it should be treated (at least from the computational point of view) as a layered computer. Different retinal cells contribute to the coding of the signals down to ganglion cells. Also, because of the nature of the specialization of some ganglion cells, the structure suggests that all these specialization processes should take place at the inner plexiform layer and should be of a local character, prior to a global integration and frequency-spike coding by the ganglion cells. The framework we propose consists of a layered computational structure, where outer layers essentially provide band-pass space-time filtered signals that are progressively delayed, at least for their formal treatment. Specialization is supposed to take place at the inner plexiform layer by the action of spatio-temporal microkernels (acting very locally) having a center-periphery space-time structure. The resulting signals are then integrated by the ganglion cells through macrokernel structures. Practically all types of specialization found in different vertebrate retinas, as well as the quasilinear behavior in some higher vertebrates, can be modeled and simulated within this framework. Finally, possible feedback from central structures is considered. Though its relevance to retinal processing is not definitive, it is included here for the sake of completeness, since it is a formal requisite for recursiveness.
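
    The local center-periphery microkernel described above is commonly idealized as a difference of Gaussians in space combined with a temporal difference. The sketch below builds such a separable space-time kernel; all scales are illustrative rather than physiological.

    ```python
    # Illustrative center-periphery "microkernel": spatial difference of
    # Gaussians times a crude temporal difference. Scales are invented.
    import numpy as np

    def dog_kernel(size=15, sigma_center=1.0, sigma_surround=3.0):
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        gauss = lambda s: np.exp(-(xx**2 + yy**2) / (2 * s**2)) / (2 * np.pi * s**2)
        return gauss(sigma_center) - gauss(sigma_surround)  # ON-center, OFF-surround

    spatial = dog_kernel()
    temporal = np.array([1.0, -1.0])                        # "now minus before"
    microkernel = temporal[:, None, None] * spatial[None, :, :]  # K[t, y, x]
    print(microkernel.shape)   # (2, 15, 15)
    ```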

  20. Improved formalism for precision Higgs coupling fits

    NASA Astrophysics Data System (ADS)

    Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; Karl, Robert; List, Jenny; Ogawa, Tomohisa; Peskin, Michael E.; Tian, Junping

    2018-03-01

    Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. We apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.

  1. Directly executable formal models of middleware for MANET and Cloud Networking and Computing

    NASA Astrophysics Data System (ADS)

    Pashchenko, D. V.; Sadeq Jaafar, Mustafa; Zinkin, S. A.; Trokoz, D. A.; Pashchenko, T. U.; Sinev, M. P.

    2016-04-01

    The article considers some “directly executable” formal models that are suitable for the specification of computing and networking in the cloud environment and in other networks similar to wireless MANETs. These models can be easily programmed and implemented on computer networks.

  2. Applying Automated Theorem Proving to Computer Security

    DTIC Science & Technology

    2008-03-01

    [Only extraction fragments of this report survive here: violations of policy can also be specified in the model under discussion, and La Padula [Pad90] discusses a domain-independent formal model for access control. Reference fragments cite the Computer Science Laboratory, SRI International, Menlo Park, CA, and Pad90: L. J. La Padula, "Formal modeling in a generalized framework for access control".]

  3. Formal mechanization of device interactions with a process algebra

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, Karl; Cohen, Gerald C.

    1992-01-01

    The principal emphasis is to develop a methodology to formally verify correct synchronization and communication of devices in a composed hardware system. Previous system integration efforts have focused on vertical integration of one layer on top of another. This task examines 'horizontal' integration of peer devices. To formally reason about communication, we mechanize a process algebra in the Higher Order Logic (HOL) theorem proving system. Using this formalization we show how four types of device interactions can be represented and verified to behave as specified. The report also describes the specification of a system consisting of an AVM-1 microprocessor and a memory management unit which were verified in previous work. A proof of correct communication is presented, and the extensions to the system specification to add a direct memory access device are discussed.
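
    The kind of synchronization being verified can be illustrated with a toy CCS-style composition in which two device processes either step alone or handshake on complementary actions (written here with a trailing apostrophe). The CPU/MMU mini-protocol and the naming convention are invented for illustration; the report's HOL mechanization is far richer.

    ```python
    # Invented toy CCS-style composition: processes step alone or handshake on
    # complementary actions (a pairs with a').
    def sync_steps(p, q, P, Q):
        """Enabled moves of the composition P|Q from the state pair (p, q)."""
        moves = []
        for a, p2 in P.get(p, {}).items():
            moves.append((a, (p2, q)))                   # P moves alone
        for a, q2 in Q.get(q, {}).items():
            moves.append((a, (p, q2)))                   # Q moves alone
        for a, p2 in P.get(p, {}).items():
            comp = a[:-1] if a.endswith("'") else a + "'"
            if comp in Q.get(q, {}):
                moves.append(("tau", (p2, Q[q][comp])))  # synchronized handshake
        return moves

    CPU = {"idle": {"req'": "wait"}, "wait": {"ack": "idle"}}
    MMU = {"ready": {"req": "serve"}, "serve": {"ack'": "ready"}}
    print(sync_steps("idle", "ready", CPU, MMU))
    ```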

  4. (abstract) Formal Inspection Technology Transfer Program

    NASA Technical Reports Server (NTRS)

    Welz, Linda A.; Kelly, John C.

    1993-01-01

    A Formal Inspection Technology Transfer Program, based on the inspection process developed by Michael Fagan at IBM, has been developed at JPL. The goal of this program is to support organizations wishing to use Formal Inspections to improve the quality of software and system level engineering products. The Technology Transfer Program provides start-up materials and assistance to help organizations establish their own Formal Inspection program. The course materials and certified instructors associated with the Technology Transfer Program have proven to be effective in classes taught at other NASA centers as well as at JPL. Formal Inspections (NASA tailored Fagan Inspections) are a set of technical reviews whose objective is to increase quality and reduce the cost of software development by detecting and correcting errors early. A primary feature of inspections is the removal of engineering errors before they amplify into larger and more costly problems downstream in the development process. Note that the word 'inspection' is used differently in software than in a manufacturing context. A Formal Inspection is a front-end quality enhancement technique, rather than a task conducted just prior to product shipment for the purpose of sorting defective systems (manufacturing usage). Formal Inspections are supporting and in agreement with the 'total quality' approach being adopted by many NASA centers.

  5. Adolescent thinking à la Piaget: The formal stage.

    PubMed

    Dulit, E

    1972-12-01

    Two of the formal-stage experiments of Piaget and Inhelder, selected largely for their closeness to the concepts defining the stage, were replicated with groups of average and gifted adolescents. This report describes the relevant Piagetian concepts (formal stage, concrete stage) in context, gives the methods and findings of this study, and concludes with a section discussing implications and making some reformulations which generally support but significantly qualify some of the central themes of the Piaget-Inhelder work. Fully developed formal-stage thinking emerges as far from commonplace among normal or average adolescents (by marked contrast with the impression created by the Piaget-Inhelder text, which chooses to report no middle or older adolescents who function at less than fully formal levels). In this respect, the formal stage differs appreciably from the earlier Piagetian stages, and early adolescence emerges as the age for which a "single path" model of cognitive development becomes seriously inadequate and a more complex model becomes essential. Formal-stage thinking seems best conceptualized, like most other aspects of psychological maturity, as a potentiality only partially attained by most and fully attained only by some.

  6. Towards Formal Implementation of PUS Standard

    NASA Astrophysics Data System (ADS)

    Ilić, D.

    2009-05-01

    As an effort to promote the reuse of on-board and ground systems, ESA developed a standard for packet telemetry and telecommand - PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions can then choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for a concrete application. Our formal models allow us to formally express and verify specific service properties, including various telecommand and telemetry packet structure validations.

  7. Methods for design and evaluation of integrated hardware-software systems for concurrent computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) a sparse matrix iterative solver in PISCES Fortran; (5) an image processing application of PISCES; and (6) a formal model of concurrent computation being developed.

  8. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
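
    As a hypothetical example of the kind of biological property such a verification module might check, a CTL-style formula could state that nutrient depletion inevitably leads to a stable starvation-response state; the atomic propositions below are invented, not taken from the GNA model.

    ```latex
    % Hypothetical CTL-style property (atomic propositions invented): once
    % nutrients are depleted, the network inevitably settles into a stable
    % starvation-response state in which rrn activity remains low.
    \[
    \mathrm{AG}\,\bigl(\mathit{depleted} \rightarrow
        \mathrm{AF}\,(\mathit{stableState} \land \mathrm{AG}\,\lnot\,\mathit{rrnHigh})\bigr)
    \]
    ```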

  9. A Management Information System Model for Program Management. Ph.D. Thesis - Oklahoma State Univ.; [Computerized Systems Analysis

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1972-01-01

    The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: types of messages, destinations, delivery durations, types of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose Simulation System 2 (GPSS II) computer programming language for use on the Univac 1108 computer under Executive 8. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.
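
    The queue and utilization statistics such a model collects at each decision point can be illustrated with a single-server message queue. The sketch below simulates an M/M/1 decision point with invented arrival and service rates and reports utilization and mean waiting time.

    ```python
    # Illustrative M/M/1 decision point with invented rates: single FIFO
    # server, reporting utilization and mean queueing delay.
    import numpy as np

    rng = np.random.default_rng(3)
    lam, mu, n_msgs = 0.8, 1.0, 50_000        # arrival and service rates
    arrivals = np.cumsum(rng.exponential(1 / lam, n_msgs))
    service = rng.exponential(1 / mu, n_msgs)

    start = np.empty(n_msgs)
    finish = np.empty(n_msgs)
    for i in range(n_msgs):
        start[i] = arrivals[i] if i == 0 else max(arrivals[i], finish[i - 1])
        finish[i] = start[i] + service[i]

    print(f"utilization ~ {service.sum() / finish[-1]:.3f}, "
          f"mean wait ~ {np.mean(start - arrivals):.3f}")
    ```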

  10. Production model in the conditions of unstable demand taking into account the influence of trading infrastructure: Ergodicity and its application

    NASA Astrophysics Data System (ADS)

    Obrosova, N. K.; Shananin, A. A.

    2015-04-01

    A production model with allowance for a working capital deficit and a restricted maximum possible sales volume is proposed and analyzed. The study is motivated by an attempt to analyze the problems of functioning of macroeconomic structures with low competitiveness. The model is formalized in the form of a Bellman equation, for which a closed-form solution is found. The stochastic process of product stock variations is proved to be ergodic, and its final probability distribution is found. Expressions for the average production load and the average product stock are found by analyzing the stochastic process. A system of model equations relating the model variables to official statistical parameters is derived. The model is identified using data from the Fiat and KAMAZ companies. The influence of the credit interest rate on the assessment of the firm's market value and on the production load level is analyzed using comparative statics methods.
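
    For readers unfamiliar with the formalism, the sketch below shows the generic value-iteration treatment of a Bellman equation for a toy stock-control problem. The paper itself derives a closed-form solution; every parameter here is invented rather than taken from the Fiat/KAMAZ data:

        # Toy Bellman equation: choose production each period to maximize
        # discounted expected profit under stochastic demand and holding costs.
        import numpy as np

        S = np.arange(0, 21)             # feasible product stock levels
        actions = np.arange(0, 6)        # units produced per period
        price, cost, hold, beta = 4.0, 2.5, 0.1, 0.95
        demand, p_dem = np.array([0, 1, 2, 3]), np.array([0.2, 0.4, 0.3, 0.1])

        V = np.zeros(len(S))
        for _ in range(500):
            V_new = np.empty_like(V)
            for i, s in enumerate(S):
                best = -np.inf
                for a in actions:
                    ev = 0.0
                    for d, p in zip(demand, p_dem):
                        sold = min(s + a, d)                # sales capped by stock on hand
                        s_next = min(s + a - sold, S[-1])   # unsold stock carries over
                        ev += p * (price * sold - cost * a - hold * s_next + beta * V[s_next])
                    best = max(best, ev)
                V_new[i] = best
            if np.max(np.abs(V_new - V)) < 1e-8:            # value function converged
                break
            V = V_new
        print("value of starting with empty stock:", round(V[0], 2))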

  11. Definition and determination of the triplet-triplet energy transfer reaction coordinate.

    PubMed

    Zapata, Felipe; Marazzi, Marco; Castaño, Obis; Acuña, A Ulises; Frutos, Luis Manuel

    2014-01-21

    A definition of the triplet-triplet energy transfer reaction coordinate within the very weak electronic coupling limit is proposed, and a novel theoretical formalism is developed for its quantitative determination in terms of internal coordinates. The present formalism permits (i) the separation of donor and acceptor contributions to the reaction coordinate, (ii) the identification of the intrinsic role of donor and acceptor in the triplet energy transfer process, and (iii) the quantification of the effect of every internal coordinate on the transfer process. This formalism is general and can be applied to classical as well as to nonvertical triplet energy transfer processes. The utility of the novel formalism is demonstrated here by its application to the paradigm of nonvertical triplet-triplet energy transfer involving cis-stilbene as acceptor molecule. In this way the effect of each internal molecular coordinate in promoting the transfer rate, from triplet donors in the low- and high-energy limits, could be analyzed in detail.

  12. Definition and determination of the triplet-triplet energy transfer reaction coordinate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zapata, Felipe; Marazzi, Marco; Castaño, Obis

    2014-01-21

    A definition of the triplet-triplet energy transfer reaction coordinate within the very weak electronic coupling limit is proposed, and a novel theoretical formalism is developed for its quantitative determination in terms of internal coordinates. The present formalism permits (i) the separation of donor and acceptor contributions to the reaction coordinate, (ii) the identification of the intrinsic role of donor and acceptor in the triplet energy transfer process, and (iii) the quantification of the effect of every internal coordinate on the transfer process. This formalism is general and can be applied to classical as well as to nonvertical triplet energy transfer processes. The utility of the novel formalism is demonstrated here by its application to the paradigm of nonvertical triplet-triplet energy transfer involving cis-stilbene as acceptor molecule. In this way the effect of each internal molecular coordinate in promoting the transfer rate, from triplet donors in the low- and high-energy limits, could be analyzed in detail.

  13. Understanding visualization: a formal approach using category theory and semiotics.

    PubMed

    Vickers, Paul; Faith, Joe; Rossiter, Nick

    2013-06-01

    This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.

  14. Dynamic Computation of Change Operations in Version Management of Business Process Models

    NASA Astrophysics Data System (ADS)

    Küster, Jochen Malte; Gerth, Christian; Engels, Gregor

    Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable on the model and dependencies and conflicts of change operations must be taken into account because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations where parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation and provide a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account within change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.

  15. Creation of system of computer-aided design for technological objects

    NASA Astrophysics Data System (ADS)

    Zubkova, T. M.; Tokareva, M. A.; Sultanov, N. Z.

    2018-05-01

    Due to the competition in the market of process equipment, its production should be flexible, retooling for various product configurations, raw materials and productivity, depending on current market needs. This process is not possible without CAD (computer-aided design). The formation of CAD begins with planning. Synthesizing, analyzing, evaluating, converting operations, as well as visualization and decision-making operations, can be automated. Based on a formal description of the design procedures, the design route is constructed in the form of an oriented graph. The decomposition of the design process, represented by the formalized description of the design procedures, makes it possible to make an informed choice of the CAD component for the solution of the task. The object-oriented approach allows us to consider the CAD as an independent system whose properties are inherited from the components. The first step determines the range of tasks to be performed by the system, and a set of components for their implementation. The second is the configuration of the selected components. The interaction between the selected components is carried out using the CALS standards. The chosen CAD/CAE-oriented approach allows creating a single model, which is stored in the database of the subject area. Each of the integration stages is implemented as a separate functional block. The transformation of the CAD model into the model of the internal representation is realized by the block that searches for the geometric parameters of the technological machine, in which the XML model of the construction is obtained on the basis of the feature method from the theory of image recognition. The configuration of integrated components is divided into three consecutive steps: configuring tasks, components, and interfaces. The configuration of the components is realized using the theory of soft computing, applying the Mamdani fuzzy inference algorithm.
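
    A minimal sketch of the Mamdani fuzzy inference step mentioned above (Python; the membership functions, the two rules, and the universes are invented for illustration, not taken from the paper):

        # Two-rule Mamdani inference with triangular memberships and
        # centroid defuzzification.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

        y = np.linspace(0.0, 10.0, 1001)                  # output universe (priority)
        out_low, out_high = tri(y, -2, 1, 5), tri(y, 5, 9, 12)

        def mamdani_priority(complexity):
            # rule 1: IF complexity is low  THEN priority is low
            # rule 2: IF complexity is high THEN priority is high
            w_low = float(tri(complexity, -2, 2, 6))      # firing strengths
            w_high = float(tri(complexity, 4, 8, 12))
            # clip each consequent at its firing strength, aggregate by max
            agg = np.maximum(np.minimum(w_low, out_low), np.minimum(w_high, out_high))
            return float((y * agg).sum() / agg.sum())     # centroid defuzzification

        print(round(mamdani_priority(7.0), 2))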

  16. Emerging Interaction of Political Processes: The Effect on a Study Abroad Program in Cuba

    ERIC Educational Resources Information Center

    Clarke, Ruth

    2007-01-01

    The emerging interaction of political processes sets the stage for the level of macro uncertainty and specific risk events that may occur in an international relationship. Strongly defined social control in Cuba, formal and informal, dominates the dynamics of the relationship, while, simultaneously, formal government action in the U.S. dominates…

  17. The Role of the Board of Education in the Process of Resource Allocation for Public Schools.

    ERIC Educational Resources Information Center

    Chichura, Elaine Marie

    Public schools as formal organizations have broad-based goals, limited resources, and a formal hierarchy with which to manage the goal achievement process. The board of education combines this organization's economic and political dimensions to provide a thorough, efficient education for all children in the state. This paper investigates the…

  18. 50 CFR 260.30 - Report of inspection results prior to issuance of formal report.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false Report of inspection results prior to issuance of formal report. 260.30 Section 260.30 Wildlife and Fisheries NATIONAL MARINE FISHERIES SERVICE, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE PROCESSED FISHERY PRODUCTS, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER...

  19. 50 CFR 260.30 - Report of inspection results prior to issuance of formal report.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 11 2012-10-01 2012-10-01 false Report of inspection results prior to issuance of formal report. 260.30 Section 260.30 Wildlife and Fisheries NATIONAL MARINE FISHERIES SERVICE, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE PROCESSED FISHERY PRODUCTS, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER...

  20. 50 CFR 260.30 - Report of inspection results prior to issuance of formal report.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 50 Wildlife and Fisheries 11 2014-10-01 2014-10-01 false Report of inspection results prior to issuance of formal report. 260.30 Section 260.30 Wildlife and Fisheries NATIONAL MARINE FISHERIES SERVICE, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE PROCESSED FISHERY PRODUCTS, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER...

  1. 50 CFR 260.30 - Report of inspection results prior to issuance of formal report.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 11 2013-10-01 2013-10-01 false Report of inspection results prior to issuance of formal report. 260.30 Section 260.30 Wildlife and Fisheries NATIONAL MARINE FISHERIES SERVICE, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE PROCESSED FISHERY PRODUCTS, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER...

  2. 50 CFR 260.30 - Report of inspection results prior to issuance of formal report.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 9 2011-10-01 2011-10-01 false Report of inspection results prior to issuance of formal report. 260.30 Section 260.30 Wildlife and Fisheries NATIONAL MARINE FISHERIES SERVICE, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE PROCESSED FISHERY PRODUCTS, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER...

  3. School Administrators' Beliefs that School Improvements Were Due to Formal School Registration: A Rasch Measurement

    ERIC Educational Resources Information Center

    Witten, Harm; Waugh, Russell; Gray, Jan

    2012-01-01

    This paper presents the results of an investigation into the attitudes of School Administrators to the relationship between formal school registration and school improvement. It concerns a mandatory inspection-type registration process for all Non-Government Schools in Western Australia. Part of the aim of this registration process was to help…

  4. Non Linear Programming (NLP) Formulation for Quantitative Modeling of Protein Signal Transduction Pathways

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Lauffenburger, Douglas A.; Alexopoulos, Leonidas G.

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical models based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next, and have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, resulting in quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements, and ii) a loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms. PMID:23226239
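
    The sketch below illustrates the underlying idea of recasting model training as a regular bounded nonlinear program: fitting a smooth transfer function between a stimulus and a readout. The transfer function and the data are invented; the paper's actual formulation couples many such logic gates across a phosphoproteomic network:

        # Fit a Hill-type transfer function to (invented) stimulus/response data
        # by bounded continuous optimization, the NLP idea in miniature.
        import numpy as np
        from scipy.optimize import minimize

        signal = np.array([0.0, 0.1, 0.25, 0.5, 0.75, 1.0])       # normalized stimulus
        readout = np.array([0.02, 0.05, 0.20, 0.55, 0.80, 0.90])  # measured response

        def hill(x, k, n):
            return x**n / (k**n + x**n)

        def sse(p):
            k, n = p
            return np.sum((hill(signal, k, n) - readout) ** 2)

        res = minimize(sse, x0=[0.5, 2.0], bounds=[(0.01, 1.0), (1.0, 6.0)])
        print("fitted k, n:", res.x, "SSE:", res.fun)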

  5. The stochastic system approach for estimating dynamic treatments effect.

    PubMed

    Commenges, Daniel; Gégout-Petit, Anne

    2015-10-01

    The problem of assessing the effect of a treatment on a marker in observational studies raises the difficulty that attribution of the treatment may depend on the observed marker values. As an example, we focus on the analysis of the effect of HAART on CD4 counts. This problem has been treated using marginal structural models relying on the counterfactual/potential response formalism. Another approach to causality is based on dynamical models, and causal influence has been formalized in the framework of the Doob-Meyer decomposition of stochastic processes. Causal inference, however, needs assumptions that we detail in this paper, and we call this approach to causality the "stochastic system" approach. First we treat this problem in discrete time, then in continuous time. This approach allows biological knowledge to be incorporated naturally. When working in continuous time, the mechanistic approach involves distinguishing the model for the system from the model for the observations. Indeed, biological systems live in continuous time, and mechanisms can be expressed in the form of a system of differential equations, while observations are taken at discrete times. Inference in mechanistic models is challenging, particularly from a numerical point of view, but these models can yield much richer and more reliable results.

  6. Formal analysis and evaluation of the back-off procedure in IEEE802.11P VANET

    NASA Astrophysics Data System (ADS)

    Jin, Li; Zhang, Guoan; Zhu, Xiaojun

    2017-07-01

    The back-off procedure is one of the media access control technologies in the 802.11P communication protocol. It plays an important role in avoiding message collisions and allocating channel resources. Formal methods are effective approaches for studying the performance of communication systems. In this paper, we establish a discrete-time model for the back-off procedure. We use Markov Decision Processes (MDPs) to model the non-deterministic and probabilistic behaviors of the procedure, and use the probabilistic computation tree logic (PCTL) language to express different properties, which ensure that the discrete-time model performs its basic functionality. Based on the model and the PCTL specifications, we study the effect of the contention window length on the number of senders in the neighborhood of given receivers, and on the station's expected cost required by the back-off procedure to successfully send packets. The variation of the window length may increase or decrease the maximum probability of correct transmissions within a time contention unit. We propose to use the PRISM model checker to describe our proposed back-off procedure for the IEEE802.11P protocol in vehicular networks, and define different probabilistic property formulas to automatically verify the model and derive numerical results. The obtained results are helpful for justifying the values of the time contention unit.
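
    A minimal sketch of the kind of quantity PRISM computes for such models: the maximum probability of reaching a "sent" state within a bounded number of slots in a toy MDP. The states, actions, and probabilities below are invented for illustration, not taken from the 802.11P model:

        # Bounded-reachability value iteration over a tiny MDP, i.e. the
        # quantity Pmax=? [ F<=k "sent" ] in PCTL notation.
        # States: 0 = ready, 1 = backing off, 2 = sent (absorbing).
        mdp = {
            0: [{2: 0.6, 1: 0.4},    # action "transmit now": succeed or collide
                {1: 1.0}],           # action "defer": enter back-off
            1: [{0: 0.8, 1: 0.2}],   # back-off timer: return to ready w.p. 0.8
            2: [{2: 1.0}],
        }

        def pmax_reach(target, horizon):
            p = {s: 1.0 if s == target else 0.0 for s in mdp}
            for _ in range(horizon):
                # maximize over nondeterministic actions at every state
                p = {s: (1.0 if s == target else
                         max(sum(pr * p[t] for t, pr in dist.items())
                             for dist in mdp[s]))
                     for s in mdp}
            return p

        print(pmax_reach(target=2, horizon=6)[0])  # max prob. of sending within 6 slots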

  7. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    PubMed

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical models based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next, and have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, resulting in quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements, and ii) a loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms.

  8. Improved formalism for precision Higgs coupling fits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon

    Future e +e – colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e +e – data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e +e – colliders.

  9. Improved formalism for precision Higgs coupling fits

    DOE PAGES

    Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; ...

    2018-03-20

    Future e +e – colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e +e – data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e +e – colliders.

  10. An unsupervised machine learning model for discovering latent infectious diseases using social media data.

    PubMed

    Lim, Sunghoon; Tucker, Conrad S; Kumara, Soundar

    2017-02-01

    The authors of this work propose an unsupervised machine learning model that has the ability to identify real-world latent infectious diseases by mining social media data. In this study, a latent infectious disease is defined as a communicable disease that has not yet been formalized by national public health institutes and explicitly communicated to the general public. Most existing approaches to modeling infectious-disease-related knowledge discovery through social media networks are top-down approaches that are based on already known information, such as the names of diseases and their symptoms. In existing top-down approaches, necessary but unknown information, such as disease names and symptoms, is mostly unidentified in social media data until national public health institutes have formalized that disease. Most of the formalizing processes for latent infectious diseases are time consuming. Therefore, this study presents a bottom-up approach for latent infectious disease discovery in a given location without prior information, such as disease names and related symptoms. Social media messages with user and temporal information are extracted during the data preprocessing stage. An unsupervised sentiment analysis model is then presented. Users' expressions about symptoms, body parts, and pain locations are also identified from social media data. Then, symptom weighting vectors for each individual and time period are created, based on their sentiment and social media expressions. Finally, latent-infectious-disease-related information is retrieved from individuals' symptom weighting vectors. Twitter data from August 2012 to May 2013 are used to validate this study. Real electronic medical records for 104 individuals, who were diagnosed with influenza in the same period, serve as ground-truth validation. The results are promising, with the highest precision, recall, and F1 score values of 0.773, 0.680, and 0.724, respectively. This work uses individuals' social media messages to identify latent infectious diseases, without prior information, more quickly than the disease(s) can be formalized by national public health institutes. In particular, the unsupervised machine learning model using user, textual, and temporal information in social media data, along with sentiment analysis, identifies latent infectious diseases in a given location. Copyright © 2016 Elsevier Inc. All rights reserved.
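
    The reported figures are internally consistent: given the stated precision and recall, the F1 score (their harmonic mean) follows directly:

        precision, recall = 0.773, 0.680
        f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
        print(round(f1, 3))  # 0.724, matching the reported value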

  11. First results on applying a non-linear effect formalism to alliances between political parties and buy and sell dynamics

    NASA Astrophysics Data System (ADS)

    Bagarello, F.; Haven, E.

    2016-02-01

    We discuss a non-linear extension of a model of alliances in politics, recently proposed by one of us. The model is constructed in terms of operators describing the interest of three parties in forming, or not, a political alliance with the other parties. The time evolution of what we call the decision functions is deduced by introducing a suitable Hamiltonian, which describes the main effects of the interactions of the parties among themselves and with their environments, generated by their electors and by people who still have no clear idea which party to vote for (or whether to vote at all). The Hamiltonian contains some non-linear effects, which take into account the role of a party in the decision process of the other two parties. Moreover, we show how the same Hamiltonian can also be used to construct a formal structure that can describe the dynamics of buying and selling financial assets (without, however, implying a specific price-setting mechanism).

  12. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.

  13. Reference Architecture Model Enabling Standards Interoperability.

    PubMed

    Blobel, Bernd

    2017-01-01

    Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, but also the individual context of all actors involved. So, the system must properly reflect the environment in front of and around the computer as an essential and even defining part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.

  14. Problem solving in the borderland between mathematics and physics

    NASA Astrophysics Data System (ADS)

    Jensen, Jens Højgaard; Niss, Martin; Jankvist, Uffe Thomas

    2017-01-01

    The article addresses the question of where mathematization is taught in the educational system, and who teaches it. Mathematization is usually not a part of mathematics programs at the upper secondary level, but we argue that physics teaching has something to offer in this respect, if it focuses on solving so-called unformalized problems, where a major challenge is to formalize the problems in mathematical and physical terms. We analyse four concrete examples of unformalized problems for which the formalization involves different orders of mathematization and of applying physics to the problem, but all of which require mathematization. The analysis leads to the formulation of a model by which we attempt to capture the important steps of the process of solving unformalized problems by means of mathematization and physicalization.

  15. The redefinition of the familialist home care model in France: the complex formalization of care through cash payment.

    PubMed

    Le Bihan, Blanche

    2012-05-01

    This article investigates the impact of policy measures on the organisation of home-based care for older people in France, by examining the balance between formal and informal care and the redefinition of the initial familialist model. It focuses on the specific cash-for-care scheme (the Allocation personnalisée d'autonomie - Personalised allowance for autonomy) which is at the core of the French home-based care policy. The author argues that in a redefined context of 'welfare mix', the French public strategy for supporting home-based care is articulated around two major objectives, which can appear contradictory. It aims to formalise a professional care sector, with respect to employment policy, while allowing the development of new forms of informal care, which cannot be considered formal employment. The data collection is two-fold. Firstly, a detailed analysis was made of different policy documents and public reports, together with a systematic review of existing studies. Secondly, statistical data on home-based care resources were collected, which was not easy, as home-care services for older people in France are part of a larger sector of activity, 'personal services' (services à la personne). The article presents three main findings. First, it highlights the complexity of the formalisation process related to the introduction of the French care allowance and demonstrates that formalisation, which facilitates the recognition of care as work, does not necessarily mean professionalisation. Second, it outlines the diversity of the resources available: heterogeneous professional care, semi-formal forms of care work with the possibility to employ a relative, and informal family care. Finally, the analysis outlines the importance of the regulation of cash payments on the reshaping of formal and informal care and comments on its impact on the redefinition of informal caring activities. © 2012 Blackwell Publishing Ltd.

  16. Does informal care reduce public care expenditure on elderly care? Estimates based on Finland’s Age Study

    PubMed Central

    2013-01-01

    Background To formulate sustainable long-term care policies, it is critical first to understand the relationship between informal care and formal care expenditure. The aim of this paper is to examine to what extent informal care reduces public expenditure on elderly care. Methods Data from a geriatric rehabilitation program conducted in Finland (Age Study, n = 732) were used to estimate the annual public care expenditure on elderly care. We first constructed hierarchical multilevel regression models to determine the factors associated with elderly care expenditure. Second, we calculated the adjusted mean costs of care in four care patterns: 1) informal care only for elderly living alone; 2) informal care only from a co-resident family member; 3) a combination of formal and informal care; and 4) formal care only. We included functional independence and health-related quality of life (15D score) measures into our models. This method standardizes the care needs of a heterogeneous subject group and enabled us to compare expenditure among various care categories even when differences were observed in the subjects’ physical health. Results Elder care that consisted of formal care only had the highest expenditure at 25,300 Euros annually. The combination of formal and informal care had an annual expenditure of 22,300 Euros. If a person received mainly informal care from a co-resident family member, then the annual expenditure was only 4,900 Euros and just 6,000 Euros for a person living alone and receiving informal care. Conclusions Our analysis of a frail elderly Finnish population shows that the availability of informal care considerably reduces public care expenditure. Therefore, informal care should be taken into account when formulating policies for long-term care. The process whereby families choose to provide care for their elderly relatives has a significant impact on long-term care expenditure. PMID:23947622

  17. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotics and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
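
    A minimal sketch of the event-based control idea: after issuing a command, the controller expects a confirming sensor event inside a time window derived from its model of the plant, and flags a fault otherwise. Times and window bounds below are illustrative:

        # Check whether any sensor event confirms a command within its window.
        def monitor(command_time, window, sensor_events):
            lo, hi = command_time + window[0], command_time + window[1]
            confirmed = any(lo <= t <= hi for t in sensor_events)
            return "ok" if confirmed else "fault: no confirmation in window"

        print(monitor(0.0, (0.5, 2.0), sensor_events=[1.3]))  # ok
        print(monitor(0.0, (0.5, 2.0), sensor_events=[3.1]))  # fault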

  18. Molecular collision processes in the presence of picosecond laser pulses

    NASA Technical Reports Server (NTRS)

    Lee, H. W.; George, T. F.

    1979-01-01

    Radiative transitions in molecular collision processes taking place in the presence of picosecond pulses are studied within a semiclassical formalism. An expression for adiabatic potential surfaces in the electronic-field representation is obtained, which directly leads to the evaluation of transition probabilities. Calculations with a Landau-Zener-type model indicate that picosecond pulses can be much more effective in inducing transitions than a single long pulse of the same intensity and the same total energy, if the intensity is sufficiently high that the perturbation treatment is not valid.
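
    For orientation, a Landau-Zener-type estimate of the diabatic transition probability can be sketched as follows; the coupling and sweep-rate values are arbitrary, chosen only to show the exponential sensitivity that underlies the intensity dependence discussed above:

        # Standard Landau-Zener probability of a diabatic passage,
        # P = exp(-2*pi*V**2 / (hbar * |d(dE)/dt|)), where V is the coupling
        # (half the avoided-crossing splitting) and the denominator is the
        # sweep rate of the diabatic energy gap.
        import math

        HBAR = 1.0545718e-34  # J*s

        def lz_diabatic(V, gap_sweep_rate):
            return math.exp(-2 * math.pi * V**2 / (HBAR * gap_sweep_rate))

        # a stronger (e.g. field-dressed) coupling suppresses the diabatic hop
        for V in (1e-22, 3e-22):                       # J (assumed)
            print(V, lz_diabatic(V, gap_sweep_rate=1e-9))  # J/s (assumed)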

  19. δM formalism and anisotropic chaotic inflation power spectrum

    NASA Astrophysics Data System (ADS)

    Talebian-Ashkezari, A.; Ahmadi, N.

    2018-05-01

    A new analytical approach to linear perturbations in anisotropic inflation has been introduced in [A. Talebian-Ashkezari, N. Ahmadi and A.A. Abolhasani, JCAP 03 (2018) 001] under the name of the δM formalism. In this paper we apply this approach to a model of anisotropic inflation driven by a scalar field, coupled to the kinetic term of a vector field with a U(1) symmetry. The δM formalism provides an efficient way of computing tensor-tensor, tensor-scalar as well as scalar-scalar 2-point correlations that are needed for the analysis of the observational features of an anisotropic model on the CMB. A comparison between δM results and the tedious calculations using the in-in formalism shows the aptitude of the δM formalism for calculating accurate two-point correlation functions between physical modes of the system.

  20. Learning Goal Orientation, Formal Mentoring, and Leadership Competence in HRD: A Conceptual Model

    ERIC Educational Resources Information Center

    Kim, Sooyoung

    2007-01-01

    Purpose: The purpose of this paper is to suggest a conceptual model of formal mentoring as a leadership development initiative including "learning goal orientation", "mentoring functions", and "leadership competencies" as key constructs of the model. Design/methodology/approach: Some empirical studies, though there are not many, will provide…

  1. Transforming Undergraduate Education Through the use of Analytical Reasoning (TUETAR)

    NASA Astrophysics Data System (ADS)

    Bishop, M. P.; Houser, C.; Lemmons, K.

    2015-12-01

    Traditional learning limits the potential for self-discovery, and the use of data and knowledge to understand Earth system relationships, processes, feedback mechanisms and system coupling. It is extremely difficult for undergraduate students to analyze, synthesize, and integrate quantitative information related to complex systems, as many concepts may not be mathematically tractable or yet to be formalized. Conceptual models have long served as a means for Earth scientists to organize their understanding of Earth's dynamics, and have served as a basis for human analytical reasoning and landscape interpretation. Consequently, we evaluated the use of conceptual modeling, knowledge representation and analytical reasoning to provide undergraduate students with an opportunity to develop and test geocomputational conceptual models based upon their understanding of Earth science concepts. This study describes the use of geospatial technologies and fuzzy cognitive maps to predict desertification across the South-Texas Sandsheet in an upper-level geomorphology course. Students developed conceptual models based on their understanding of aeolian processes from lectures, and then compared and evaluated their modeling results against an expert conceptual model and spatial predictions, and the observed distribution of dune activity in 2010. Students perceived that the analytical reasoning approach was significantly better for understanding desertification compared to traditional lecture, and promoted reflective learning, working with data, teamwork, student interaction, innovation, and creative thinking. Student evaluations support the notion that the adoption of knowledge representation and analytical reasoning in the classroom has the potential to transform undergraduate education by enabling students to formalize and test their conceptual understanding of Earth science. A model for developing and utilizing this geospatial technology approach in Earth science is presented.
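
    A minimal fuzzy cognitive map of the kind students could build is sketched below; the three concepts and the signed weights are invented stand-ins for the aeolian-process concepts used in the course exercise:

        # Fuzzy cognitive map: concepts are nodes, signed weights encode causal
        # beliefs, and the state is iterated through a squashing function.
        import numpy as np

        concepts = ["vegetation cover", "sand transport", "dune activity"]
        W = np.array([          # W[i, j]: influence of concept j on concept i
            [ 0.0, -0.4,  0.0], # sand transport reduces vegetation cover
            [-0.7,  0.0,  0.0], # vegetation cover reduces sand transport
            [ 0.0,  0.8,  0.0], # sand transport drives dune activity
        ])

        def step(state):
            return 1 / (1 + np.exp(-(W @ state)))   # sigmoid squashing

        state = np.array([0.9, 0.3, 0.2])           # initial activations
        for _ in range(30):                         # iterate until it settles
            state = step(state)
        print(dict(zip(concepts, state.round(2))))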

  2. Qualitative dynamics semantics for SBGN process description.

    PubMed

    Rougny, Adrien; Froidevaux, Christine; Calzone, Laurence; Paulevé, Loïc

    2016-06-16

    Qualitative dynamics semantics provide coarse-grained modeling of network dynamics by abstracting away kinetic parameters. They make it possible to capture general features of system dynamics, such as attractors or reachability properties, for which scalable analyses exist. The Systems Biology Graphical Notation Process Description language (SBGN-PD) has become a standard for representing reaction networks. However, no qualitative dynamics semantics taking into account all the main features available in SBGN-PD had been proposed so far. We propose two qualitative dynamics semantics for SBGN-PD reaction networks, namely the general semantics and the stories semantics, which we formalize using asynchronous automata networks. While the general semantics extends the standard Boolean semantics of reaction networks by taking into account all the main features of SBGN-PD, the stories semantics makes it possible to model several molecules of a network with a single variable. The resulting qualitative models can be checked against dynamical properties and thereby validated with respect to biological knowledge. We apply our framework to reason on the qualitative dynamics of a large network (more than 200 nodes) modeling the regulation of the cell cycle by RB/E2F. The proposed semantics provide a direct formalization of SBGN-PD networks as qualitative dynamical models that can be further analyzed using standard tools for discrete models. The dynamics in the stories semantics have a lower dimension than in the general semantics and prune multiple behaviors (which can be considered spurious) by enforcing mutual exclusiveness between the activity of different nodes of the same story. Overall, the qualitative semantics for SBGN-PD make it possible to capture efficiently important dynamical features of reaction network models, which can be exploited to further refine them.
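
    The asynchronous-automata idea can be sketched in a few lines: from each state, any single variable may update, giving a nondeterministic transition relation whose reachable states can then be enumerated. The three-node network below is invented for illustration, not the RB/E2F model:

        # Asynchronous Boolean network: enumerate states reachable from s0.
        rules = {                       # next value of each variable
            "A": lambda s: not s["C"],
            "B": lambda s: s["A"],
            "C": lambda s: s["A"] and s["B"],
        }

        def successors(s):
            for v, f in rules.items():
                nv = f(s)
                if nv != s[v]:
                    yield {**s, v: nv}  # asynchronous: one variable updates at a time

        def reachable(s0):
            seen, stack = set(), [tuple(sorted(s0.items()))]
            while stack:
                st = stack.pop()
                if st in seen:
                    continue
                seen.add(st)
                stack.extend(tuple(sorted(n.items())) for n in successors(dict(st)))
            return seen

        print(len(reachable({"A": True, "B": False, "C": False})))  # reachable states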

  3. Auditory and visual cortex of primates: a comparison of two sensory systems

    PubMed Central

    Rauschecker, Josef P.

    2014-01-01

    A comparative view of the brain, comparing related functions across species and sensory systems, offers a number of advantages. In particular, it allows separating the formal purpose of a model structure from its implementation in specific brains. Models of auditory cortical processing can be conceived by analogy to the visual cortex, incorporating neural mechanisms that are found in both the visual and auditory systems. Examples of such canonical features on the columnar level are direction selectivity, size/bandwidth selectivity, as well as receptive fields with segregated versus overlapping on- and off-sub-regions. On a larger scale, parallel processing pathways have been envisioned that represent the two main facets of sensory perception: 1) identification of objects and 2) processing of space. Expanding this model in terms of sensorimotor integration and control offers an overarching view of cortical function independent of sensory modality. PMID:25728177

  4. Non-Formal Education in Poland and Canada--Compared: A Brief Commentary

    ERIC Educational Resources Information Center

    Butler, Norman L.; Griffith, Kimberly Grantham; Kritsonis, William Allan

    2007-01-01

    The purpose of this brief note is to compare non-formal education in Poland and Canada in terms of accessibility, and it is motivated by the fact that learning is a lifelong process because of rapid advances in technology. The theoretical framework for this commentary is supplied by the general idea that non-formal learning provides a social…

  5. An Investigation of the Mathematical Models of Piaget's Psychological Theory of Cognitive Learning. Final Report.

    ERIC Educational Resources Information Center

    Kalechofsky, Robert

    This research paper proposes several mathematical models which help clarify Piaget's theory of cognition on the concrete and formal operational stages. Some modified lattice models were used for the concrete stage and a combined Boolean Algebra and group theory model was used for the formal stage. The researcher used experiments cited in the…

  6. Uncertainty and inference in the world of paleoecological data

    NASA Astrophysics Data System (ADS)

    McLachlan, J. S.; Dawson, A.; Dietze, M.; Finley, M.; Hooten, M.; Itter, M.; Jackson, S. T.; Marlon, J. R.; Raiho, A.; Tipton, J.; Williams, J.

    2017-12-01

    Proxy data in paleoecology and paleoclimatology share a common set of biases and uncertainties: spatiotemporal error associated with the taphonomic processes of deposition, preservation, and dating; calibration error between proxy data and the ecosystem states of interest; and error in the interpolation of calibrated estimates across space and time. Researchers often account for this daunting suite of challenges by applying qualitative expert judgment: inferring the past states of ecosystems and assessing the level of uncertainty in those states subjectively. The effectiveness of this approach can be seen by the extent to which future observations confirm previous assertions. Hierarchical Bayesian (HB) statistical approaches offer an alternative way of accounting for multiple uncertainties in paleo data. HB estimates of ecosystem state formally account for each of the common uncertainties listed above. HB approaches can readily incorporate additional data, and data of different types, into estimates of ecosystem state. And HB estimates of ecosystem state, with associated uncertainty, can be used to constrain forecasts of ecosystem dynamics based on mechanistic ecosystem models using data assimilation. Decisions about how to structure an HB model are also subjective, which creates a parallel framework for deciding how to interpret data from the deep past. Our group, the Paleoecological Observatory Network (PalEON), has applied hierarchical Bayesian statistics to formally account for uncertainties in proxy-based estimates of past climate, fire, primary productivity, biomass, and vegetation composition. Our estimates often reveal new patterns of past ecosystem change, which is an unambiguously good thing, but we also often estimate a level of uncertainty that is uncomfortably high for many researchers. High levels of uncertainty are due to several features of the HB approach: spatiotemporal smoothing, the formal aggregation of multiple types of uncertainty, and a coarseness in statistical models of taphonomic process. Each of these features provides useful opportunities for statisticians and data-generating researchers to assess what we know about the signal and the noise in paleo data and to improve inference about past changes in ecosystem state.
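
    At its core, the HB approach combines the uncertainty of each level formally. A deliberately tiny, one-level sketch (a normal-normal conjugate update; all numbers invented) shows how prior and proxy-calibration uncertainty jointly shape the posterior for a latent ecosystem state; the PalEON models stack several such levels:

        # One level of a hierarchical model: latent biomass observed through
        # noisy, calibrated proxy estimates.
        mu0, tau0 = 20.0, 8.0   # prior mean / sd of biomass (Mg/ha, assumed)
        sigma = 5.0             # sd of the proxy-calibration error (assumed)
        y = [28.0, 31.0, 24.0]  # calibrated proxy estimates

        n, ybar = len(y), sum(y) / len(y)
        post_var = 1 / (1 / tau0**2 + n / sigma**2)        # precisions add
        post_mean = post_var * (mu0 / tau0**2 + n * ybar / sigma**2)
        print(f"posterior biomass: {post_mean:.1f} +/- {post_var**0.5:.1f}")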

  7. A Flush Toilet Model for the Transistor

    NASA Astrophysics Data System (ADS)

    Organtini, Giovanni

    2012-04-01

    In introductory physics textbooks, the working principles of diodes are usually described in a relatively simple manner. In our experience, they are well understood by students. Even when no formal derivation of the physics laws governing the current flow through a diode is given, the use of this device as a check valve is easily accepted. This is not true for transistors. In most textbooks the behavior of a transistor is given without formal explanation. When the amplification is computed, for some reason, students have difficulties in identifying the basic physical mechanisms that give rise to such an effect. In this paper we give a simple and captivating illustration of the working principles of a transistor as an amplifier, tailored to high school students with almost no background in electronics or modern physics. We assume that the target audience is familiar with the idea that a diode works as a check valve for currents. The lecture emphasis is on the illustration of physics principles governing the behavior of a transistor, rather than on a formal description of the processes leading to amplification.

  8. A hierarchical competing systems model of the emergence and early development of executive function

    PubMed Central

    Marcovitch, Stuart; Zelazo, Philip David

    2010-01-01

    The hierarchical competing systems model (HCSM) provides a framework for understanding the emergence and early development of executive function – the cognitive processes underlying the conscious control of behavior – in the context of search for hidden objects. According to this model, behavior is determined by the joint influence of a developmentally invariant habit system and a conscious representational system that becomes increasingly influential as children develop. This article describes a computational formalization of the HCSM, reviews behavioral and computational research consistent with the model, and suggests directions for future research on the development of executive function. PMID:19120405
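
    The competition at the heart of the model can be sketched on the classic A-not-B search task: the response is a weighted mix of a habit trace built up by past reaches and a conscious representation of the object's current location, with the representation's weight w growing over development. Parameter values below are illustrative, not from the article:

        # Habit system vs. representational system competing for one response.
        def choose(habit_A, represents_B, w):
            score_A = (1 - w) * habit_A                    # habit trace toward A
            score_B = w * (1.0 if represents_B else 0.0)   # conscious representation
            return "B (correct)" if score_B > score_A else "A (perseverative error)"

        for w in (0.2, 0.6):  # younger vs. older child (assumed weights)
            print(f"w={w}:", choose(habit_A=0.8, represents_B=True, w=w))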

  9. An advanced environment for hybrid modeling of biological systems based on modelica.

    PubMed

    Pross, Sabrina; Bachmann, Bernhard

    2011-01-20

    Biological systems are often very complex so that an appropriate formalism is needed for modeling their behavior. Hybrid Petri Nets, consisting of time-discrete Petri Net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri Net library was implemented based on the object-oriented modeling language Modelica which allows the modeling of discrete, stochastic and continuous Petri Net elements by differential, algebraic and discrete equations. An appropriate Modelica-tool performs the hybrid simulation with discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica-models can be connected to Simulink-models for parameter optimization, sensitivity analysis and stochastic simulation in Matlab. The present paper illustrates the implementation of the Petri Net component models, their usage within the modeling process and the coupling between the Modelica-tool Dymola and Matlab/Simulink. The application is demonstrated by modeling the metabolism of Chinese Hamster Ovary Cells.

  10. Development of Semantic Description for Multiscale Models of Thermo-Mechanical Treatment of Metal Alloys

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Regulski, Krzysztof

    2016-08-01

    We present a process of semantic meta-model development for data management in an adaptable multiscale modeling framework. The main problems in ontology design are discussed, and a solution achieved as a result of the research is presented. The main concepts concerning the application and data management background for multiscale modeling were derived from the AM3 approach—object-oriented Agile multiscale modeling methodology. The ontological description of multiscale models enables validation of semantic correctness of data interchange between submodels. We also present a possibility of using the ontological model as a supervisor in conjunction with a multiscale model controller and a knowledge base system. Multiscale modeling formal ontology (MMFO), designed for describing multiscale models' data and structures, is presented. A need for applying meta-ontology in the MMFO development process is discussed. Examples of MMFO application in describing thermo-mechanical treatment of metal alloys are discussed. Present and future applications of MMFO are described.

  11. A study of the linear free energy model for DNA structures using the generalized Hamiltonian formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yavari, M., E-mail: yavari@iaukashan.ac.ir

    2016-06-15

    We generalize the results of Nesterenko [13, 14] and Gogilidze and Surovtsev [15] for DNA structures. Using the generalized Hamiltonian formalism, we investigate solutions of the equilibrium shape equations for the linear free energy model.

  12. Modelling and simulating reaction-diffusion systems using coloured Petri nets.

    PubMed

    Liu, Fei; Blätke, Mary-Ann; Heiner, Monika; Yang, Ming

    2014-10-01

    Reaction-diffusion systems often play an important role in systems biology when developmental processes are involved. Traditional methods of modelling and simulating such systems require substantial prior knowledge of mathematics and/or simulation algorithms. Such skills may pose a challenge for biologists when they are not equally well-trained in mathematics and computer science. Coloured Petri nets, as a high-level graphical language, offer an attractive and easily approachable alternative. In this paper, we investigate a coloured Petri net framework integrating deterministic, stochastic and hybrid modelling formalisms and corresponding simulation algorithms for the modelling and simulation of reaction-diffusion processes that may be closely coupled with signalling pathways, metabolic reactions and/or gene expression. Such systems often manifest multiscaleness in time, space and/or concentration. We introduce our approach by means of some basic diffusion scenarios, and test it against an established case study, the Brusselator model. Copyright © 2014 Elsevier Ltd. All rights reserved.
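
    The paper's test case, the Brusselator (dX/dt = A + X^2*Y - (B+1)*X, dY/dt = B*X - X^2*Y), can be rendered as a plain finite-difference computation over a row of compartments, the structure a coloured Petri net encodes with one colour per compartment. Parameters below are conventional textbook values, not taken from the paper:

        # Explicit-Euler Brusselator on a 1-D ring of compartments.
        import numpy as np

        np.random.seed(0)
        A, B, Dx, Dy, dt, n = 1.0, 3.0, 0.1, 0.05, 0.01, 50
        X = np.ones(n) + 0.1 * np.random.rand(n)   # small perturbation of steady state
        Y = np.full(n, B / A)

        def laplacian(u):
            return np.roll(u, 1) - 2 * u + np.roll(u, -1)   # periodic boundary

        for _ in range(20000):
            dX = A + X**2 * Y - (B + 1) * X + Dx * laplacian(X)
            dY = B * X - X**2 * Y + Dy * laplacian(Y)
            X, Y = X + dt * dX, Y + dt * dY

        print("X range after transient:", X.min().round(2), X.max().round(2))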

  13. Development of the International Classification of Functioning, Disability and Health core sets for hand conditions--results of the World Health Organization International Consensus process.

    PubMed

    Rudolf, Klaus-Dieter; Kus, Sandra; Chung, Kevin C; Johnston, Marie; LeBlanc, Monique; Cieza, Alarcos

    2012-01-01

    A formal decision-making and consensus process was applied to develop the first version of the International Classification of Functioning, Disability and Health (ICF) Core Sets for Hand Conditions. In preparation for an international panel to develop the ICF Core Sets for Hand Conditions (HC), preparatory studies were conducted, which included an expert survey, a systematic literature review, a qualitative study, and an empirical data collection process involving persons with hand conditions. A consensus conference, attended by 23 healthcare professionals who treat hand conditions and representing 22 countries, was convened in Switzerland in May 2009. The preparatory studies identified a set of 743 ICF categories at the second, third or fourth hierarchical level. Altogether, 117 chapter-, second-, or third-level categories were included in the comprehensive ICF Core Set for HC. The brief ICF Core Set for HC included a total of 23 chapter- and second-level categories. A formal consensus process integrating evidence and expert opinion based on the ICF led to the formal adoption of the ICF Core Sets for Hand Conditions. The next phase of this ICF project is to conduct a formal validation process to establish its applicability in clinical settings.

  14. What can formal methods offer to digital flight control systems design

    NASA Technical Reports Server (NTRS)

    Good, Donald I.

    1990-01-01

    Formal methods research is beginning to produce methods which will enable mathematical modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of extensive mathematical modeling which are common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and software design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing the risk of unsafe flight control.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macintosh, Andrew, E-mail: andrew.macintosh@anu.edu.au; Waugh, Lauren

    Concerns about the effectiveness of environmental impact assessment (EIA) have prompted proposals to improve its performance by limiting the discretion of decision-makers in screening. To investigate whether such proposals are likely to generate the desired results, we conducted an evaluation of the screening process under the Australian government's EIA regime from its introduction on 16 July 2000 to 30 June 2013 (study period). Almost 1 in 5 'particular manner' decisions—a type of screening decision under the regime—were found to be unlawful. The extent of non-compliance is explained on the basis of convenience. The department was required to assess a large number of projects under tight timeframes and with limited resources, while being pressured by proponents to allow their projects to bypass EIA. These pressures resulted in the development of an informal custom whereby the formal compensatory mitigation restrictions were frequently ignored. The results highlight the relative significance of formal and informal institutions in EIA. Formal EIA rules typically provide a mere outline of the process. The informal institutions adopted by administrators often have a greater influence on how the process operates and what it achieves. - Highlights: • Concerns about the effectiveness of environmental impact assessment (EIA) have prompted proposals to improve its performance by limiting the discretion of decision-makers in screening. • To investigate whether such proposals are likely to generate the desired results, we conducted an evaluation of the Australian government's screening process, looking at the extent of compliance with a formal prohibition on the consideration of compensatory mitigation. • Almost 1 in 5 'particular manner' decisions – a type of screening decision under the regime – were found to be unlawful (with a 95% confidence interval of between 1:4 and 1:7) because of a failure to abide by the compensatory mitigation restrictions. For urban development actions, the ratio between lawful and unlawful particular manner decisions was worse than 1:1. • The extent of non-compliance is explained on the basis of convenience. Budget and interest group pressures resulted in the development of an informal custom whereby the formal compensatory mitigation restrictions were ignored. • The results highlight the relative significance of formal and informal institutions in EIA. Formal EIA rules typically provide a mere outline of the process. The informal institutions adopted by administrators often have a greater influence on how the process operates and what it achieves.

  16. Stepwise construction of a metabolic network in Event-B: The heat shock response.

    PubMed

    Sanwal, Usman; Petre, Luigia; Petre, Ion

    2017-12-01

    There is high interest in constructing large, detailed computational models for biological processes. This is often done by putting together existing submodels and adding extra details/knowledge to them. The result of such approaches is usually a model that can only answer questions on a very specific level of detail, and thus, ultimately, is of limited use. We focus instead on an approach to systematically add details to a model, with formal verification of its consistency at each step. In this way, one obtains a set of reusable models, at different levels of abstraction, to be used for different purposes depending on the question to address. We demonstrate this approach using Event-B, a computational framework introduced to develop formal specifications of distributed software systems. We first describe how to model generic metabolic networks in Event-B. Then, we apply this method to modeling the biological heat shock response in eukaryotic cells, using Event-B refinement techniques. The advantage of using Event-B is that refinement is an intrinsic feature; the final result is not only a correct model, but a chain of models automatically linked by refinement, each of which is provably correct and reusable. This proof-of-concept shows that refinement in Event-B is suitable for biomodeling and can serve to master biological complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Photoelectron angular distributions for states of any mixed character: An experiment-friendly model for atomic, molecular, and cluster anions

    NASA Astrophysics Data System (ADS)

    Khuseynov, Dmitry; Blackstone, Christopher C.; Culberson, Lori M.; Sanov, Andrei

    2014-09-01

    We present a model for laboratory-frame photoelectron angular distributions in direct photodetachment from (in principle) any molecular orbital using linearly polarized light. A transparent mathematical approach is used to generalize the Cooper-Zare central-potential model to anionic states of any mixed character. In the limit of atomic-anion photodetachment, the model reproduces the Cooper-Zare formula. In the case of an initial orbital described as a superposition of s and p-type functions, the model yields the previously obtained s-p mixing formula. The formalism is further advanced using the Hanstorp approximation, whereas the relative scaling of the partial-wave cross-sections is assumed to follow the Wigner threshold law. The resulting model describes the energy dependence of photoelectron anisotropy for any atomic, molecular, or cluster anions, usually without requiring a direct calculation of the transition dipole matrix elements. As a benchmark case, we apply the p-d variant of the model to the experimental results for NO- photodetachment and show that the observed anisotropy trend is described well using physically meaningful values of the model parameters. Overall, the presented formalism delivers insight into the photodetachment process and affords a new quantitative strategy for analyzing the photoelectron angular distributions and characterizing mixed-character molecular orbitals using photoelectron imaging spectroscopy of negative ions.
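    For reference, the Cooper-Zare central-potential result that the model reproduces in the atomic limit is commonly written as follows (quoted from the general photodetachment literature rather than from the paper; \chi_{l,l\pm1} denote the radial dipole matrix elements into the two allowed partial waves and \delta the corresponding phase shifts):

      \beta(\varepsilon) = \frac{l(l-1)\,\chi_{l,l-1}^{2} + (l+1)(l+2)\,\chi_{l,l+1}^{2}
                                 - 6\,l(l+1)\,\chi_{l,l-1}\,\chi_{l,l+1}\cos(\delta_{l+1}-\delta_{l-1})}
                                {(2l+1)\left[\,l\,\chi_{l,l-1}^{2} + (l+1)\,\chi_{l,l+1}^{2}\,\right]}

    In the Hanstorp approximation, the ratio of the two channels' matrix elements is taken to grow linearly with the electron kinetic energy, \chi_{l,l+1}/\chi_{l,l-1} = A_l\,\varepsilon, which together with the Wigner threshold scaling of the partial cross-sections fixes the energy dependence of \beta up to fit parameters.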

  18. Photoelectron angular distributions for states of any mixed character: an experiment-friendly model for atomic, molecular, and cluster anions.

    PubMed

    Khuseynov, Dmitry; Blackstone, Christopher C; Culberson, Lori M; Sanov, Andrei

    2014-09-28

    We present a model for laboratory-frame photoelectron angular distributions in direct photodetachment from (in principle) any molecular orbital using linearly polarized light. A transparent mathematical approach is used to generalize the Cooper-Zare central-potential model to anionic states of any mixed character. In the limit of atomic-anion photodetachment, the model reproduces the Cooper-Zare formula. In the case of an initial orbital described as a superposition of s and p-type functions, the model yields the previously obtained s-p mixing formula. The formalism is further advanced using the Hanstorp approximation, whereas the relative scaling of the partial-wave cross-sections is assumed to follow the Wigner threshold law. The resulting model describes the energy dependence of photoelectron anisotropy for any atomic, molecular, or cluster anions, usually without requiring a direct calculation of the transition dipole matrix elements. As a benchmark case, we apply the p-d variant of the model to the experimental results for NO(-) photodetachment and show that the observed anisotropy trend is described well using physically meaningful values of the model parameters. Overall, the presented formalism delivers insight into the photodetachment process and affords a new quantitative strategy for analyzing the photoelectron angular distributions and characterizing mixed-character molecular orbitals using photoelectron imaging spectroscopy of negative ions.

  19. School Administrators' Beliefs that School Improvements Were Due to Formal School Registration Guttman Scales and their Inter-Correlations

    ERIC Educational Resources Information Center

    Witten, Harm; Waugh, Russell; Gray, Jan

    2012-01-01

    This paper presents an investigation into the attitudes of School Administrators to the relationship between formal school registration and school improvement. It concerns a mandatory inspection-type registration process for all Non-Government Schools in Western Australia. Part of the aim of this registration process was to help schools improve…

  20. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested, almost a decade ago, that they are not suitable for safety-critical systems; we present our experiences as a case study for renewing the discussion. PMID:21799545

  1. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested, almost a decade ago, that they are not suitable for safety-critical systems; we present our experiences as a case study for renewing the discussion.

  2. Toward a formal verification of a floating-point coprocessor and its composition with a central processing unit

    NASA Technical Reports Server (NTRS)

    Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.

    1991-01-01

    Discussed here is work to formally specify and verify a floating point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit used to communicate with the CPU, and the arithmetic processing unit used to perform the actual calculation. Reasoning about the interaction and synchronization among processes using higher order logic is demonstrated.

  3. The Archival Photograph and Its Meaning: Formalisms for Modeling Images

    ERIC Educational Resources Information Center

    Benson, Allen C.

    2009-01-01

    This article explores ontological principles and their potential applications in the formal description of archival photographs. Current archival descriptive practices are reviewed and the larger question is addressed: do archivists who are engaged in describing photographs need a more formalized system of representation, or do existing encoding…

  4. Kuang's Semi-Classical Formalism for Calculating Electron Capture Cross Sections: A Space-Physics Application

    NASA Technical Reports Server (NTRS)

    Barghouty, A. F.

    2014-01-01

    Accurate estimates of electron capture cross sections at energies relevant to the modeling of the transport, acceleration, and interaction of energetic neutral atoms (ENA) in space (approximately a few MeV per nucleon), and especially for multi-electron ions, must rely on a detailed, but computationally expensive, quantum-mechanical description of the collision process. Kuang's semi-classical approach is an elegant and efficient way to arrive at these estimates. Motivated by ENA modeling efforts for space applications, we shall briefly present this approach along with sample applications and report on current progress.

  5. Thermodynamically Feasible Kinetic Models of Reaction Networks

    PubMed Central

    Ederer, Michael; Gilles, Ernst Dieter

    2007-01-01

    The dynamics of biological reaction networks are strongly constrained by thermodynamics. A holistic understanding of their behavior and regulation requires mathematical models that observe these constraints. However, kinetic models may easily violate the constraints imposed by the principle of detailed balance, if no special care is taken. Detailed balance demands that in thermodynamic equilibrium all fluxes vanish. We introduce a thermodynamic-kinetic modeling (TKM) formalism that adapts the concepts of potentials and forces from irreversible thermodynamics to kinetic modeling. In the proposed formalism, the thermokinetic potential of a compound is proportional to its concentration. The proportionality factor is a compound-specific parameter called capacity. The thermokinetic force of a reaction is a function of the potentials. Every reaction has a resistance that is the ratio of thermokinetic force and reaction rate. For mass-action type kinetics, the resistances are constant. Since it relies on the thermodynamic concept of potentials and forces, the TKM formalism structurally observes detailed balance for all values of capacities and resistances. Thus, it provides an easy way to formulate physically feasible kinetic models of biological reaction networks. The TKM formalism is useful for modeling large biological networks that are subject to many detailed balance relations. PMID:17208985
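    Read literally, the abstract suggests the following minimal set of defining relations (a sketch in our own symbols; consult the paper for the precise definitions):

      \xi_i = c_i / C_i
          % thermokinetic potential of compound i, with capacity C_i
      F_j = \prod_{i \in \mathrm{educts}(j)} \xi_i \;-\; \prod_{i \in \mathrm{products}(j)} \xi_i
          % thermokinetic force of reaction j
      v_j = F_j / R_j
          % reaction rate; a constant resistance R_j corresponds to mass-action kinetics

    Because each rate vanishes exactly when its force does, every flux is zero once the potentials balance, so detailed balance holds structurally for any positive choice of the capacities and resistances.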

  6. Software Assurance: Five Essential Considerations for Acquisition Officials

    DTIC Science & Technology

    2007-05-01

    • Address security concerns in the software development life cycle (SDLC)? • Are there formal software quality... What threat modeling process, if any, is used when designing the software? What analysis, design, and construction tools are used by your software design... commercial off-the-shelf (COTS), government off-the-shelf (GOTS), open-source, embedded, and legacy software. Attackers exploit unintentional vulnerabilities or...

  7. The Cognitive Processes Underlying Event-Based Prospective Memory in School-Age Children and Young Adults: A Formal Model-Based Study

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.; Martin, Claudia

    2010-01-01

    Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the…

  8. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
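    As a concrete illustration of the decision-table format (the conditions and actions below are hypothetical, not taken from the report): each combination of condition outcomes maps to one action, and properties such as completeness and disjointness can be checked mechanically, which is the kind of analysis a tool like Tablewise automates.

      # A toy decision table: condition tuples -> actions (all names hypothetical).
      from itertools import product

      table = {
          # (altitude_low, gear_down): action
          (True,  True):  "continue_approach",
          (True,  False): "warn_gear_up",
          (False, True):  "retract_gear",
          (False, False): "cruise",
      }

      def decide(altitude_low, gear_down):
          return table[(altitude_low, gear_down)]

      # Completeness check: every combination of condition outcomes has a rule.
      assert set(table) == set(product([True, False], repeat=2))
      assert decide(True, False) == "warn_gear_up"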

  9. On the Adequacy of Current Empirical Evaluations of Formal Models of Categorization

    ERIC Educational Resources Information Center

    Wills, Andy J.; Pothos, Emmanuel M.

    2012-01-01

    Categorization is one of the fundamental building blocks of cognition, and the study of categorization is notable for the extent to which formal modeling has been a central and influential component of research. However, the field has seen a proliferation of noncomplementary models with little consensus on the relative adequacy of these accounts.…

  10. Reasoning with Conditionals: A Test of Formal Models of Four Theories

    ERIC Educational Resources Information Center

    Oberauer, Klaus

    2006-01-01

    The four dominant theories of reasoning from conditionals are translated into formal models: The theory of mental models (Johnson-Laird, P. N., & Byrne, R. M. J. (2002). Conditionals: a theory of meaning, pragmatics, and inference. "Psychological Review," 109, 646-678), the suppositional theory (Evans, J. S. B. T., & Over, D. E. (2004). "If."…

  11. Linguistics from the Perspective of the Theory of Models in Empirical Sciences: From Formal to Corpus Linguistics

    ERIC Educational Resources Information Center

    Grabinska, Teresa; Zielinska, Dorota

    2010-01-01

    The authors examine language from the perspective of models of empirical sciences, which discipline studies the relationship between reality, models, and formalisms. Such a perspective allows one to notice that linguistics approached within the classical framework share a number of problems with other experimental sciences studied initially…

  12. Numerical and experimental study of electron-beam coatings with modifying particles FeB and FeTi

    NASA Astrophysics Data System (ADS)

    Kryukova, Olga; Kolesnikova, Kseniya; Gal'chenko, Nina

    2016-07-01

    Wear-resistant composite coatings based on titanium borides were studied experimentally; the coatings were synthesized in the process of electron-beam welding using thermo-reacting powders composed of a boron-containing mixture. A model of the electron-beam coating process with modifying particles of boron and titanium, based on physical-chemical transformations, is proposed. The dissolution process is described using a formal kinetic approach. The numerical solution yields the phase and chemical composition of the coating under nonequilibrium conditions, one of the important characteristics of a coating formed during electron-beam processing. Qualitative agreement between numerical calculations and experimental data is shown.
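    A formal kinetic treatment of particle dissolution typically reduces to a rate law for the converted fraction with Arrhenius temperature dependence. The toy sketch below integrates one such law; the rate-law form and all parameter values are assumed for illustration and are not the authors' model.

      # d(alpha)/dt = k0 * exp(-Ea/(R*T)) * (1 - alpha), integrated by explicit Euler.
      import numpy as np

      R_GAS = 8.314      # J/(mol K), universal gas constant
      k0 = 1.0e6         # 1/s, pre-exponential factor (assumed)
      Ea = 1.2e5         # J/mol, activation energy (assumed)

      def dissolved_fraction(T, t_end, dt=1e-3):
          """Fraction of modifying particles dissolved after t_end seconds at T kelvin."""
          alpha, t = 0.0, 0.0
          k = k0 * np.exp(-Ea / (R_GAS * T))
          while t < t_end:
              alpha += dt * k * (1.0 - alpha)
              t += dt
          return alpha

      print(dissolved_fraction(T=1800.0, t_end=0.5))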

  13. Model Checking Techniques for Assessing Functional Form Specifications in Censored Linear Regression Models.

    PubMed

    León, Larry F; Cai, Tianxi

    2012-04-01

    In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well-known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
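    The flavor of the procedure can be conveyed with a toy, uncensored version: form the cumulative sum of residuals ordered by the covariate, then compare its supremum with realizations of a zero-mean process generated by a multiplier bootstrap. The sketch below is a deliberately simplified illustration, not the authors' estimator; in particular it omits the censoring and the Kaplan-Meier integration that the paper's construction requires.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      x = rng.uniform(0, 1, n)
      y = 2.0 + 3.0 * x + rng.normal(0, 1, n)        # data truly linear in x
      X = np.column_stack([np.ones(n), x])
      beta = np.linalg.lstsq(X, y, rcond=None)[0]
      r = y - X @ beta                                # residuals

      order = np.argsort(x)
      W_obs = np.cumsum(r[order]) / np.sqrt(n)        # observed cumulative-residual process

      # Multiplier bootstrap: standard-normal weights generate realizations of the
      # zero-mean Gaussian limit under a correctly specified functional form.
      sup_null = [np.max(np.abs(np.cumsum((rng.normal(0, 1, n) * r)[order]) / np.sqrt(n)))
                  for _ in range(1000)]
      p_value = np.mean(np.array(sup_null) >= np.max(np.abs(W_obs)))
      print(f"sup|W| = {np.max(np.abs(W_obs)):.3f}, p = {p_value:.3f}")

    A large observed supremum relative to the simulated null suprema signals misspecification of the functional form rather than natural variation.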

  14. El Dialogo en la Educacion No Formal: Su Aporte al Desarrollo Comunitario (Dialogue in Non-Formal Education: Its Contribution to Community Development).

    ERIC Educational Resources Information Center

    Linke, Hildegard

    2000-01-01

    Seeks to justify the educational practice of dialogue and its contributions to community development. Contends that the central element and cause of dialogue is constituted within the non-formal educational process. States that Paulo Freire emphasizes the role of the word and its basis as a creative synthesis of theory and practice. (BT)

  15. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
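    A drastically simplified sketch of the idea: if every scenario is a sentence in a restricted grammar, each sentence can be transformed mechanically into one transition of a finite-state model, and the model then drives code generation. The grammar, names, and states below are hypothetical, and the sketch omits the equivalence proofs that are the substance of the authors' method.

      # Scenario sentences in a restricted grammar: "in STATE on EVENT go to STATE".
      scenarios = [
          "in Idle on start go to Monitoring",
          "in Monitoring on fault go to SafeMode",
          "in Monitoring on stop go to Idle",
      ]

      transitions = {}
      for line in scenarios:
          words = line.split()
          src, event, dst = words[1], words[3], words[6]
          transitions[(src, event)] = dst

      def run(state, events):
          """Execute the derived finite-state model on a sequence of events."""
          for e in events:
              state = transitions.get((state, e), state)  # undefined events are ignored
          return state

      assert run("Idle", ["start", "fault"]) == "SafeMode"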

  16. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  17. Electromagnetic processes in nucleus-nucleus collisions relating to space radiation research

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    1992-01-01

    Most of the papers within this report deal with electromagnetic processes in nucleus-nucleus collisions which are of concern in the space radiation program. In particular, the removal of one and two nucleons via both electromagnetic and strong interaction processes has been extensively investigated. The theory of relativistic Coulomb fission has also been developed. Several papers on quark models also appear. Finally, note that the theoretical methods developed in this work have been directly applied to the task of radiation protection of astronauts. This has been done by parameterizing the theoretical formalism in such a fashion that it can be used in cosmic ray transport codes.

  18. The UK Haemophilia Doctors Organisation triennial audit of UK Comprehensive Care Haemophilia Centres.

    PubMed

    Wilde, J T

    2012-07-01

    Under the auspices of the United Kingdom Haemophilia Doctors Organisation (UKHCDO) the UK Comprehensive Care Haemophilia Centres (CCCs) have undergone a three-yearly formal audit assessment since 1993. This report describes the evolution of the audit process and details the findings of the most recent audit round, the sixth since inception. The audit reports from the 2009 audit round were reviewed by the audit organizing group and a structured analysis of the data was compiled. CCCs in the UK offer a high standard of comprehensive care services. The main areas of concern were the state of the premises (seven centres), lack of dental services (seven centres), physiotherapy (seven centres) and social work support (11 centres). Major concerns were identified at eight centres requiring a formal letter from the chairman of UKHCDO to the chief executive of the host trust. Since inception of the triennial audit process centre report recommendations have resulted in major improvements in the services available at UK CCCs. The audit process is considered to be a highly effective means of improving the quality of care for patients with bleeding disorders and can be used as a model for the introduction of a similar process in other countries. © 2012 Blackwell Publishing Ltd.

  19. Toward a complete theory for predicting inclusive deuteron breakup away from stability

    NASA Astrophysics Data System (ADS)

    Potel, G.; Perdikakis, G.; Carlson, B. V.; Atkinson, M. C.; Dickhoff, W. H.; Escher, J. E.; Hussein, M. S.; Lei, J.; Li, W.; Macchiavelli, A. O.; Moro, A. M.; Nunes, F. M.; Pain, S. D.; Rotureau, J.

    2017-09-01

    We present an account of the current status of the theoretical treatment of inclusive (d,p) reactions in the breakup-fusion formalism, pointing to some applications and making the connection with current experimental capabilities. Three independent implementations of the reaction formalism have been recently developed, making use of different numerical strategies. The codes also originally relied on two different but equivalent representations, namely the prior (Udagawa-Tamura, UT) and the post (Ichimura-Austern-Vincent, IAV) representations. The different implementations have been benchmarked for the first time, and then applied to the Ca isotopic chain. The neutron-Ca propagator is described in the Dispersive Optical Model (DOM) framework, and the interplay between elastic breakup (EB) and non-elastic breakup (NEB) is studied for three Ca isotopes at two different bombarding energies. The accuracy of the description of different reaction observables is assessed by comparing with experimental data of (d,p) on 40,48Ca. We discuss the predictions of the model for the extreme case of an isotope (60Ca) currently unavailable experimentally, though possibly available in future facilities (nominally within production reach at FRIB). We explore the use of (d,p) reactions as surrogates for (n,γ) processes, by using the formalism to describe the compound nucleus formation in a (d,pγ) reaction as a function of excitation energy, spin, and parity. The subsequent decay is then computed within a Hauser-Feshbach formalism. Comparisons between the (d,pγ) and (n,γ) induced gamma decay spectra are discussed to inform efforts to infer neutron captures from (d,pγ) reactions. Finally, we identify areas of opportunity for future developments, and discuss a possible path toward a predictive reaction theory.
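    For orientation, the non-elastic breakup (NEB) cross section in the post (IAV) representation is usually quoted in the closed form below; we state it as background from this literature, in our notation, rather than as a formula taken from the paper:

      \frac{d^{2}\sigma_{\mathrm{NEB}}}{dE_{p}\,d\Omega_{p}}
        = -\frac{2}{\hbar v_{d}}\,\rho_{p}(E_{p})\,
          \langle \psi_{n} | W_{nA} | \psi_{n} \rangle

    Here \rho_p(E_p) is a phase-space factor for the outgoing proton, W_{nA} is the imaginary part of the neutron-target optical potential (supplied in this work by the DOM), and \psi_n is the neutron channel wave function sourced by the deuteron breakup.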

  20. Formal Techniques for Synchronized Fault-Tolerant Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors' clocks.
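    The fault-masking core of such a design is simple to state even though its formal verification is not: each replicated processor's output passes through a majority voter, so a minority of faulty values is out-voted. A minimal sketch of the voting step (ours, not the RCP specification):

      # NMR-style majority voting over redundant processor outputs (illustrative).
      from collections import Counter

      def vote(replica_outputs):
          """Return the strict-majority value, masking a minority of faulty outputs."""
          value, count = Counter(replica_outputs).most_common(1)[0]
          if count * 2 <= len(replica_outputs):
              raise RuntimeError("no strict majority: too many faults")
          return value

      assert vote([42, 42, 7, 42]) == 42   # one transient fault is out-voted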
