Sample records for logical framework approach

  1. Building capacity for evidence generation, synthesis and implementation to improve the care of mothers and babies in South East Asia: methods and design of the SEA-ORCHID Project using a logical framework approach.

    PubMed

    McDonald, Steve; Turner, Tari; Chamberlain, Catherine; Lumbiganon, Pisake; Thinkhamrop, Jadsada; Festin, Mario R; Ho, Jacqueline J; Mohammad, Hakimi; Henderson-Smart, David J; Short, Jacki; Crowther, Caroline A; Martis, Ruth; Green, Sally

    2010-07-01

    Rates of maternal and perinatal mortality remain high in developing countries despite the existence of effective interventions. Efforts to strengthen evidence-based approaches to improve health in these settings are partly hindered by restricted access to the best available evidence, limited training in evidence-based practice and concerns about the relevance of existing evidence. South East Asia--Optimising Reproductive and Child Health in Developing Countries (SEA-ORCHID) was a five-year project that aimed to determine whether a multifaceted intervention designed to strengthen the capacity for research synthesis, evidence-based care and knowledge implementation improved clinical practice and led to better health outcomes for mothers and babies. This paper describes the development and design of the SEA-ORCHID intervention plan using a logical framework approach. SEA-ORCHID used a before-and-after design to evaluate the impact of a multifaceted tailored intervention at nine sites across Thailand, Malaysia, Philippines and Indonesia, supported by three centres in Australia. We used a logical framework approach to systematically prepare and summarise the project plan in a clear and logical way. The development and design of the SEA-ORCHID project was based around the three components of a logical framework (problem analysis, project plan and evaluation strategy). The SEA-ORCHID logical framework defined the project's goal and purpose (To improve the health of mothers and babies in South East Asia and To improve clinical practice in reproductive health in South East Asia), and outlined a series of project objectives and activities designed to achieve these. The logical framework also established outcome and process measures appropriate to each level of the project plan, and guided project work in each of the participating countries and hospitals. Development of a logical framework in the SEA-ORCHID project enabled a reasoned, logical approach to the project design that ensured the project activities would achieve the desired outcomes and that the evaluation plan would assess both the process and outcome of the project. The logical framework was also valuable over the course of the project to facilitate communication, assess progress and build a shared understanding of the project activities, purpose and goal.

  2. Building capacity for evidence generation, synthesis and implementation to improve the care of mothers and babies in South East Asia: methods and design of the SEA-ORCHID Project using a logical framework approach

    PubMed Central

    2010-01-01

    Background Rates of maternal and perinatal mortality remain high in developing countries despite the existence of effective interventions. Efforts to strengthen evidence-based approaches to improve health in these settings are partly hindered by restricted access to the best available evidence, limited training in evidence-based practice and concerns about the relevance of existing evidence. South East Asia - Optimising Reproductive and Child Health in Developing Countries (SEA-ORCHID) was a five-year project that aimed to determine whether a multifaceted intervention designed to strengthen the capacity for research synthesis, evidence-based care and knowledge implementation improved clinical practice and led to better health outcomes for mothers and babies. This paper describes the development and design of the SEA-ORCHID intervention plan using a logical framework approach. Methods SEA-ORCHID used a before-and-after design to evaluate the impact of a multifaceted tailored intervention at nine sites across Thailand, Malaysia, Philippines and Indonesia, supported by three centres in Australia. We used a logical framework approach to systematically prepare and summarise the project plan in a clear and logical way. The development and design of the SEA-ORCHID project was based around the three components of a logical framework (problem analysis, project plan and evaluation strategy). Results The SEA-ORCHID logical framework defined the project's goal and purpose (To improve the health of mothers and babies in South East Asia and To improve clinical practice in reproductive health in South East Asia), and outlined a series of project objectives and activities designed to achieve these. The logical framework also established outcome and process measures appropriate to each level of the project plan, and guided project work in each of the participating countries and hospitals. Conclusions Development of a logical framework in the SEA-ORCHID project enabled a reasoned, logical approach to the project design that ensured the project activities would achieve the desired outcomes and that the evaluation plan would assess both the process and outcome of the project. The logical framework was also valuable over the course of the project to facilitate communication, assess progress and build a shared understanding of the project activities, purpose and goal. PMID:20594325

  3. Discovering Knowledge from Noisy Databases Using Genetic Programming.

    ERIC Educational Resources Information Center

    Wong, Man Leung; Leung, Kwong Sak; Cheng, Jack C. Y.

    2000-01-01

    Presents a framework that combines Genetic Programming and Inductive Logic Programming, two approaches in data mining, to induce knowledge from noisy databases. The framework is based on a formalism of logic grammars and is implemented as a data mining system called LOGENPRO (Logic Grammar-based Genetic Programming System). (Contains 34…

  4. Implementing Eco-Logical 2014-2015 Annual Report

    DOT National Transportation Integrated Search

    2015-12-01

    The Eco-Logical approach offers an ecosystem-based framework for integrated infrastructure and natural resource planning, project development, and delivery. The 2014/2015 Implementing Eco-Logical Program Annual Report provides updates on the Federal ...

  5. 2013/2014 Eco-Logical program annual report

    DOT National Transportation Integrated Search

    2014-12-01

    The Eco-Logical approach offers an ecosystem-based framework for integrated infrastructure and natural resource planning, project development, and delivery. The 2013/2014 Eco-Logical Program Annual Report provides updates on the Federal Highway Admin...

  6. Weighted Description Logics Preference Formulas for Multiattribute Negotiation

    NASA Astrophysics Data System (ADS)

    Ragone, Azzurra; di Noia, Tommaso; Donini, Francesco M.; di Sciascio, Eugenio; Wellman, Michael P.

    We propose a framework to compute the utility of an agreement w.r.t. a preference set in a negotiation process. In particular, we refer to preferences expressed as weighted formulas in a decidable fragment of First-order Logic and agreements expressed as a formula. We ground our framework in Description Logics (DL) endowed with disjunction, to be compliant with Semantic Web technologies. A logic-based approach to preference representation allows, when a background knowledge base is exploited, the often unrealistic assumption of additive independence among attributes to be relaxed. We provide suitable definitions of the problem and present algorithms to compute utility in our setting. We also validate our approach through an experimental evaluation.
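
    The utility computation named here, summing the weights of the preference formulas an agreement satisfies, can be pictured with a toy propositional stand-in for the Description Logic machinery. In the sketch below, the negotiation issues, weights and tests are invented for illustration; it shows only the weighted-formula idea, not the authors' DL-based algorithms.

```python
# Hypothetical sketch: each preference is a formula with a weight, and the
# utility of an agreement is the sum of weights of the satisfied formulas.
# Plain Python predicates stand in for Description Logic concepts.

def utility(agreement, preferences):
    """agreement: dict issue -> value; preferences: list of (test, weight)."""
    return sum(w for test, w in preferences if test(agreement))

preferences = [
    (lambda a: a["warranty_years"] >= 2, 0.5),                     # long warranty
    (lambda a: a["price"] <= 1000, 0.3),                           # low price
    (lambda a: a["brand"] == "Acme" and a["price"] <= 1200, 0.2),  # interacting issues
]

agreement = {"warranty_years": 3, "price": 950, "brand": "Acme"}
print(utility(agreement, preferences))  # 1.0: all three weighted formulas hold
```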

  7. Mixing Categories and Modal Logics in the Quantum Setting

    NASA Astrophysics Data System (ADS)

    Cinà, Giovanni

    The study of the foundations of Quantum Mechanics, especially after the advent of Quantum Computation and Information, has benefited from the application of category-theoretic tools and modal logics to the analysis of Quantum processes: we witness a wealth of theoretical frameworks cast in either of the two languages. This paper explores the interplay of the two formalisms in the peculiar context of Quantum Theory. After a review of some influential abstract frameworks, we show how different modal logic frames can be extracted from the category of finite dimensional Hilbert spaces, connecting the Categorical Quantum Mechanics approach to some modal logics that have been proposed for Quantum Computing. We then apply a general version of the same technique to two other categorical frameworks, the `topos approach' of Doering and Isham and the sheaf-theoretic work on contextuality by Abramsky and Brandenburger, suggesting how some key features can be expressed with modal languages.

  8. Modular Knowledge Representation and Reasoning in the Semantic Web

    NASA Astrophysics Data System (ADS)

    Serafini, Luciano; Homola, Martin

    Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of the ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desired to perform reasoning by combination of multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics and Integrated Distributed Description Logics. We concentrate on expressivity and distinctive modeling features of each framework. We also discuss reasoning capabilities of each framework.

  9. Detection of epistatic effects with logic regression and a classical linear regression model.

    PubMed

    Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata

    2014-02-01

    To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes which cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of genotypes of several QTLs, Cockerham's approach is often unable to identify them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though a larger number of models has to be considered with the logic regression approach (requiring more stringent multiple testing correction), the efficient representation of higher-order logic interactions in logic regression models leads to a significant increase of power to detect such interactions as compared to Cockerham's approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and real data analysis.
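
    A rough sense of why Boolean combinations escape additive models: in the sketch below, a hypothetical phenotype is driven by the rule `x1 and not x2`, and a brute-force search over a handful of logic expressions recovers it. This only illustrates the idea; actual logic regression searches over logic trees with simulated annealing inside a regression model.

```python
# Illustrative sketch (not the authors' code): a phenotype generated by a
# Boolean combination of two SNP indicators, recovered by scoring candidate
# logic expressions against the observed outcome.
import random

random.seed(0)
snps = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(1000)]
phenotype = [int(x1 and not x2) for x1, x2 in snps]   # Boolean ground truth

exprs = {
    "x1 and x2": lambda a, b: a and b,
    "x1 and not x2": lambda a, b: a and not b,
    "x1 or x2": lambda a, b: a or b,
    "x1 xor x2": lambda a, b: a != b,
}
best = max(exprs, key=lambda name: sum(
    int(bool(exprs[name](a, b))) == y for (a, b), y in zip(snps, phenotype)))
print(best)  # "x1 and not x2": matches the generating rule
```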

  10. An iLab for Teaching Advanced Logic Concepts with Hardware Descriptive Languages

    ERIC Educational Resources Information Center

    Ayodele, Kayode P.; Inyang, Isaac A.; Kehinde, Lawrence O.

    2015-01-01

    One of the more interesting approaches to teaching advanced logic concepts is the use of online laboratory frameworks to provide student access to remote field-programmable devices. There is as yet, however, no conclusive evidence of the effectiveness of such an approach. This paper presents the Advanced Digital Lab, a remote laboratory based on…

  11. A logical foundation for representation of clinical data.

    PubMed Central

    Campbell, K E; Das, A K; Musen, M A

    1994-01-01

    OBJECTIVE: A general framework for representation of clinical data that provides a declarative semantics of terms and that allows developers to define explicitly the relationships among both terms and combinations of terms. DESIGN: Use of conceptual graphs as a standard representation of logic and of an existing standardized vocabulary, the Systematized Nomenclature of Medicine (SNOMED International), for lexical elements. Concepts such as time, anatomy, and uncertainty must be modeled explicitly in a way that allows relation of these foundational concepts to surface-level clinical descriptions in a uniform manner. RESULTS: The proposed framework was used to model a simple radiology report, which included temporal references. CONCLUSION: Formal logic provides a framework for formalizing the representation of medical concepts. Actual implementations will be required to evaluate the practicality of this approach. PMID:7719805

  12. Alternating-Offers Protocol for Multi-issue Bilateral Negotiation in Semantic-Enabled Marketplaces

    NASA Astrophysics Data System (ADS)

    Ragone, Azzurra; di Noia, Tommaso; di Sciascio, Eugenio; Donini, Francesco M.

    We present a semantic-based approach to multi-issue bilateral negotiation for e-commerce. We use Description Logics to model advertisements, and relations among issues as axioms in a TBox. We then introduce a logic-based alternating-offers protocol, able to handle conflicting information, that merges non-standard reasoning services in Description Logics with utility theory to find the most suitable agreements. We illustrate and motivate the theoretical framework, the logical language, and the negotiation protocol.

  13. Response-Time Tests of Logical-Rule Models of Categorization

    ERIC Educational Resources Information Center

    Little, Daniel R.; Nosofsky, Robert M.; Denton, Stephen E.

    2011-01-01

    A recent resurgence in logical-rule theories of categorization has motivated the development of a class of models that predict not only choice probabilities but also categorization response times (RTs; Fific, Little, & Nosofsky, 2010). The new models combine mental-architecture and random-walk approaches within an integrated framework and…

  14. A logic-based dynamic modeling approach to explicate the evolution of the central dogma of molecular biology.

    PubMed

    Jafari, Mohieddin; Ansari-Pour, Naser; Azimzadeh, Sadegh; Mirzaie, Mehdi

    It is nearly half a century since the introduction of the Central Dogma (CD) of molecular biology. This biological axiom has been developed and currently appears to be all the more complex. In this study, we modified CD by adding further species to the CD information flow and mathematically expressed CD within a dynamic framework by using a Boolean network based on its present-day and 1965 editions. We show that the enhancement of the Dogma not only now entails a higher level of complexity, but it also shows a higher level of robustness, thus far more consistent with the nature of biological systems. Using this mathematical modeling approach, we put forward a logic-based expression of our conceptual view of molecular biology. Finally, we show that such biological concepts can be converted into dynamic mathematical models using a logic-based approach and thus may be useful as a framework for improving static conceptual models in biology.
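
    As a concrete picture of the Boolean-network formalism invoked here, the sketch below steps a toy three-node network (DNA, mRNA, protein) under synchronous updates. The wiring is a deliberately simplified stand-in, not the paper's actual present-day CD network.

```python
# Minimal synchronous Boolean network sketch; the update rules below are a
# toy caricature of the classical information flow, invented for illustration.
def step(state):
    return {
        "DNA": state["DNA"],        # replication: DNA persists
        "mRNA": state["DNA"],       # transcription: mRNA follows DNA
        "protein": state["mRNA"],   # translation: protein follows mRNA
    }

state = {"DNA": True, "mRNA": False, "protein": False}
for t in range(4):
    print(t, state)                 # the signal propagates one layer per step
    state = step(state)
```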

  15. A logic-based dynamic modeling approach to explicate the evolution of the central dogma of molecular biology

    PubMed Central

    Jafari, Mohieddin; Ansari-Pour, Naser; Azimzadeh, Sadegh; Mirzaie, Mehdi

    2017-01-01

    It is nearly half a century since the introduction of the Central Dogma (CD) of molecular biology. This biological axiom has been developed and currently appears to be all the more complex. In this study, we modified CD by adding further species to the CD information flow and mathematically expressed CD within a dynamic framework by using a Boolean network based on its present-day and 1965 editions. We show that the enhancement of the Dogma not only now entails a higher level of complexity, but it also shows a higher level of robustness, thus far more consistent with the nature of biological systems. Using this mathematical modeling approach, we put forward a logic-based expression of our conceptual view of molecular biology. Finally, we show that such biological concepts can be converted into dynamic mathematical models using a logic-based approach and thus may be useful as a framework for improving static conceptual models in biology. PMID:29267315

  16. Implementing neural nets with programmable logic

    NASA Technical Reports Server (NTRS)

    Vidal, Jacques J.

    1988-01-01

    Networks of Boolean programmable logic modules are presented as one purely digital class of artificial neural nets. The approach contrasts with the continuous analog framework usually suggested. Programmable logic networks are capable of handling many neural-net applications. They avoid some of the limitations of threshold logic networks and present distinct opportunities. The network nodes are called dynamically programmable logic modules. They can be implemented with digitally controlled demultiplexers. Each node performs a Boolean function of its inputs which can be dynamically assigned. The overall network is therefore a combinational circuit and its outputs are Boolean global functions of the network's input variables. The approach offers definite advantages for VLSI implementation, namely, a regular architecture with limited connectivity, simplicity of the control machinery, natural modularity, and the support of a mature technology.
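
    The "dynamically programmable logic module" can be sketched in a few lines: a node whose Boolean function is a truth table addressed by its inputs, which is exactly what a digitally controlled demultiplexer (or lookup table) realizes in hardware. The code below is an illustrative software analogue, not the paper's design.

```python
# Sketch of a programmable logic node: the node's function is a 2**n-entry
# truth table, and the inputs act as the demultiplexer's select lines.
def make_node(truth_table):
    """truth_table: tuple of 2**n output bits, indexed by the n input bits."""
    def node(*inputs):
        index = 0
        for bit in inputs:
            index = (index << 1) | bit   # inputs select a table entry
        return truth_table[index]
    return node

xor = make_node((0, 1, 1, 0))            # reprogram a node by swapping tables
nand = make_node((1, 1, 1, 0))
print(xor(1, 0), nand(1, 1))             # 1 0
```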

  17. Interdisciplinary collaboration in gerontology and geriatrics in Latin America: conceptual approaches and health care teams.

    PubMed

    Gomez, Fernando; Curcio, Carmen Lucia

    2013-01-01

    The underlying rationale to support interdisciplinary collaboration in geriatrics and gerontology is based on the complexity of elderly care. The most important characteristic of interdisciplinary health care teams for older people in Latin America is their subjective-basis framework. Whereas in other regions teams are organized according to a theoretical knowledge basis with well-justified priorities, functions, and long-term goals, in Latin America teams are arranged according to subjective interests in solving their problems. Three distinct approaches to interdisciplinary collaboration in gerontology are proposed. The first, denominated the "logical-rational approach," is grounded in the scientific rationalism of European origin; its core is to identify the significance of knowledge. The second, denominated the "logical-instrumental approach," is grounded in pragmatism and is more associated with a North American tradition; its core consists of enhancing the skills and competences of each participant. The third, denominated the "logical-subjective approach," has a Latin American origin; its core consists of taking into account the internal and emotional dimensions of the team. These conceptual frameworks based on geographical contexts will permit establishing the differences and shared characteristics of interdisciplinary collaboration in geriatrics and gerontology in order to look for operational answers to the "complex problems" of older adults.

  18. Runtime Analysis of Linear Temporal Logic Specifications

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus

    2001-01-01

    This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
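
    To make the observer idea concrete, the sketch below monitors one illustrative finite-trace property, G(req -> F ack), with a single bit of observer state. The property and proposition names are invented; the report's tool derives such observers automatically from arbitrary LTL formulae.

```python
# Finite-trace monitoring sketch for G(req -> F ack): track whether some
# request is still awaiting an acknowledgement when the trace ends.
def monitor(trace):
    pending = False
    for state in trace:                  # each state: set of atomic propositions
        if "ack" in state:
            pending = False              # acknowledgement discharges requests
        if "req" in state and "ack" not in state:
            pending = True               # a new obligation is opened
    return not pending                   # verdict at end of the finite trace

print(monitor([{"req"}, set(), {"ack"}]))   # True
print(monitor([{"req"}, set(), set()]))     # False: req never acknowledged
```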

  19. Applying gene regulatory network logic to the evolution of social behavior.

    PubMed

    Baran, Nicole M; McGrath, Patrick T; Streelman, J Todd

    2017-06-06

    Animal behavior is ultimately the product of gene regulatory networks (GRNs) for brain development and neural networks for brain function. The GRN approach has advanced the fields of genomics and development, and we identify organizational similarities between networks of genes that build the brain and networks of neurons that encode brain function. In this perspective, we engage the analogy between developmental networks and neural networks, exploring the advantages of using GRN logic to study behavior. Applying the GRN approach to the brain and behavior provides a quantitative and manipulative framework for discovery. We illustrate features of this framework using the example of social behavior and the neural circuitry of aggression.

  20. Logical Modeling and Dynamical Analysis of Cellular Networks

    PubMed Central

    Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine

    2016-01-01

    The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle. PMID:27303434
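
    In the simplest synchronous case, the attractor computations surveyed here reduce to iterating the update function until a state repeats. The sketch below does this for a hypothetical two-gene mutual-inhibition model; the tools the review covers handle far larger networks with symbolic methods.

```python
# Attractor detection sketch for a toy synchronous logical model in which
# each of two genes inhibits the other (wiring invented for illustration).
def step(state):
    a, b = state
    return (int(not b), int(not a))

def attractor(state):
    seen = {}
    while state not in seen:            # iterate until a state repeats
        seen[state] = len(seen)
        state = step(state)
    cycle_start = seen[state]
    return [s for s, i in seen.items() if i >= cycle_start]

for init in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(init, "->", attractor(init))
# (0,1) and (1,0) are fixed points; (0,0) and (1,1) fall into a 2-cycle.
```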

  1. Developing a performance measurement framework and indicators for community health service facilities in urban China.

    PubMed

    Wong, Sabrina T; Yin, Delu; Bhattacharyya, Onil; Wang, Bin; Liu, Liqun; Chen, Bowen

    2010-11-18

    China has had no effective and systematic information system to provide guidance for strengthening PHC (Primary Health Care) or account to citizens on progress. We report on the development of the China results-based Logic Model for Community Health Facilities and Stations (CHS) and a set of relevant PHC indicators intended to measure CHS priorities. We adapted the PHC Results Based Logic Model developed in Canada and current work conducted in the community health system in China to create the China CHS Logic Model framework. We used a staged approach by first constructing the framework and indicators and then validating their content through an interactive process involving policy analysis, critical review of relevant literature and multiple stakeholder consultation. The China CHS Logic Model includes inputs, activities, outputs and outcomes with a total of 287 detailed performance indicators. Of these indicators, 31 measure inputs, 64 measure activities, 105 measure outputs, and 87 measure immediate (n = 65), intermediate (n = 15), or final (n = 7) outcomes. A Logic Model framework can be useful in planning, implementation, analysis and evaluation of PHC at a system and service level. The development and content validation of the China CHS Logic Model and subsequent indicators provides a means for stronger accountability and a clearer sense of overall direction and purpose needed to renew and strengthen the PHC system in China. Moreover, this work will be useful in moving towards developing a PHC information system and performance measurement across districts in urban China, and guiding the pursuit of quality in PHC.
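
    For readers unfamiliar with logic models, the indicator counts reported here can be arranged as a simple nested structure; the sketch below merely re-expresses the numbers from this record and checks that they total 287.

```python
# Plain-data restatement of the logic-model levels and indicator counts
# reported in this record (counts taken directly from the abstract).
logic_model = {
    "inputs": 31,
    "activities": 64,
    "outputs": 105,
    "outcomes": {"immediate": 65, "intermediate": 15, "final": 7},
}
total = (logic_model["inputs"] + logic_model["activities"]
         + logic_model["outputs"] + sum(logic_model["outcomes"].values()))
print(total)  # 287 detailed performance indicators
```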

  2. Developing a Performance Measurement Framework and Indicators for Community Health Service Facilities in Urban China

    PubMed Central

    2010-01-01

    Background China has had no effective and systematic information system to provide guidance for strengthening PHC (Primary Health Care) or account to citizens on progress. We report on the development of the China results-based Logic Model for Community Health Facilities and Stations (CHS) and a set of relevant PHC indicators intended to measure CHS priorities. Methods We adapted the PHC Results Based Logic Model developed in Canada and current work conducted in the community health system in China to create the China CHS Logic Model framework. We used a staged approach by first constructing the framework and indicators and then validating their content through an interactive process involving policy analysis, critical review of relevant literature and multiple stakeholder consultation. Results The China CHS Logic Model includes inputs, activities, outputs and outcomes with a total of 287 detailed performance indicators. Of these indicators, 31 measure inputs, 64 measure activities, 105 measure outputs, and 87 measure immediate (n = 65), intermediate (n = 15), or final (n = 7) outcomes. Conclusion A Logic Model framework can be useful in planning, implementation, analysis and evaluation of PHC at a system and service level. The development and content validation of the China CHS Logic Model and subsequent indicators provides a means for stronger accountability and a clearer sense of overall direction and purpose needed to renew and strengthen the PHC system in China. Moreover, this work will be useful in moving towards developing a PHC information system and performance measurement across districts in urban China, and guiding the pursuit of quality in PHC. PMID:21087516

  3. Disability Policy Evaluation: Combining Logic Models and Systems Thinking.

    PubMed

    Claes, Claudia; Ferket, Neelke; Vandevelde, Stijn; Verlet, Dries; De Maeyer, Jessica

    2017-07-01

    Policy evaluation focuses on the assessment of policy-related personal, family, and societal changes or benefits that follow as a result of the interventions, services, and supports provided to those persons to whom the policy is directed. This article describes a systematic approach to policy evaluation based on an evaluation framework and an evaluation process that combine the use of logic models and systems thinking. The article also includes an example of how the framework and process have recently been used in policy development and evaluation in Flanders (Belgium), as well as four policy evaluation guidelines based on relevant published literature.

  4. A logic model framework for evaluation and planning in a primary care practice-based research network (PBRN)

    PubMed Central

    Hayes, Holly; Parchman, Michael L.; Howard, Ray

    2012-01-01

    Evaluating effective growth and development of a Practice-Based Research Network (PBRN) can be challenging. The purpose of this article is to describe the development of a logic model and how the framework has been used for planning and evaluation in a primary care PBRN. An evaluation team was formed consisting of the PBRN directors, staff and its board members. After the mission and the target audience were determined, facilitated meetings and discussions were held with stakeholders to identify the assumptions, inputs, activities, outputs, outcomes and outcome indicators. The long-term outcomes outlined in the final logic model are two-fold: 1.) Improved health outcomes of patients served by PBRN community clinicians; and 2.) Community clinicians are recognized leaders of quality research projects. The Logic Model proved useful in identifying stakeholder interests and dissemination activities as an area that required more attention in the PBRN. The logic model approach is a useful planning tool and project management resource that increases the probability that the PBRN mission will be successfully implemented. PMID:21900441

  5. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  6. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  7. Technical Assistance Model for Long-Term Systems Change: Three State Examples

    ERIC Educational Resources Information Center

    Kasprzak, Christina; Hurth, Joicey; Lucas, Anne; Marshall, Jacqueline; Terrell, Adriane; Jones, Elizabeth

    2010-01-01

    The National Early Childhood Technical Assistance Center (NECTAC) Technical Assistance (TA) Model for Long-Term Systems Change (LTSC) is grounded in conceptual frameworks in the literature on systems change and systems thinking. The NECTAC conceptual framework uses a logic model approach to change developed specifically for states' infant and…

  8. A constraint logic programming approach to associate 1D and 3D structural components for large protein complexes.

    PubMed

    Dal Palù, Alessandro; Pontelli, Enrico; He, Jing; Lu, Yonggang

    2007-01-01

    The paper describes a novel framework, constructed using Constraint Logic Programming (CLP) and parallelism, to determine the association between parts of the primary sequence of a protein and alpha-helices extracted from 3D low-resolution descriptions of large protein complexes. The association is determined by extracting constraints from the 3D information, regarding length, relative position and connectivity of helices, and solving these constraints with the guidance of a secondary structure prediction algorithm. Parallelism is employed to enhance performance on large proteins. The framework provides a fast, inexpensive alternative to determine the exact tertiary structure of unknown proteins.
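
    The core association task can be pictured as a small finite-domain constraint problem: assign helix lengths observed in the 3D map to predicted sequence segments under ordering and length-tolerance constraints. The sketch below brute-forces a toy instance with invented numbers; the actual system solves far larger instances with Constraint Logic Programming and parallel search.

```python
# Toy constraint-solving sketch of the helix-association problem; segment
# positions, observed lengths, and the tolerance are invented for illustration.
from itertools import permutations

predicted_segments = [(5, 18), (30, 44), (60, 70)]   # (start, end) residues
observed_lengths = [14, 11, 15]                       # helix lengths from 3D map
TOLERANCE = 2

def consistent(assignment):
    # each observed helix must roughly match its segment's residue count
    return all(abs((end - start + 1) - length) <= TOLERANCE
               for (start, end), length in zip(predicted_segments, assignment))

solutions = [p for p in permutations(observed_lengths) if consistent(p)]
print(solutions)  # [(14, 15, 11)]: the one ordering compatible with the sequence
```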

  9. Towards An Engineering Discipline of Computational Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mili, Ali; Sheldon, Frederick T; Jilani, Lamia Labed

    2007-01-01

    George Boole ushered in the era of modern logic by arguing that logical reasoning does not fall in the realm of philosophy, as it was considered up to his time, but in the realm of mathematics. As such, logical propositions and logical arguments are modeled using algebraic structures. Likewise, we submit that security attributes must be modeled as formal mathematical propositions that are subject to mathematical analysis. In this paper, we approach this problem by attempting to model security attributes in a refinement-like framework that has traditionally been used to represent reliability and safety claims. Keywords: Computable security attributes, survivability, integrity, dependability, reliability, safety, security, verification, testing, fault tolerance.

  10. Cross-Cultural Counseling and Cross-Cultural Meanings: An Exploration of Morita Psychotherapy.

    ERIC Educational Resources Information Center

    Aldous, Jane L.

    1994-01-01

    Describes theoretical framework and techniques of Morita psychotherapy. Western research indicates that Asian American clients prefer active-directive, logical, rational, and structured approaches. Suggests that ethnocentric counseling approaches may be imposed upon clients of Asian origin because meanings attached to terms describing counseling…

  11. Rethinking Social Barriers to Effective Adaptive Management.

    PubMed

    West, Simon; Schultz, Lisen; Bekessy, Sarah

    2016-09-01

    Adaptive management is an approach to environmental management based on learning-by-doing, where complexity, uncertainty, and incomplete knowledge are acknowledged and management actions are treated as experiments. However, while adaptive management has received significant uptake in theory, it remains elusively difficult to enact in practice. Proponents have blamed social barriers and have called for social science contributions. We address this gap by adopting a qualitative approach to explore the development of an ecological monitoring program within an adaptive management framework in a public land management organization in Australia. We ask what practices are used to enact the monitoring program and how do they shape learning? We elicit a rich narrative through extensive interviews with a key individual, and analyze the narrative using thematic analysis. We discuss our results in relation to the concept of 'knowledge work' and Westley's (2002) framework for interpreting the strategies of adaptive managers—'managing through, in, out and up.' We find that enacting the program is conditioned by distinct and sometimes competing logics—scientific logics prioritizing experimentation and learning, public logics emphasizing accountability and legitimacy, and corporate logics demanding efficiency and effectiveness. In this context, implementing adaptive management entails practices of translation to negotiate tensions between objective and situated knowledge, external experts and organizational staff, and collegiate and hierarchical norms. Our contribution embraces the 'doing' of learning-by-doing and marks a shift from conceptualizing the social as an external barrier to adaptive management to be removed to an approach that situates adaptive management as social knowledge practice.

  12. Logic regression and its extensions.

    PubMed

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. Theory Learning as Stochastic Search in the Language of Thought

    ERIC Educational Resources Information Center

    Ullman, Tomer D.; Goodman, Noah D.; Tenenbaum, Joshua B.

    2012-01-01

    We present an algorithmic model for the development of children's intuitive theories within a hierarchical Bayesian framework, where theories are described as sets of logical laws generated by a probabilistic context-free grammar. We contrast our approach with connectionist and other emergentist approaches to modeling cognitive development. While…

  14. Automata-Based Verification of Temporal Properties on Running Programs

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)

    2001-01-01

    This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.

  15. 'Healthy Eating and Lifestyle in Pregnancy (HELP)' trial: Process evaluation framework.

    PubMed

    Simpson, Sharon A; Cassidy, Dunla; John, Elinor

    2014-07-01

    We developed and tested in a cluster RCT a theory-driven group-based intervention for obese pregnant women. It was designed to support women to moderate weight gain during pregnancy and reduce BMI one year after birth, in addition to targeting secondary health and wellbeing outcomes. In line with MRC guidance on developing and evaluating complex interventions in health, we conducted a process evaluation alongside the trial. This paper describes the development of the process evaluation framework. This cluster RCT recruited 598 pregnant women. Women in the intervention group were invited to attend a weekly weight-management group. Following a review of relevant literature, we developed a process evaluation framework which outlined key process indicators that we wanted to address and how we would measure these. Central to the process evaluation was to understand the mechanism of effect of the intervention. We utilised a logic-modelling approach to describe the intervention which helped us focus on what potential mediators of intervention effect to measure, and how. The resulting process evaluation framework was designed to address the core elements of context, reach, exposure, recruitment, fidelity, retention, contamination and theory-testing. These were assessed using a variety of qualitative and quantitative approaches. The logic model explained the processes by which intervention components bring about change in target outcomes through various mediators and theoretical pathways including self-efficacy, social support, self-regulation and motivation. Process evaluation is a key element in assessing the effect of any RCT. We developed a process evaluation framework and logic model, and the results of analyses using these will offer insights into why the intervention is or is not effective. Copyright © 2014.

  16. Interpreting Abstract Interpretations in Membership Equational Logic

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Rosu, Grigore

    2001-01-01

    We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic which extends equational logics by membership axioms, asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.

  17. Involving Users to Improve the Collaborative Logical Framework

    PubMed Central

    2014-01-01

    In order to support collaboration in web-based learning, there is a need for an intelligent support that facilitates its management during the design, development, and analysis of the collaborative learning experience and supports both students and instructors. At the aDeNu research group we have proposed the Collaborative Logical Framework (CLF) to create effective scenarios that support learning through interaction, exploration, discussion, and collaborative knowledge construction. This approach draws on artificial intelligence techniques to support and foster an effective involvement of students to collaborate. At the same time, the instructors' workload is reduced as some of their tasks—especially those related to the monitoring of the students' behavior—are automated. After introducing the CLF approach, in this paper, we present two formative evaluations with users carried out to improve the design of this collaborative tool and thus enrich the personalized support provided. In the first one, we analyze, following the layered evaluation approach, the results of an observational study with 56 participants. In the second one, we tested the infrastructure to gather emotional data when carrying out another observational study with 17 participants. PMID:24592196

  18. Improving ontology matching with propagation strategy and user feedback

    NASA Astrophysics Data System (ADS)

    Li, Chunhua; Cui, Zhiming; Zhao, Pengpeng; Wu, Jian; Xin, Jie; He, Tianxu

    2015-07-01

    Markov logic networks, which unify probabilistic graphical models and first-order logic, provide an excellent framework for ontology matching. The existing approach requires a threshold to produce matching candidates and uses a small set of constraints acting as a filter to select the final alignments. We introduce a novel match propagation strategy to model the influences between potential entity mappings across ontologies, which can help to identify correct correspondences and recover missed ones. The estimation of an appropriate threshold is a difficult task. We propose an interactive method for threshold selection through which we obtain an additional measurable improvement. Experiments on a public dataset have demonstrated the effectiveness of the proposed approach in terms of the quality of the resulting alignment.

  19. Ecological resilience in lakes and the conjunction fallacy.

    PubMed

    Spears, Bryan M; Futter, Martyn N; Jeppesen, Erik; Huser, Brian J; Ives, Stephen; Davidson, Thomas A; Adrian, Rita; Angeler, David G; Burthe, Sarah J; Carvalho, Laurence; Daunt, Francis; Gsell, Alena S; Hessen, Dag O; Janssen, Annette B G; Mackay, Eleanor B; May, Linda; Moorhouse, Heather; Olsen, Saara; Søndergaard, Martin; Woods, Helen; Thackeray, Stephen J

    2017-11-01

    There is a pressing need to apply stability and resilience theory to environmental management to restore degraded ecosystems effectively and to mitigate the effects of impending environmental change. Lakes represent excellent model case studies in this respect and have been used widely to demonstrate theories of ecological stability and resilience that are needed to underpin preventative management approaches. However, we argue that this approach is not yet fully developed because the pursuit of empirical evidence to underpin such theoretically grounded management continues in the absence of an objective probability framework. This has blurred the lines between intuitive logic (based on the elementary principles of probability) and extensional logic (based on assumption and belief) in this field.
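
    The conjunction rule at issue is elementary: P(A and B) = P(A)P(B|A) <= P(A), so a conjunction can never be more probable than either conjunct. The sketch below works one example with invented numbers.

```python
# Worked example of the probability rule behind the "conjunction fallacy"
# named in this record; the numbers are illustrative only.
p_recovery = 0.6                      # P(A)
p_clear_water_given_recovery = 0.5    # P(B | A)
p_both = p_recovery * p_clear_water_given_recovery   # P(A and B)
assert p_both <= p_recovery           # the conjunction cannot exceed a conjunct
print(p_both)  # 0.3 <= 0.6: judging P(A and B) > P(A) is the fallacy
```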

  20. A Rewriting Logic Approach to Type Inference

    NASA Astrophysics Data System (ADS)

    Ellison, Chucky; Şerbănuţă, Traian Florin; Roşu, Grigore

    Meseguer and Roşu proposed rewriting logic semantics (RLS) as a programming language definitional framework that unifies operational and algebraic denotational semantics. RLS has already been used to define a series of didactic and real languages, but its benefits in connection with defining and reasoning about type systems have not been fully investigated. This paper shows how the same RLS style employed for giving formal definitions of languages can be used to define type systems. The same term-rewriting mechanism used to execute RLS language definitions can now be used to execute type systems, giving type checkers or type inferencers. The proposed approach is exemplified by defining the Hindley-Milner polymorphic type inferencer W as a rewrite logic theory and using this definition to obtain a type inferencer by executing it in a rewriting logic engine. The inferencer obtained this way compares favorably with other definitions or implementations of W. The performance of the executable definition is within an order of magnitude of that of highly optimized implementations of type inferencers, such as that of OCaml.
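
    For orientation, the core of W is unification-driven type reconstruction. The sketch below is a heavily simplified, let-free version (no let-polymorphism, no occurs check) written directly in Python rather than as a rewrite theory; it shows the mechanism the paper encodes in rewriting logic, not the paper's own definition.

```python
# Minimal Hindley-Milner-style inference sketch for plain lambda terms.
class TVar:
    count = 0
    def __init__(self):
        TVar.count += 1
        self.name = f"t{TVar.count}"
    def __repr__(self):
        return self.name

def prune(t, subst):
    # follow substitution chains to the representative type
    while isinstance(t, TVar) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    a, b = prune(a, subst), prune(b, subst)
    if isinstance(a, TVar):
        if a is not b:
            subst[a] = b                 # no occurs check: a simplification
    elif isinstance(b, TVar):
        subst[b] = a
    elif isinstance(a, tuple) and isinstance(b, tuple):  # ("->", arg, result)
        unify(a[1], b[1], subst)
        unify(a[2], b[2], subst)
    elif a != b:
        raise TypeError(f"cannot unify {a} and {b}")

def infer(term, env, subst):
    kind = term[0]
    if kind == "var":                    # ("var", name)
        return env[term[1]]
    if kind == "lam":                    # ("lam", param, body)
        tv = TVar()
        body_type = infer(term[2], {**env, term[1]: tv}, subst)
        return ("->", tv, body_type)
    if kind == "app":                    # ("app", fun, arg)
        fun_type = infer(term[1], env, subst)
        arg_type = infer(term[2], env, subst)
        result = TVar()
        unify(fun_type, ("->", arg_type, result), subst)
        return prune(result, subst)
    raise ValueError(f"unknown term: {term}")

subst = {}
identity = ("lam", "x", ("var", "x"))
print(infer(identity, {}, subst))                      # ('->', t1, t1)
print(infer(("app", identity,
             ("lam", "y", ("var", "y"))), {}, subst))  # ('->', t3, t3)
```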

  1. Minimally inconsistent reasoning in Semantic Web.

    PubMed

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions by tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, which shows that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed, as a framework for multi-valued DL, to allow for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning.

  2. Minimally inconsistent reasoning in Semantic Web

    PubMed Central

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions by tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, which shows that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed, as a framework for multi-valued DL, to allow for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning. PMID:28750030

  3. Runtime verification of embedded real-time systems.

    PubMed

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability, thus facilitating applications of the framework in both a prototyping and a post-deployment phase of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
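
    The flavor of a time-bounded past-time observer can be conveyed with a direct software formulation of p Since[a,b] q: remember the most recent time q held that is still connected to now by an unbroken run of p. The sketch below is a constant-time-per-step illustration, not the report's hardware algorithm or its complexity-optimized data structures.

```python
# Online observer sketch for p Since[a,b] q in discrete time: "q held between
# a and b steps ago, and p has held at every step since then."
def since_observer(trace, a, b):
    """trace: list of (p, q) booleans per step; yields one verdict per step."""
    last_q = None                    # latest q-time still bridged by a p-run
    for now, (p, q) in enumerate(trace):
        if q:
            last_q = now
        elif not p:
            last_q = None            # p broke: older q's are disconnected
        yield last_q is not None and a <= now - last_q <= b

trace = [(False, True), (True, False), (True, False), (False, False)]
print(list(since_observer(trace, 1, 2)))  # [False, True, True, False]
```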

  4. Rethinking Social Barriers to Effective Adaptive Management

    NASA Astrophysics Data System (ADS)

    West, Simon; Schultz, Lisen; Bekessy, Sarah

    2016-09-01

    Adaptive management is an approach to environmental management based on learning-by-doing, where complexity, uncertainty, and incomplete knowledge are acknowledged and management actions are treated as experiments. However, while adaptive management has received significant uptake in theory, it remains elusively difficult to enact in practice. Proponents have blamed social barriers and have called for social science contributions. We address this gap by adopting a qualitative approach to explore the development of an ecological monitoring program within an adaptive management framework in a public land management organization in Australia. We ask what practices are used to enact the monitoring program and how do they shape learning? We elicit a rich narrative through extensive interviews with a key individual, and analyze the narrative using thematic analysis. We discuss our results in relation to the concept of 'knowledge work' and Westley's (2002) framework for interpreting the strategies of adaptive managers—'managing through, in, out and up.' We find that enacting the program is conditioned by distinct and sometimes competing logics—scientific logics prioritizing experimentation and learning, public logics emphasizing accountability and legitimacy, and corporate logics demanding efficiency and effectiveness. In this context, implementing adaptive management entails practices of translation to negotiate tensions between objective and situated knowledge, external experts and organizational staff, and collegiate and hierarchical norms. Our contribution embraces the 'doing' of learning-by-doing and marks a shift from conceptualizing the social as an external barrier to adaptive management to be removed to an approach that situates adaptive management as social knowledge practice.

  5. First-order logic theory for manipulating clinical practice guidelines applied to comorbid patients: a case study.

    PubMed

    Michalowski, Martin; Wilk, Szymon; Tan, Xing; Michalowski, Wojtek

    2014-01-01

    Clinical practice guidelines (CPGs) implement evidence-based medicine designed to help generate a therapy for a patient suffering from a single disease. When applied to a comorbid patient, the concurrent combination of treatment steps from multiple CPGs is susceptible to adverse interactions in the resulting combined therapy (i.e., a therapy established according to all considered CPGs). This inability to concurrently apply CPGs has been shown to be one of the key shortcomings of CPG uptake in a clinical setting [1]. Several research efforts are underway to address this issue, such as the K4CARE [2] and GuideLine INteraction Detection Assistant (GLINDA) [3] projects and our previous research on applying constraint logic programming to developing a consistent combined therapy for a comorbid patient [4]. However, there is no generalized framework for mitigation that effectively captures general characteristics of the problem while handling nuances such as time and ordering requirements imposed by specific CPGs. In this paper we propose a first-order logic-based (FOL) approach for developing a generalized framework of mitigation. This approach uses a meta-algorithm and entailment properties to mitigate (i.e., identify and address) adverse interactions introduced by concurrently applied CPGs. We use an illustrative case study of a patient suffering from type 2 diabetes being treated for an onset of severe rheumatoid arthritis to show the expressiveness and robustness of our proposed FOL-based approach, and we discuss its appropriateness as the basis for the generalized theory.

  6. Evaluation of properties over phylogenetic trees using stochastic logics.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2016-06-14

    Model checking has been recently introduced as an integrated framework for extracting information from phylogenetic trees using temporal logics as a querying language, an extension of modal logics that imposes restrictions on a Boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as different ways that DNA can be changed), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we desire to inspect over the phylogeny, the verifier returns true if the specification is satisfied or a counterexample that falsifies it. However, this approach has only been considered over qualitative aspects of the phylogeny. In this paper, we address the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that couldn't be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software for optimizing the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties over phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, sparing biologists the extension, upgrade, debugging and maintenance of a software tool. A set of benchmarks justifies the feasibility of our approach.
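
    The "likelihood of a tree under a mutation model" that the authors obtain via PRISM is, in its textbook form, Felsenstein's pruning recursion. The sketch below computes it for a toy two-state model on a three-leaf tree with invented branch lengths; it illustrates the quantity itself, not the model-checking encoding.

```python
# Felsenstein pruning sketch: likelihood of leaf states under a symmetric
# two-state (0/1) mutation model. Topology, rates and branch lengths invented.
import math

def transition(t, rate=1.0):
    """Two-state symmetric model: probabilities of staying/flipping after time t."""
    flip = 0.5 * (1.0 - math.exp(-2.0 * rate * t))
    return [[1 - flip, flip], [flip, 1 - flip]]

def likelihood(node):
    """node: observed state (int leaf) or (left, t_left, right, t_right)."""
    if isinstance(node, int):
        return [1.0 if s == node else 0.0 for s in (0, 1)]
    left, tl, right, tr = node
    L, R = likelihood(left), likelihood(right)
    Pl, Pr = transition(tl), transition(tr)
    return [sum(Pl[s][x] * L[x] for x in (0, 1)) *
            sum(Pr[s][y] * R[y] for y in (0, 1)) for s in (0, 1)]

tree = ((0, 0.1, 0, 0.2), 0.3, 1, 0.4)    # ((leaf, t, leaf, t), t, leaf, t)
root = likelihood(tree)
print(0.5 * root[0] + 0.5 * root[1])      # likelihood under a uniform root prior
```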

  7. The Seven Silos of Accountability in Higher Education: Systematizing Multiple Logics and Fields

    ERIC Educational Resources Information Center

    Brown, Joshua Travis

    2017-01-01

    Higher education accountability is a field characterized by complexity. Prior frameworks grounded in psychometrics, economics, and history fall short in explaining the persistence and composition of its complexity. This article employs organizational theory to identify the multiple conflicting approaches of higher education accountability and…

  8. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing MathWorks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
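
    The second step, deciding safety over the compiled machine, reduces in the simplest (flattened) case to reachability. A toy Python sketch follows; the machine, states and unsafe set are invented, and the paper's engine handles hierarchical machines and far richer logic:

        # A toy version of the logic-based safety check: explore all reachable
        # states of a flattened state machine and verify that no "unsafe"
        # state is reachable. The machine below is an invented example.
        from collections import deque

        TRANSITIONS = {            # state -> set of successor states
            "Off":     {"Standby"},
            "Standby": {"Running", "Off"},
            "Running": {"Standby", "Fault"},
            "Fault":   {"Off"},
        }
        UNSAFE = {"Fault"}

        def check_safety(initial):
            seen, frontier = {initial}, deque([initial])
            while frontier:
                state = frontier.popleft()
                if state in UNSAFE:
                    return False, state          # counterexample state
                for succ in TRANSITIONS.get(state, ()):
                    if succ not in seen:
                        seen.add(succ)
                        frontier.append(succ)
            return True, None

        print(check_safety("Off"))   # (False, 'Fault'): property violated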

  9. An Improved Genetic Fuzzy Logic Control Method to Reduce the Enlargement of Coal Floor Deformation in Shearer Memory Cutting Process

    PubMed Central

    Tan, Chao; Xu, Rongxin; Wang, Zhongbin; Si, Lei; Liu, Xinhua

    2016-01-01

    In order to reduce the enlargement of coal floor deformation and the manual adjustment frequency of rocker arms, an approach integrating an improved genetic algorithm with fuzzy logic control (GFLC) is proposed. The enlargement of coal floor deformation is analyzed and a model is built. Then, the framework of the proposed approach is constructed. Moreover, GA constituents such as tangent function roulette wheel (Tan-RWS) selection, uniform crossover, and nonuniform mutation are employed to enhance the performance of GFLC. Finally, two simulation examples and an industrial application example are carried out, and the results indicate that the proposed method is feasible and efficient. PMID:27217824
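
    As one plausible reading of Tan-RWS (the paper's exact scaling formula is not reproduced here), fitness values can be rescaled through a tangent function before ordinary roulette wheel selection, exaggerating the gap between good and poor individuals:

        # Roulette wheel selection with a tangent-based fitness rescaling, one
        # plausible reading of Tan-RWS; the scaling constant is invented.
        import math, random

        def tan_rws(population, fitness, k=0.45):
            # Normalize fitness to [0, 1], then sharpen with tan(k * pi * x):
            # larger k exaggerates differences between individuals.
            lo, hi = min(fitness), max(fitness)
            span = (hi - lo) or 1.0
            scaled = [math.tan(k * math.pi * (f - lo) / span + 1e-6) for f in fitness]
            total = sum(scaled)
            pick = random.uniform(0, total)
            acc = 0.0
            for individual, s in zip(population, scaled):
                acc += s
                if acc >= pick:
                    return individual
            return population[-1]

        pop = ["a", "b", "c", "d"]
        fit = [1.0, 2.0, 3.0, 8.0]
        print(tan_rws(pop, fit))   # "d" is selected most often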

  10. An Argumentation Framework based on Paraconsistent Logic

    NASA Astrophysics Data System (ADS)

    Umeda, Yuichi; Takahashi, Takehisa; Sawamura, Hajime

    Argumentation is among the most representative of human intelligent activities, so it is natural to think that it could have many implications for artificial intelligence and computer science as well. Specifically, argumentation may be considered a primitive capability for interaction among computational agents. In this paper we present an argumentation framework based on four-valued paraconsistent logic. The tolerance and acceptance of inconsistency that this logic has as its logical feature allow for arguments over the inconsistent knowledge bases with which we are often confronted. We introduce various concepts for argumentation, such as arguments, attack relations, argument justification, and preferential criteria for arguments based on social norms, in a way proper to the four-valued paraconsistent logic. We then provide the fixpoint semantics and dialectical proof theory for our argumentation framework, and give proofs of soundness and completeness.
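
    The four-valued logic underneath the framework is easy to state concretely. In the usual encoding, each value records whether there is evidence for and evidence against a statement; the sketch below implements Belnap-style connectives on that encoding (this is standard four-valued logic, not the paper's full argumentation machinery):

        # Belnap's four-valued logic, encoded as (supports-true, supports-false)
        # pairs; BOTH tolerates inconsistency, which is what lets arguments be
        # built over inconsistent knowledge bases.
        TRUE, FALSE = (1, 0), (0, 1)
        BOTH, NEITHER = (1, 1), (0, 0)

        def neg(a):
            return (a[1], a[0])                  # swap evidence for/against

        def conj(a, b):
            return (a[0] & b[0], a[1] | b[1])    # true if both; false if either

        def disj(a, b):
            return (a[0] | b[0], a[1] & b[1])

        assert conj(BOTH, TRUE) == BOTH          # inconsistency is contained,
        assert conj(BOTH, FALSE) == FALSE        # not propagated everywhere
        assert neg(NEITHER) == NEITHER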

  11. Oral health disparities and the workforce: a framework to guide innovation.

    PubMed

    Hilton, Irene V; Lester, Arlene M

    2010-06-01

    Oral health disparities currently exist in the United States, and workforce innovations have been proposed as one strategy to address them. A framework is needed to logically assess the possible role of the workforce as a contributor to oral health disparities and to analyze workforce strategies addressing the issue. Using an existing framework, A Strategic Framework for Improving Racial/Ethnic Minority Health and Eliminating Racial/Ethnic Health Disparities, workforce was sequentially applied across individual, environmental/community, and system levels to identify long-term problems, contributing factors, strategies/innovation, measurable outcomes/impacts, and long-term goals. Examples of current workforce innovations were applied to the framework. Contributing factors to oral health disparities included lack of racial/ethnic diversity in the workforce, lack of appropriate training, provider distribution, and a non-user-centered system. The framework was applied to selected workforce innovation models, delineating the potential impact on contributing factors across the individual, environmental/community, and system levels. The framework helps to define expected outcomes from workforce models that would contribute to the goal of reducing oral health disparities and to examine impacts across multiple levels. However, the contributing factors to oral health disparities cannot be addressed by workforce innovation alone. The Strategic Framework is a logical approach to guide workforce innovation and solutions, and to identify other aspects of the oral healthcare delivery system that need innovation in order to reduce oral health disparities.

  12. Boolean network identification from perturbation time series data combining dynamics abstraction and logic programming.

    PubMed

    Ostrowski, M; Paulevé, L; Schaub, T; Siegel, A; Guziolowski, C

    2016-11-01

    Boolean networks (and more general logic models) are useful frameworks to study signal transduction across multiple pathways. Logic models can be learned from a prior knowledge network structure and multiplex phosphoproteomics data. However, most efficient and scalable training methods focus on the comparison of two time-points and assume that the system has reached an early steady state. In this paper, we generalize such a learning procedure to take into account the time series traces of phosphoproteomics data in order to discriminate Boolean networks according to their transient dynamics. To that end, we identify a necessary condition that must be satisfied by the dynamics of a Boolean network to be consistent with a discretized time series trace. Based on this condition, we use Answer Set Programming to compute an over-approximation of the set of Boolean networks which fit best with experimental data, and we provide the corresponding encodings. Combined with model-checking approaches, we end up with a global learning algorithm. Our approach is able to learn logic models with a true positive rate higher than 78% in two case studies of mammalian signaling networks; for a larger case study, our method provides optimal answers after 7 minutes of computation. We quantified the gain in prediction precision of our method compared with learning approaches based on static data. Finally, as an application, our method identifies erroneous time-points in the time series data with respect to the optimal learned logic models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
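
    A drastically simplified version of the consistency test is sketched below in Python: it assumes synchronous updates and an invented two-node network, whereas the paper's necessary condition is stated for richer dynamics and is solved with Answer Set Programming rather than by direct simulation.

        # Simplified consistency test between a Boolean network and a
        # discretized time-series trace, assuming synchronous updates.

        def step(state, rules):
            """Synchronously apply each node's update function."""
            return {n: f(state) for n, f in rules.items()}

        def consistent(trace, rules):
            """Does the network reproduce every consecutive pair in the trace?"""
            return all(step(s, rules) == t for s, t in zip(trace, trace[1:]))

        # Invented two-node example: A is inhibited by B, B is activated by A.
        rules = {"A": lambda s: not s["B"], "B": lambda s: s["A"]}
        trace = [{"A": True,  "B": False},
                 {"A": True,  "B": True},
                 {"A": False, "B": True}]
        print(consistent(trace, rules))   # True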

  13. POLE.VAULT: A Semantic Framework for Health Policy Evaluation and Logical Testing.

    PubMed

    Shaban-Nejad, Arash; Okhmatovskaia, Anya; Shin, Eun Kyong; Davis, Robert L; Buckeridge, David L

    2017-01-01

    The major goal of our study is to provide an automatic evaluation framework that aligns the results generated through semantic reasoning with the best available evidence on effective interventions, in order to support the logical evaluation of public health policies. To this end, we have designed the POLicy EVAlUation & Logical Testing (POLE.VAULT) Framework to assist different stakeholders and decision-makers in making informed decisions about health-related interventions, programs and, ultimately, policies, based on contextual knowledge and the best available evidence at both the individual and aggregate levels.

  14. Assessment: Monitoring & Evaluation in a Stabilisation Context

    DTIC Science & Technology

    2010-09-15

    http://www.oecd.org/dataoecd/23/27/35281194.pdf b. SIDA (2004), The Logical Framework Approach. A summary of the theory behind the LFA method...en_21571361_34047972_39774574 _1_1_1_1,00.pdf 3. SIDA (2004), Stefan Molund and Göran Schill, Looking Back, Moving Forward, Sida Evaluation Manual. Available at

  15. Visual unit analysis: a descriptive approach to landscape assessment

    Treesearch

    R. J. Tetlow; S. R. J. Sheppard

    1979-01-01

    Analysis of the visible attributes of landscapes is an important component of the planning process. When landscapes are at regional scale, economical and effective methodologies are critical. The Visual Unit concept appears to offer a logical and useful framework for description and evaluation. The concept subdivides landscape into coherent, spatially-defined units....

  16. Disability Policy Evaluation: Combining Logic Models and Systems Thinking

    ERIC Educational Resources Information Center

    Claes, Claudia; Ferket, Neelke; Vandevelde, Stijn; Verlet, Dries; De Maeyer, Jessica

    2017-01-01

    Policy evaluation focuses on the assessment of policy-related personal, family, and societal changes or benefits that follow as a result of the interventions, services, and supports provided to those persons to whom the policy is directed. This article describes a systematic approach to policy evaluation based on an evaluation framework and an…

  17. Simulation of Automatic Incidents Detection Algorithm on the Transport Network

    ERIC Educational Resources Information Center

    Nikolaev, Andrey B.; Sapego, Yuliya S.; Jakubovich, Anatolij N.; Berner, Leonid I.; Ivakhnenko, Andrey M.

    2016-01-01

    Management of traffic incidents is a functional part of the whole approach to solving traffic problems in the framework of intelligent transport systems. Development of an effective process of traffic incident management is an important part of the transport system. In this research, an algorithm based on fuzzy logic is suggested to detect traffic…

  18. Synthesizing diverse evidence: the use of primary qualitative data analysis methods and logic models in public health reviews.

    PubMed

    Baxter, S; Killoran, A; Kelly, M P; Goyder, E

    2010-02-01

    The nature of public health evidence presents challenges for conventional systematic review processes, with increasing recognition of the need to include a broader range of work including observational studies and qualitative research, yet with methods to combine diverse sources remaining underdeveloped. The objective of this paper is to report the application of a new approach for review of evidence in the public health sphere. The method enables a diverse range of evidence types to be synthesized in order to examine potential relationships between a public health environment and outcomes. The study drew on previous work by the National Institute for Health and Clinical Excellence on conceptual frameworks. It applied and further extended this work to the synthesis of evidence relating to one particular public health area: the enhancement of employee mental well-being in the workplace. The approach utilized thematic analysis techniques from primary research, together with conceptual modelling, to explore potential relationships between factors and outcomes. The method enabled a logic framework to be built from a diverse document set that illustrates how elements and associations between elements may impact on the well-being of employees. Whilst recognizing potential criticisms of the approach, it is suggested that logic models can be a useful way of examining the complexity of relationships between factors and outcomes in public health, and of highlighting potential areas for interventions and further research. The use of techniques from primary qualitative research may also be helpful in synthesizing diverse document types. Copyright 2010 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  19. From framework to action: the DESIRE approach to combat desertification.

    PubMed

    Hessel, R; Reed, M S; Geeson, N; Ritsema, C J; van Lynden, G; Karavitis, C A; Schwilch, G; Jetten, V; Burger, P; van der Werff Ten Bosch, M J; Verzandvoort, S; van den Elsen, E; Witsenburg, K

    2014-11-01

    It has become increasingly clear that desertification can only be tackled through a multi-disciplinary approach that not only involves scientists but also stakeholders. In the DESIRE project such an approach was taken. As a first step, a conceptual framework was developed in which the factors and processes that may lead to land degradation and desertification were described. Many of these factors do not work independently, but can reinforce or weaken one another, and to illustrate these relationships sustainable management and policy feedback loops were included. This conceptual framework can be applied globally, but can also be made site-specific to take into account that each study site has a unique combination of bio-physical, socio-economic and political conditions. Once the conceptual framework was defined, a methodological framework was developed in which the methodological steps taken in the DESIRE approach were listed and their logic and sequence were explained. The last step was to develop a concrete working plan to put the project into action, involving stakeholders throughout the process. This series of steps, in full or in part, offers explicit guidance for other organizations or projects that aim to reduce land degradation and desertification.

  20. Rule-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework for defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. Our logic, EAGLE, is implemented as a Java library and involves novel techniques for rule definition, manipulation and execution. Monitoring is done on a state-by-state basis, without storing the execution trace.
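
    The state-by-state, store-nothing flavor of such monitoring can be sketched in a few lines of Python; the two monitors below cover only "Globally p" and "Eventually p" over an invented property and trace, a tiny fragment of what EAGLE's rule language expresses.

        # State-by-state monitoring in the spirit of EAGLE: each monitor keeps
        # only a verdict, never the execution trace.

        class Globally:
            def __init__(self, p): self.p, self.ok = p, True
            def update(self, state): self.ok = self.ok and self.p(state)

        class Eventually:
            def __init__(self, p): self.p, self.ok = p, False
            def update(self, state): self.ok = self.ok or self.p(state)

        safe = Globally(lambda s: s["temp"] < 100)          # invented property
        done = Eventually(lambda s: s["phase"] == "shutdown")

        for state in [{"temp": 40, "phase": "run"},
                      {"temp": 95, "phase": "shutdown"}]:
            safe.update(state)
            done.update(state)
        print(safe.ok, done.ok)   # True True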

  1. Logics of Business Education for Sustainability

    ERIC Educational Resources Information Center

    Andersson, Pernilla; Öhman, Johan

    2016-01-01

    This paper explores various kinds of logics of "business education for sustainability" and how these "logics" position the subject business person, based on eight teachers' reasoning of their own practices. The concept of logics developed within a discourse theoretical framework is employed to analyse the teachers' reasoning.…

  2. Questioning and Experimentation

    ERIC Educational Resources Information Center

    Mutanen, Arto

    2014-01-01

    The paper is a philosophical analysis of experimentation. The philosophical framework of the analysis is the interrogative model of inquiry developed by Hintikka. The basis of the model is explicit and well-formed logic of questions and answers. The framework allows us to formulate a flexible logic of experimentation. In particular, the formulated…

  3. A Logical Framework for Service Migration Based Survivability

    DTIC Science & Technology

    2016-06-24

    platforms; Service Migration Strategy Fuzzy Inference System Knowledge Base Fuzzy rules representing domain expert knowledge about implications of...service migration strategy. Our approach uses expert knowledge as linguistic reasoning rules and takes service programs damage assessment, service...programs complexity, and available network capability as input. The fuzzy inference system includes four components as shown in Figure 5: (1) a knowledge

  4. A model for evaluating academic research centers: Case study of the Asian/Pacific Islander Youth Violence Prevention Center.

    PubMed

    Nishimura, Stephanie T; Hishinuma, Earl S; Goebert, Deborah A; Onoye, Jane M M; Sugimoto-Matsuda, Jeanelle J

    2018-02-01

    To provide one model for evaluating academic research centers, given their vital role in addressing public health issues. A theoretical framework is described for a comprehensive evaluation plan for research centers. This framework is applied to one specific center by describing the center's Logic Model and Evaluation Plan, including a sample of the center's activities. Formative and summative evaluation information is summarized. In addition, a summary of outcomes is provided: improved practice and policy; reduction of risk factors and increase in protective factors; reduction of interpersonal youth violence in the community; and national prototype for prevention of interpersonal youth violence. Research centers are important mechanisms to advance science and improve people's quality of life. Because of their more infrastructure-intensive and comprehensive approach, they also require substantial resources for success, and thus, also require careful accountability. It is therefore important to comprehensively evaluate these centers. As provided herein, a more systematic and structured approach utilizing logic models, an evaluation plan, and successful processes can provide research centers with a functionally useful method in their evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Counter Unmanned Aerial System Decision-Aid Logic Process (C-UAS DALP)

    DTIC Science & Technology

    decision-aid or logic process that bridges the middle elements of the kill... of use, location, general logic process, and reference mission. This is the framework for the IDEF0 functional architecture diagrams, decision-aid diagrams, logic process, and modeling and simulation....chain between detection to countermeasure response. This capstone project creates the logic for a decision process that transitions from the

  6. EAGLE can do Efficient LTL Monitoring

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We briefly present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. In this paper we show how EAGLE can do linear temporal logic (LTL) monitoring in an efficient way. We give an upper bound on the space and time complexity of this monitoring.

  7. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    PubMed

    Schulz, S; Romacker, M; Hahn, U

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics.
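
    The SEP encoding itself is compact enough to demonstrate: each concept C gets a structure node S_C subsuming an entity node E_C and a part node P_C, and "X part-of Y" is asserted by placing S_X under P_Y, after which transitive part-whole queries reduce to ordinary subsumption. A minimal sketch with an invented anatomy fragment:

        # SEP-triplet encoding: per concept C, a structure node S_C subsumes an
        # entity node E_C and a part node P_C; "X is part of Y" is encoded as
        # S_X being subsumed by P_Y, so part-whole queries become ordinary
        # taxonomic subsumption.
        PARENTS = {}          # node -> set of direct parent nodes

        def concept(c):
            PARENTS.setdefault(f"E_{c}", set()).add(f"S_{c}")
            PARENTS.setdefault(f"P_{c}", set()).add(f"S_{c}")
            PARENTS.setdefault(f"S_{c}", set())

        def part_of(x, y):
            PARENTS[f"S_{x}"].add(f"P_{y}")

        def subsumed_by(node, ancestor):
            if node == ancestor:
                return True
            return any(subsumed_by(p, ancestor) for p in PARENTS.get(node, ()))

        for c in ("finger", "hand", "arm"):
            concept(c)
        part_of("finger", "hand")
        part_of("hand", "arm")
        # Transitive part-whole query answered via subsumption alone:
        print(subsumed_by("E_finger", "P_arm"))   # True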

  8. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    PubMed Central

    Schulz, S.; Romacker, M.; Hahn, U.

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics. PMID:9929335

  9. The development of a digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey Lindsay

    Instructors in electrical and computer engineering and in computer science have developed innovative methods to teach digital logic circuits. These methods attempt to increase student learning, satisfaction, and retention. Although there are readily accessible and accepted means for measuring satisfaction and retention, there are no widely accepted means for assessing student learning. Rigorous assessment of learning is elusive because differences in topic coverage, curriculum and course goals, and exam content prevent direct comparison of two teaching methods when using tools such as final exam scores or course grades. Because of these difficulties, computing educators have issued a general call for the adoption of assessment tools to critically evaluate and compare the various teaching methods. Science, Technology, Engineering, and Mathematics (STEM) education researchers commonly measure students' conceptual learning to compare how much different pedagogies improve learning. Conceptual knowledge is often preferred because all engineering courses should teach a fundamental set of concepts even if they emphasize design or analysis to different degrees. Increasing conceptual learning is also important because students who can organize facts and ideas within a consistent conceptual framework are able to learn new information quickly and can apply what they know in new situations. If instructors can accurately assess their students' conceptual knowledge, they can target instructional interventions to remedy common problems. To properly assess conceptual learning, several researchers have developed concept inventories (CIs) for core subjects in the engineering sciences. CIs are multiple-choice assessment tools that evaluate how well a student's conceptual framework matches the accepted conceptual framework of a discipline or common faulty conceptual frameworks. We present how we created and evaluated the digital logic concept inventory (DLCI). We used a Delphi process to identify the important and difficult concepts to include in the DLCI. To discover and describe common student misconceptions, we interviewed students who had completed a digital logic course. Students vocalized their thoughts as they solved digital logic problems. We analyzed the interview data using a qualitative grounded theory approach. We have administered the DLCI at several institutions and have checked the validity, reliability, and bias of the DLCI with classical test theory procedures. These procedures consisted of follow-up interviews with students, analysis of administration results with statistical procedures, and expert feedback. We discuss these results and present the DLCI's potential as a meaningful tool for comparing student learning at different institutions.

  10. A framework to find the logic backbone of a biological network.

    PubMed

    Maheshwari, Parul; Albert, Réka

    2017-12-06

    Cellular behaviors are governed by interaction networks among biomolecules, for example gene regulatory and signal transduction networks. An often used dynamic modeling framework for these networks, Boolean modeling, can obtain their attractors (which correspond to cell types and behaviors) and their trajectories from an initial state (e.g. a resting state) to the attractors, for example in response to an external signal. The existing methods however do not elucidate the causal relationships between distant nodes in the network. In this work, we propose a simple logic framework, based on categorizing causal relationships as sufficient or necessary, as a complement to Boolean networks. We identify and explore the properties of complex subnetworks that are distillable into a single logic relationship. We also identify cyclic subnetworks that ensure the stabilization of the state of participating nodes regardless of the rest of the network. We identify the logic backbone of biomolecular networks, consisting of external signals, self-sustaining cyclic subnetworks (stable motifs), and output nodes. Furthermore, we use the logic framework to identify crucial nodes whose override can drive the system from one steady state to another. We apply these techniques to two biological networks: the epithelial-to-mesenchymal transition network corresponding to a developmental process exploited in tumor invasion, and the network of abscisic acid induced stomatal closure in plants. We find interesting subnetworks with logical implications in these networks. Using these subgraphs and motifs, we efficiently reduce both networks to succinct backbone structures. The logic representation identifies the causal relationships between distant nodes and subnetworks. This knowledge can form the basis of network control or used in the reverse engineering of networks.
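
    A toy version of one ingredient, locating candidate self-sustaining cycles, is sketched below; the network, the edges and their "sufficient"/"necessary" labels are invented, and the paper's stable-motif analysis is richer (it also treats necessary relationships and inhibition).

        # Sketch of finding candidate self-sustaining cycles: a cycle made
        # entirely of "sufficient" activating edges can keep its nodes ON
        # regardless of the rest of the network.
        import networkx as nx

        G = nx.DiGraph()
        G.add_edge("signal", "A", relation="sufficient")
        G.add_edge("A", "B", relation="sufficient")
        G.add_edge("B", "A", relation="sufficient")
        G.add_edge("B", "out", relation="necessary")

        # Keep only edges whose source alone suffices to activate the target.
        suff = nx.DiGraph((u, v) for u, v, d in G.edges(data=True)
                          if d["relation"] == "sufficient")

        print(list(nx.simple_cycles(suff)))   # e.g. [['A', 'B']]: once ON,
                                              # A and B sustain each other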

  11. ICASE Semiannual Report, 1 April 1990 - 30 September 1990

    DTIC Science & Technology

    1990-11-01

    underlies parallel simulation protocols that synchronize based on logical time (all known approaches). This framework describes a sufficient set of...conducted primarily by visiting scientists from universities and from industry, who have resident appointments for limited periods of time, and by consultants...wave equation with point sources and semireflecting impedance boundary conditions. For sources that are piecewise polynomial in time we get a finite

  12. Logic Modeling as a Tool to Prepare to Evaluate Disaster and Emergency Preparedness, Response, and Recovery in Schools

    ERIC Educational Resources Information Center

    Zantal-Wiener, Kathy; Horwood, Thomas J.

    2010-01-01

    The authors propose a comprehensive evaluation framework to prepare for evaluating school emergency management programs. This framework involves a logic model that incorporates Government Performance and Results Act (GPRA) measures as a foundation for comprehensive evaluation that complements performance monitoring used by the U.S. Department of…

  13. A Logical Framework for Distributed Data

    DTIC Science & Technology

    1990-11-01

    A Logical Framework for Distributed Data. Authors: Paul Broome and Barbara Broome. Performing organization: U.S. Army Ballistic Research Laboratory.

  14. Molecular Library Synthesis Using Complex Substrates: Expanding the Framework of Triterpenoids

    PubMed Central

    Ignatenko, Vasily A.; Han, Yong; Tochtrop, Gregory P.

    2013-01-01

    The remodelling of a natural product core framework by means of diversity-oriented synthesis (DOS) is a valuable approach to access diverse, biologically relevant chemical space and to overcome the limitations of combinatorial-type compounds. Here we provide proof of principle and a thorough conformational analysis for a general strategy whereby the inherent complexity of a starting material is used to define the regio- and stereochemical outcomes of reactions in chemical library construction. This is in contrast to the traditional DOS logic of employing reaction development and catalysis to drive library diversity. PMID:23245400

  15. A Component-Based FPGA Design Framework for Neuronal Ion Channel Dynamics Simulations

    PubMed Central

    Mak, Terrence S. T.; Rachmuth, Guy; Lam, Kai-Pui; Poon, Chi-Sang

    2008-01-01

    Neuron-machine interfaces such as dynamic clamp and brain-implantable neuroprosthetic devices require real-time simulations of neuronal ion channel dynamics. Field Programmable Gate Array (FPGA) has emerged as a high-speed digital platform ideal for such application-specific computations. We propose an efficient and flexible component-based FPGA design framework for neuronal ion channel dynamics simulations, which overcomes certain limitations of the recently proposed memory-based approach. A parallel processing strategy is used to minimize computational delay, and a hardware-efficient factoring approach for calculating exponential and division functions in neuronal ion channel models is used to conserve resource consumption. Performances of the various FPGA design approaches are compared theoretically and experimentally in corresponding implementations of the AMPA and NMDA synaptic ion channel models. Our results suggest that the component-based design framework provides a more memory economic solution as well as more efficient logic utilization for large word lengths, whereas the memory-based approach may be suitable for time-critical applications where a higher throughput rate is desired. PMID:17190033

  16. Combined quality function deployment and logical framework analysis to improve quality of emergency care in Malta.

    PubMed

    Buttigieg, Sandra Catherine; Dey, Prasanta Kumar; Cassar, Mary Rose

    2016-01-01

    The purpose of this paper is to develop an integrated patient-focused analytical framework to improve quality of care in the accident and emergency (A & E) unit of a Maltese hospital. The study adopts a case study approach. First, a thorough literature review was undertaken to study the various methods of healthcare quality management. Second, a healthcare quality management framework was developed combining quality function deployment (QFD) and the logical framework approach (LFA). Third, the proposed framework was applied to a Maltese hospital to demonstrate its effectiveness. The proposed framework has six steps, commencing with identifying patients' requirements and concluding with implementing improvement projects. All the steps were undertaken with the involvement of the concerned stakeholders in the A & E unit of the hospital. The major and related problems faced by the hospital under study were overcrowding at A & E and a shortage of beds, respectively. The combined framework ensures better A & E services and patient flow: QFD identifies and analyses the issues and challenges of A & E, and LFA helps develop project plans for healthcare quality improvement. The important outcomes of implementing the proposed quality improvement programme are fewer hospital admissions, faster patient flow, expert triage and shorter waiting times at the A & E unit. Increased emergency consultant cover and a faster first significant medical encounter were required to start addressing the problems effectively. Overall, the combined QFD and LFA method is effective in addressing quality of care in the A & E unit. Practical implications: The proposed framework can be easily integrated within any healthcare unit, as well as within entire healthcare systems, due to its flexible and user-friendly approach. It could be part of Six Sigma and other quality initiatives. Although QFD has been extensively deployed in healthcare settings to improve quality of care, very little has been researched on combining QFD and LFA in order to identify issues, prioritise them, derive improvement measures and implement improvement projects. Additionally, there is no research on QFD application in A & E. This paper bridges these gaps. Moreover, very little has been written on the Maltese healthcare system. Therefore, this study contributes a demonstration of the quality of emergency care in Malta.

  17. Harmonising Nursing Terminologies Using a Conceptual Framework.

    PubMed

    Jansen, Kay; Kim, Tae Youn; Coenen, Amy; Saba, Virginia; Hardiker, Nicholas

    2016-01-01

    The International Classification for Nursing Practice (ICNP®) and the Clinical Care Classification (CCC) System are standardised nursing terminologies that identify discrete elements of nursing practice, including nursing diagnoses, interventions, and outcomes. While CCC uses a conceptual framework or model with 21 Care Components to classify these elements, ICNP, built on a formal Web Ontology Language (OWL) description logic foundation, uses a logical hierarchical framework that is useful for computing and maintenance of ICNP. Since the logical framework of ICNP may not always align with the needs of nursing practice, an informal framework may be a more useful organisational tool to represent nursing content. The purpose of this study was to classify ICNP nursing diagnoses using the 21 Care Components of the CCC as a conceptual framework to facilitate usability and inter-operability of nursing diagnoses in electronic health records. Findings resulted in all 521 ICNP diagnoses being assigned to one of the 21 CCC Care Components. Further research is needed to validate the resulting product of this study with practitioners and develop recommendations for improvement of both terminologies.

  18. A new approach for investigating protein flexibility based on Constraint Logic Programming. The first application in the case of the estrogen receptor.

    PubMed

    Dal Palú, Alessandro; Spyrakis, Francesca; Cozzini, Pietro

    2012-03-01

    We describe the potential of a novel method, based on Constraint Logic Programming (CLP), developed for an exhaustive sampling of protein conformational space. The CLP framework proposed here has been tested and applied to the estrogen receptor, whose activity and function are strictly related to its intrinsic, and well known, dynamics. We have investigated in particular the flexibility of H12, focusing on the pathways followed by the helix when moving from one stable crystallographic conformation to the others. Millions of geometrically feasible conformations were generated and selected, and the traces connecting the different forms were determined using a shortest path algorithm. The preliminary analyses showed a marked agreement between the crystallographic agonist-like, antagonist-like and hypothetical apo forms and the corresponding conformations identified by the CLP framework. These promising results, together with the short computational time required to perform the analyses, make this constraint-based approach a valuable tool for the study of protein folding prediction. The CLP framework enables one to consider various structural and energetic scenarios without changing the core algorithm. To show the feasibility of the method, we intentionally chose a pure geometric setting, neglecting the energetic evaluation of the poses, in order to be independent of a specific force field and to allow comparison of different behaviours associated with various energy models. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  19. Development of a conceptual framework of holistic risk assessment - Landfill as a particular type of contaminated land.

    PubMed

    Butt, T E; Javadi, A A; Nunns, M A; Beal, C D

    2016-11-01

    Landfills can be regarded as a particular type of contaminated land with the potential to directly and indirectly pollute the main spheres of the environment: the lithosphere, atmosphere and hydrosphere, and eventually to adversely impact the biosphere. Environmental risk assessment of a landfill therefore has to be integrated and holistic, because a landfill is by nature a multidimensional pollutant source. Although various risk assessment approaches have been adopted for landfill waste disposal sites, there are still wide-ranging knowledge gaps and limitations that need to be addressed. One important limitation of current risk assessment approaches is the inability to fully identify, categorise and aggregate all individual risks from all combinations of hazards, pathways and targets/receptors (e.g. water, air, soil and biota) in connection with a given landfill leachate, at any stage of the landfill lifecycle. An approach is therefore required that can not only integrate all possible characteristics of varying scenarios but also establish an overall risk picture, irrespective of the lifecycle stage of the landfill (e.g. planning/pre-operation, in-operation or post-operation/closed). One way to address the wide breadth of landfill impact risks is to develop a more holistic risk assessment methodology, whose conceptual framework is presented in this paper for landfill leachate in a whole-system format. This conceptual framework not only draws together the various constituent factors and sub-factors of risk assessment in a logical sequence and categorical order, but also indicates what, why, when and how the outputs of and inputs to these factors and sub-factors can be useful. The framework is designed to identify and quantify a range of risks associated with all stages of the landfill lifecycle in a streamlined, logical, categorical and integrated format, offering a more standardised and unified whole-system approach. Copyright © 2016. Published by Elsevier B.V.

  20. Specification and Verification of Web Applications in Rewriting Logic

    NASA Astrophysics Data System (ADS)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.

  1. An Automated Design Framework for Multicellular Recombinase Logic.

    PubMed

    Guiziou, Sarah; Ulliana, Federico; Moreau, Violaine; Leclere, Michel; Bonnet, Jerome

    2018-05-18

    Tools to systematically reprogram cellular behavior are crucial to address pressing challenges in manufacturing, environment, or healthcare. Recombinases can very efficiently encode Boolean and history-dependent logic in many species, yet current designs are performed on a case-by-case basis, limiting their scalability and requiring time-consuming optimization. Here we present an automated workflow for designing recombinase logic devices executing Boolean functions. Our theoretical framework uses a reduced library of computational devices distributed into different cellular subpopulations, which are then composed in various manners to implement all desired logic functions at the multicellular level. Our design platform called CALIN (Composable Asynchronous Logic using Integrase Networks) is broadly accessible via a web server, taking truth tables as inputs and providing corresponding DNA designs and sequences as outputs (available at http://synbio.cbs.cnrs.fr/calin ). We anticipate that this automated design workflow will streamline the implementation of Boolean functions in many organisms and for various applications.
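
    The core decomposition step can be caricatured in a few lines: enumerate the truth table's minterms and dedicate one cellular subpopulation to each, with the consortium OR-ing the outputs. This sketch ignores everything that makes CALIN's designs practical (device library reduction, recombinase mapping, DNA sequence generation):

        # Toy version of the distributed-logic idea: one strain per minterm;
        # the multicellular consortium computes the OR of strain outputs.
        from itertools import product

        def subpopulations(truth_table, names):
            """truth_table maps input tuples to 0/1; one strain per minterm."""
            strains = []
            for inputs in product((0, 1), repeat=len(names)):
                if truth_table[inputs]:
                    term = " AND ".join(
                        n if bit else f"NOT {n}" for n, bit in zip(names, inputs))
                    strains.append(term)
            return strains

        xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
        for i, strain in enumerate(subpopulations(xor, ["a", "b"]), 1):
            print(f"strain {i}: output ON iff {strain}")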

  2. The New Quantum Logic

    NASA Astrophysics Data System (ADS)

    Griffiths, Robert B.

    2014-06-01

    It is shown how all the major conceptual difficulties of standard (textbook) quantum mechanics, including the two measurement problems and the (supposed) nonlocality that conflicts with special relativity, are resolved in the consistent or decoherent histories interpretation of quantum mechanics by using a modified form of quantum logic to discuss quantum properties (subspaces of the quantum Hilbert space), and treating quantum time development as a stochastic process. The histories approach in turn gives rise to some conceptual difficulties, in particular the correct choice of a framework (probabilistic sample space) or family of histories, and these are discussed. The central issue is that the principle of unicity, the idea that there is a unique single true description of the world, is incompatible with our current understanding of quantum mechanics.

  3. Certifying Domain-Specific Policies

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Pressburger, Thomas; Rosu, Grigore; Koga, Dennis (Technical Monitor)

    2001-01-01

    Proof-checking code for compliance with safety policies potentially enables a product-oriented approach to certain aspects of software certification. To date, previous research has focused on generic, low-level programming-language properties such as memory type safety. In this paper we consider proof-checking higher-level domain-specific properties for compliance with safety policies. The paper first describes a framework, related to abstract interpretation, in which compliance with a class of certification policies can be efficiently calculated. Membership equational logic is shown to provide a rich logic, including partiality, for carrying out such certification calculations. The architecture for a domain-specific certifier is described, followed by an implemented case study. The case study considers consistency of abstract variable attributes in code that performs geometric calculations in aerospace systems.
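
    The flavor of the case study's attribute consistency policy can be sketched as follows, assuming an invented coordinate-frame attribute on vector variables; the paper checks such policies statically by proof, whereas this toy version flags violations at run time.

        # Toy attribute checker: every vector carries a coordinate-frame
        # attribute, and combining vectors from different frames is flagged.
        # The frames and the offending program are invented examples.
        class FrameError(Exception):
            pass

        class Vec:
            def __init__(self, xyz, frame):
                self.xyz, self.frame = xyz, frame

            def __add__(self, other):
                if self.frame != other.frame:            # policy violation
                    raise FrameError(f"mixing {self.frame} with {other.frame}")
                return Vec([a + b for a, b in zip(self.xyz, other.xyz)], self.frame)

        pos = Vec([1.0, 0.0, 0.0], frame="earth_fixed")
        vel = Vec([0.0, 1.0, 0.0], frame="body")
        try:
            pos + vel
        except FrameError as e:
            print("certification failure:", e)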

  4. Fuzzy Traffic Control with Vehicle-to-Everything Communication.

    PubMed

    Salman, Muntaser A; Ozdemir, Suat; Celebi, Fatih V

    2018-01-27

    Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles in the traffic (the penetration rate, PR) is still low, so V2X-based TSC systems need to be supported by other mechanisms. PR is the major factor affecting the quality of the TSC process, along with the evaluation interval. The quality of TSC in each direction is a function of the overall TSC quality of an intersection; hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Programme FP7 supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been carried out and two new methodologies using simple and fuzzy logic have been proposed. To evaluate the performance of our proposed methods, a comparison with COLOMBO's approach has been realized. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. The performance of the proposed approaches is good enough for them to be suggested for future work under realistic scenarios, even at low PR.
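
    A minimal sketch of what a fuzzy TSC rule base looks like, with invented membership functions, rules and constants (the paper's controllers and COLOMBO's swarm-based evaluation are considerably more elaborate):

        # Minimal fuzzy controller sketch: triangular memberships over queue
        # length and two rules that set a green-time extension via a weighted
        # average (Sugeno-style).
        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def green_extension(queue_len):
            short = tri(queue_len, -1, 0, 10)      # "queue is short"
            long_ = tri(queue_len, 5, 15, 30)      # "queue is long"
            # Rule 1: short queue -> extend 0 s; Rule 2: long queue -> extend 12 s.
            num = short * 0.0 + long_ * 12.0
            den = short + long_
            return num / den if den else 0.0

        for q in (2, 8, 20):
            print(q, "cars ->", round(green_extension(q), 1), "s extension")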

  5. Fuzzy Traffic Control with Vehicle-to-Everything Communication

    PubMed Central

    Ozdemir, Suat; Celebi, Fatih V.

    2018-01-01

    Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles in the traffic (the penetration rate, PR) is still low, so V2X-based TSC systems need to be supported by other mechanisms. PR is the major factor affecting the quality of the TSC process, along with the evaluation interval. The quality of TSC in each direction is a function of the overall TSC quality of an intersection; hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Programme FP7 supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been carried out and two new methodologies using simple and fuzzy logic have been proposed. To evaluate the performance of our proposed methods, a comparison with COLOMBO's approach has been realized. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. The performance of the proposed approaches is good enough for them to be suggested for future work under realistic scenarios, even at low PR. PMID:29382053

  6. A novel logic-based approach for quantitative toxicology prediction.

    PubMed

    Amini, Ata; Muggleton, Stephen H; Lodhi, Huma; Sternberg, Michael J E

    2007-01-01

    There is a pressing need for accurate in silico methods to predict the toxicity of molecules that are being introduced into the environment or are being developed into new pharmaceuticals. Predictive toxicology is in the realm of structure activity relationships (SAR), and many approaches have been used to derive such SAR. Previous work has shown that inductive logic programming (ILP) is a powerful approach that circumvents several major difficulties, such as molecular superposition, faced by some other SAR methods. The ILP approach reasons with chemical substructures within a relational framework and yields chemically understandable rules. Here, we report a general new approach, support vector inductive logic programming (SVILP), which extends the essentially qualitative ILP-based SAR to quantitative modeling. First, ILP is used to learn rules, the predictions of which are then used within a novel kernel to derive a support-vector generalization model. For a highly heterogeneous dataset of 576 molecules with known fathead minnow fish toxicity, the cross-validated correlation coefficients (R²CV) from a chemical descriptor method (CHEM) and SVILP are 0.52 and 0.66, respectively. The ILP, CHEM, and SVILP approaches correctly predict 55, 58, and 73%, respectively, of toxic molecules. In a set of 165 unseen molecules, the R² values from the commercial software TOPKAT and SVILP are 0.26 and 0.57, respectively. In all calculations, SVILP showed significant improvements in comparison with the other methods. The SVILP approach has a major advantage in that it uses ILP automatically and consistently to derive rules, mostly novel, describing fragments that are toxicity alerts. The SVILP is a general machine-learning approach and has the potential of tackling many problems relevant to chemoinformatics including in silico drug design.
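
    The SVILP construction can be approximated with a rule-firing representation and a precomputed kernel; the rules, molecules and toxicity values below are invented placeholders, and the paper derives its rules automatically with ILP rather than assuming them.

        # Sketch of the SVILP idea: molecules are represented by which ILP
        # rules fire on them, the kernel counts shared fired rules, and a
        # support vector machine regresses toxicity.
        import numpy as np
        from sklearn.svm import SVR

        # rule-firing matrix: rows = molecules, cols = ILP rules (1 = fires)
        X = np.array([[1, 0, 1, 0],
                      [1, 1, 1, 0],
                      [0, 0, 1, 1],
                      [0, 1, 0, 1]])
        y = np.array([0.9, 1.4, 0.3, 0.2])     # toxicity endpoints

        K = X @ X.T                             # kernel = number of shared rules
        model = SVR(kernel="precomputed").fit(K, y)

        x_new = np.array([[1, 1, 0, 0]])
        print(model.predict(x_new @ X.T))       # kernel row against training set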

  7. Eutrophication of lakes and reservoirs: A framework for making management decisions

    USGS Publications Warehouse

    Rast, W.; Holland, M.

    1988-01-01

    The development of management strategies for the protection of environmental quality usually involves consideration of both technical and nontechnical issues. A logical, step-by-step framework for the development of such strategies is provided. Its application to the control of cultural eutrophication of lakes and reservoirs illustrates its potential usefulness. From the perspective of the policymaker, the main consideration is that the eutrophication-related water quality of a lake or reservoir can be managed for given water uses. The approach presented here allows the rational assessment of relevant water-quality parameters and establishment of water-quality goals, consideration of social and other nontechnical issues, the possibility of public involvement in the decision-making process, and a reasonable economic analysis within a management framework.

  8. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    NASA Technical Reports Server (NTRS)

    Gabrielian, Armen

    1991-01-01

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to model formally a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending of what is true in a system.

  9. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-modeling approach for socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement and modeling data in the framework of the multi-model approach is described. The methodology and models of risk assessment in the framework of a decision support approach are defined and described. A method of water quality assessment using satellite observation data, based on analysis of the spectral reflectance of aquifers, is described. Spectral signatures of freshwater bodies and offshore waters are analyzed. Correlations between spectral reflectance, pollution and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized and verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach for decision support in the field of water quality degradation risk is discussed. The decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized in the framework of the proposed approach. It is shown that this algorithm allows estimation of water quality degradation rates and pollution risks. The problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.

  10. Misconceived Relationships between Logical Positivism and Quantitative Research: An Analysis in the Framework of Ian Hacking.

    ERIC Educational Resources Information Center

    Yu, Chong Ho

    Although quantitative research methodology is widely applied by psychological researchers, there is a common misconception that quantitative research is based on logical positivism. This paper examines the relationship between quantitative research and eight major notions of logical positivism: (1) verification; (2) pro-observation; (3)…

  11. Theory! The Missing Link in Understanding the Performance of Neonate/Infant Home-Visiting Programs to Prevent Child Maltreatment: A Systematic Review

    PubMed Central

    Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim

    2012-01-01

    Context Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. Methods We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Findings Having a stated objective of reducing child maltreatment—a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change—considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Conclusions Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. PMID:22428693

  12. Theory! The missing link in understanding the performance of neonate/infant home-visiting programs to prevent child maltreatment: a systematic review.

    PubMed

    Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim

    2012-03-01

    Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Having a stated objective of reducing child maltreatment, a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs, and program components that can deliver against the nominated theory of change considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. © 2012 Milbank Memorial Fund.

  13. An interval logic for higher-level temporal reasoning

    NASA Technical Reports Server (NTRS)

    Schwartz, R. L.; Melliar-Smith, P. M.; Vogt, F. H.; Plaisted, D. A.

    1983-01-01

    Prior work explored temporal logics, based on classical modal logics, as a framework for specifying and reasoning about concurrent programs, distributed systems, and communications protocols, and reported on efforts using temporal reasoning primitives to express very high level abstract requirements that a program or system is to satisfy. Based on experience with those primitives, this report describes an Interval Logic that is more suitable for expressing such higher level temporal properties. The report provides a formal semantics for the Interval Logic, and several examples of its use. A description of decision procedures for the logic is also included.
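
    To make the flavour of interval-based evaluation concrete, the sketch below checks a property within an interval delimited by two events in a finite trace. The operator shown (between the first "request" and the next "done", an "ack" occurs) is an invented example for illustration, not one of the primitives defined in the report.

    ```python
    # Toy interval check over an event trace (illustrative only).
    trace = ["idle", "request", "work", "ack", "work", "done", "idle"]

    def within_interval(trace, begin, end, prop):
        """True if `prop` occurs between the first `begin` and the next `end`."""
        try:
            i = trace.index(begin)
            j = trace.index(end, i + 1)
        except ValueError:
            return False  # the interval is not delimited in this trace
        return prop in trace[i + 1 : j]

    print(within_interval(trace, "request", "done", "ack"))  # True
    ```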

  14. Collaborative Access Control For Critical Infrastructures

    NASA Astrophysics Data System (ADS)

    Baina, Amine; El Kalam, Anas Abou; Deswarte, Yves; Kaaniche, Mohamed

    A critical infrastructure (CI) can fail with various degrees of severity due to physical and logical vulnerabilities. Since many interdependencies exist between CIs, failures can have dramatic consequences on the entire infrastructure. This paper focuses on threats that affect information and communication systems that constitute the critical information infrastructure (CII). A new collaborative access control framework called PolyOrBAC is proposed to address security problems that are specific to CIIs. The framework offers each organization participating in a CII the ability to collaborate with other organizations while maintaining control of its resources and internal security policy. The approach is demonstrated on a practical scenario involving the electrical power grid.

  15. A framework for the selection and ensemble development of flood vulnerability models

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Schröter, Kai; Kreibich, Heidi; Martina, Mario

    2017-04-01

    Effective understanding and management of flood risk requires comprehensive risk assessment studies that consider not only the hazard component, but also the impacts that the phenomena may have on the built environment, economy and society. This integrated approach has gained importance over recent decades, and with it so has the scientific attention given to flood vulnerability models describing the relationships between flood intensity metrics and damage to physical assets, also known as flood loss models. Despite considerable progress in this field, many challenges persist. Flood damage mechanisms are complex and depend on multiple variables, which can have different degrees of importance depending on the application setting. In addition, data required for the development and validation of such models tend to be scarce, particularly in data-poor regions. These issues are reflected in the large number of flood vulnerability models available in the literature today, as well as in their high heterogeneity: they are built with different modelling approaches, in different geographic contexts, utilizing different explanatory variables, and with varying levels of complexity. Notwithstanding recent developments in this area, uncertainty remains high, and large disparities exist among models. For these reasons, identifying which model or models, given their properties, are appropriate for a given context is not straightforward. In the present study, we propose a framework that guides the structured selection of flood vulnerability models and enables ranking them according to their suitability for a certain application, based on expert judgement. The approach takes advantage of the current state of the art and the most up-to-date knowledge of flood vulnerability processes. Given the heterogeneity and uncertainty currently present in flood vulnerability models, we propose the use of a model ensemble. With this in mind, the proposed approach is based on a weighting scheme within a logic-tree framework that enables the generation of such ensembles in a logically consistent manner. We test and discuss the results by applying the framework to the case study of the 2002 floods along the Mulde River in Germany. Applications of individual models and model ensembles are compared and discussed.
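
    The ensemble idea can be sketched compactly: several candidate depth-damage functions are combined with expert-judgement weights into a single estimate. The three functions and the weights below are illustrative stand-ins, not models from the study or the weights its logic tree would produce.

    ```python
    # Weighted ensemble of flood vulnerability (depth-damage) models.

    def linear_model(depth_m):
        return min(1.0, 0.25 * depth_m)            # damage grows linearly with depth

    def sqrt_model(depth_m):
        return min(1.0, 0.45 * depth_m ** 0.5)     # fast initial growth, saturating

    def threshold_model(depth_m):
        return 0.0 if depth_m < 0.5 else min(1.0, 0.2 + 0.3 * depth_m)

    # (model, expert-judgement weight); weights sum to 1 within the logic tree.
    ensemble = [(linear_model, 0.5), (sqrt_model, 0.3), (threshold_model, 0.2)]

    def ensemble_damage(depth_m):
        return sum(w * model(depth_m) for model, w in ensemble)

    for depth in (0.25, 1.0, 2.0):
        print(f"depth {depth:.2f} m -> damage fraction {ensemble_damage(depth):.2f}")
    ```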

  16. Obfuscation Framework Based on Functionally Equivalent Combinatorial Logic Families

    DTIC Science & Technology

    2008-03-01

    ... United States policy strongly encourages the sale and transfer of some military equipment to foreign governments and makes it easier for ... (AFIT/GCS/ENG/08-12).

  17. A Concurrent Logical Framework: The Propositional Fragment

    DTIC Science & Technology

    2003-01-01

    Under the Curry-Howard isomorphism, M can also be read as a proof term, and A as a proposition of intuitionistic linear logic in its formulation as DILL ... the obligation to ensure that the underlying logic (via the Curry-Howard isomorphism, if you like) is sensible. In particular, the principles of ...

  18. Retro-causation, Minimum Contradictions and Non-locality

    NASA Astrophysics Data System (ADS)

    Kafatos, Menas; Nassikas, Athanassios A.

    2011-11-01

    Retro-causation has been experimentally verified by Bem and proposed by Kafatos in the form of space-time non-locality in the quantum framework. Every theory includes, beyond its specific axioms, the principles of logical communication (logical language) through which it is defined. This communication obeys Aristotelian logic (Classical Logic), the Leibniz Sufficient Reason Principle, and a hidden axiom, which basically states that there is an anterior-posterior relationship everywhere in communication. By means of a theorem discussed here, it can be proved that the communication mentioned implies contradictory statements, which can only be transcended through silence, i.e. the absence of any statements. Moreover, the breaking of silence is meaningful through the claim for minimum contradictions, which implies the existence of both a logical and an illogical dimension; contradictions refer to causality, implying its opposite, namely retro-causation, and to the anterior-posterior axiom, implying space-time non-locality. The purpose of this paper is to outline a framework accounting for retro-causation, from both purely theoretical and reality-based points of view.

  19. Three Sets of Case Studies Suggest Logic and Consistency Challenges with Value Frameworks.

    PubMed

    Cohen, Joshua T; Anderson, Jordan E; Neumann, Peter J

    2017-02-01

    To assess the logic and consistency of three prominent value frameworks. We reviewed the value frameworks from three organizations: the Memorial Sloan Kettering Cancer Center (DrugAbacus), the American Society of Clinical Oncology, and the Institute for Clinical and Economic Review. For each framework, we developed case studies to explore the degree to which the frameworks have face validity in the sense that they are consistent with four important principles: value should be proportional to a therapy's benefit; components of value should matter to framework users (patients and payers); attribute weights should reflect user preferences; and value estimates used to inform therapy prices should reflect per-person benefit. All three frameworks can aid decision making by elucidating factors not explicitly addressed by conventional evaluation techniques (in particular, cost-effectiveness analyses). Our case studies identified four challenges: 1) value is not always proportional to benefit; 2) value reflects factors that may not be relevant to framework users (patients or payers); 3) attribute weights do not necessarily reflect user preferences or relate to value in ways that are transparent; and 4) value does not reflect per-person benefit. Although the value frameworks we reviewed capture value in a way that is important to various audiences, they are not always logical or consistent. Because these frameworks may have a growing influence on therapy access, it is imperative that analytic challenges be further explored. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  20. Medical concepts related to individual risk are better explained with "plausibility" rather than "probability".

    PubMed

    Grossi, Enzo

    2005-09-27

    The concept of risk has pervaded the medical literature in recent decades and has become a familiar topic, and the concept of probability, linked to a binary logic approach, is commonly applied in epidemiology and clinical medicine. The application of probability theory to groups of individuals is quite straightforward but can pose communication challenges at the individual level. Few articles, however, have tried to address the concept of "risk" at the level of the individual subject rather than at the population level. The author reviews the conceptual framework that led to the use of probability theory in the medical field at a time when the principal causes of death were acute diseases, often of infective origin. In the present scenario, in which chronic degenerative diseases dominate and there are smooth transitions between health and disease, the use of fuzzy logic rather than binary logic would be more appropriate. Fuzzy logic, in which more than two possible truth-value assignments are allowed, overcomes the trap of probability theory when dealing with uncertain outcomes, thereby making the meaning of a prognostic statement easier for the patient to understand. At the individual subject level, recourse to the term plausibility, related to fuzzy logic, would help the physician communicate with the patient more effectively than the term probability, related to binary logic. This would represent an evident advantage for the transfer of medical evidence to individual subjects.
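
    The contrast the author draws can be made concrete with a toy membership function: binary logic forces a sharp healthy/diseased cut-off, while fuzzy logic assigns a graded degree of membership ("plausibility") to the diseased state. The fasting-glucose numbers below are illustrative only, not clinical guidance.

    ```python
    # Binary versus fuzzy assignment of the "diseased" state.

    def binary_diseased(glucose_mg_dl, cutoff=126.0):
        return 1.0 if glucose_mg_dl >= cutoff else 0.0

    def fuzzy_diseased(glucose_mg_dl, low=100.0, high=126.0):
        # Membership rises smoothly from 0 at `low` to 1 at `high`.
        if glucose_mg_dl <= low:
            return 0.0
        if glucose_mg_dl >= high:
            return 1.0
        return (glucose_mg_dl - low) / (high - low)

    for g in (95.0, 110.0, 125.0, 130.0):
        print(f"glucose {g}: binary={binary_diseased(g):.0f}, fuzzy={fuzzy_diseased(g):.2f}")
    ```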

  1. A Fuzzy Logic Approach to Marine Spatial Management

    NASA Astrophysics Data System (ADS)

    Teh, Lydia C. L.; Teh, Louise S. L.

    2011-04-01

    Marine spatial planning tends to prioritise biological conservation targets over socio-economic considerations, which may incur lower user compliance and ultimately compromise management success. We argue for more inclusion of human dimensions in spatial management, so that outcomes not only fulfill biodiversity and conservation objectives, but are also acceptable to resource users. We propose a fuzzy logic framework that will facilitate this task: the protected area suitability index (PASI) combines fishers' spatial preferences with biological criteria to assess site suitability for protection from fishing. We apply the PASI in a spatial evaluation of a small-scale reef fishery in Sabah, Malaysia. While our results pertain to fishers specifically, the PASI can also be customized to include the interests of other stakeholders and resource users, as well as incorporate varying levels of protection.
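
    A PASI-style score can be sketched as a fuzzy conjunction: a site suits protection to the degree that it is biologically valuable and is not a preferred fishing ground. The rule and the site data below are invented for illustration; the published index combines richer criteria and membership functions.

    ```python
    # Toy fuzzy suitability-for-protection score (minimum as fuzzy AND).

    def suitability(bio_value, fisher_preference):
        # Both arguments are fuzzy degrees in [0, 1].
        return min(bio_value, 1.0 - fisher_preference)

    sites = {"reef A": (0.9, 0.2), "reef B": (0.8, 0.8), "reef C": (0.3, 0.1)}
    for name, (bio, pref) in sites.items():
        print(f"{name}: suitability = {suitability(bio, pref):.2f}")
    ```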

  2. Multidimensional Simulation Applied to Water Resources Management

    NASA Astrophysics Data System (ADS)

    Camara, A. S.; Ferreira, F. C.; Loucks, D. P.; Seixas, M. J.

    1990-09-01

    A framework for an integrated decision aiding simulation (IDEAS) methodology using numerical, linguistic, and pictorial entities and operations is introduced. IDEAS relies upon traditional numerical formulations, logical rules to handle linguistic entities with linguistic values, and a set of pictorial operations. Pictorial entities are defined by their shape, size, color, and position. Pictorial operators include reproduction (copy of a pictorial entity), mutation (expansion, rotation, translation, change in color), fertile encounters (intersection, reunion), and sterile encounters (absorption). Interaction between numerical, linguistic, and pictorial entities is handled through logical rules or a simplified vector calculus operation. This approach is shown to be applicable to various environmental and water resources management analyses using a model to assess the impacts of an oil spill. Future developments, including IDEAS implementation on parallel processing machines, are also discussed.

  3. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram

    PubMed Central

    2015-01-01

    Objectives Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation frameworks for them are inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation that will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. Methods This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and by organizing foundational evaluation frameworks that are generally applicable across studies and cases of diverse telemedicine. Evaluation factors related to information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using the fishbone diagram. Results It was not easy to develop a monitoring and evaluation framework for telemedicine since evaluation frameworks for telemedicine are very complex, with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation, providing a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and subsequent branches for each dimension. Conclusions To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence of quality and safety, as measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the framework across a variety of contexts with more factors and participant group dimensions. PMID:26618028

  4. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram.

    PubMed

    Chang, Hyejung

    2015-10-01

    Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation frameworks for them are inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation that will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and by organizing foundational evaluation frameworks that are generally applicable across studies and cases of diverse telemedicine. Evaluation factors related to information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using the fishbone diagram. It was not easy to develop a monitoring and evaluation framework for telemedicine since evaluation frameworks for telemedicine are very complex, with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation, providing a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and subsequent branches for each dimension. To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence of quality and safety, as measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the framework across a variety of contexts with more factors and participant group dimensions.

  5. Automating Access Control Logics in Simple Type Theory with LEO-II

    NASA Astrophysics Data System (ADS)

    Benzmüller, Christoph

    Garg and Abadi recently proved that prominent access control logics can be translated in a sound and complete way into modal logic S4. We have previously outlined how normal multimodal logics, including monomodal logics K and S4, can be embedded in simple type theory, and we have demonstrated that the higher-order theorem prover LEO-II can automate reasoning in and about them. In this paper we combine these results and describe a sound (and complete) embedding of different access control logics in simple type theory. Employing this framework, we show that the off-the-shelf theorem prover LEO-II can be applied to automate reasoning in and about prominent access control logics.
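
    The semantic idea behind such embeddings can be sketched with a toy Kripke evaluator: a modal formula is a predicate on worlds, and "principal says phi" quantifies over the worlds that the principal considers accessible. The relation and the atomic proposition below are invented; this illustrates the semantics only, not the LEO-II encoding itself.

    ```python
    # Toy Kripke-style evaluation of an access control "says" modality.
    worlds = {0, 1, 2}
    access = {"admin": {0: {0, 1}, 1: {1}, 2: {2}}}  # illustrative accessibility

    def says(principal, phi):
        # "principal says phi" holds at w iff phi holds at every world the
        # principal considers accessible from w.
        return lambda w: all(phi(v) for v in access[principal].get(w, set()))

    may_read = lambda w: w in {1, 2}  # illustrative atomic proposition
    formula = says("admin", may_read)
    print([formula(w) for w in sorted(worlds)])  # [False, True, True]
    ```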

  6. A novel performance monitoring framework for health research systems: experiences of the National Institute for Health Research in England

    PubMed Central

    2011-01-01

    Background The National Institute for Health Research (NIHR) was established in 2006 with the aim of creating an applied health research system embedded within the English National Health Service (NHS). NIHR sought to implement an approach for monitoring its performance that effectively linked early indicators of performance with longer-term research impacts. We attempted to develop and apply a conceptual framework for defining appropriate key performance indicators for NIHR. Method Following a review of relevant literature, a conceptual framework for defining performance indicators for NIHR was developed, based on a hybridisation of the logic model and balanced scorecard approaches. This framework was validated through interviews with key NIHR stakeholders and a pilot in one division of NIHR, before being refined and applied more widely. Indicators were then selected and aggregated to create a basket of indicators aligned to NIHR's strategic goals, which could be reported to NIHR's leadership team on a quarterly basis via an oversight dashboard. Results Senior health research system managers and practitioners endorsed the conceptual framework developed and reported satisfaction with the breadth and balance of indicators selected for reporting. Conclusions The use of the hybrid conceptual framework provides a pragmatic approach to defining performance indicators that are aligned to the strategic aims of a health research system. The particular strength of this framework is its capacity to provide an empirical link, over time, between upstream activities of a health research system and its long-term strategic objectives. PMID:21435265

  7. Applications of Logic Coverage Criteria and Logic Mutation to Software Testing

    ERIC Educational Resources Information Center

    Kaminski, Garrett K.

    2011-01-01

    Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…

  8. About, for, in or through entrepreneurship in engineering education

    NASA Astrophysics Data System (ADS)

    Mäkimurto-Koivumaa, Soili; Belt, Pekka

    2016-09-01

    Engineering competences form a potential basis for entrepreneurship. There are pressures to find new approaches to entrepreneurship education (EE) in engineering education, as the traditional analytical logic of engineering does not match the modern view of entrepreneurship. Since the previous models do not give tangible enough tools on how to organise EE in practice, this article aims to develop a new framework for EE at the university level. We approach this aim by analysing existing scientific literature complemented by long-term practical observations, enabling a fruitful interplay between theory and practice. The developed framework recommends aspects in EE to be emphasised during each year of the study process. Action-based learning methods are highlighted in the beginning of studies to support students' personal growth. Explicit business knowledge is to be gradually increased only when professional, field-specific knowledge has been adequately accumulated.

  9. Verifying the Modal Logic Cube Is an Easy Task (For Higher-Order Automated Reasoners)

    NASA Astrophysics Data System (ADS)

    Benzmüller, Christoph

    Prominent logics, including quantified multimodal logics, can be elegantly embedded in simple type theory (classical higher-order logic). Furthermore, off-the-shelf reasoning systems for simple type theory exist that can be uniformly employed for reasoning within and about embedded logics. In this paper we focus on reasoning about modal logics and exploit our framework for the automated verification of inclusion and equivalence relations between them. Related work has applied first-order automated theorem provers for the task. Our solution achieves significant improvements, most notably with respect to the elegance and simplicity of the problem encodings, as well as with respect to automation performance.

  10. Identifying and Mitigating Risks in Security Sector Assistance for Africa’s Fragile States

    DTIC Science & Technology

    2015-01-01

    References The Logframe Handbook: A Logical Framework Approach to Project Cycle Management, Washington, D.C., 2005.

  11. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors when determining the highest-priority locations within the river corridor for restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but a scientific strategy that management should embrace and apply in its decision framework.

  12. Assessment of Evidence-based Management Training Program: Application of a Logic Model.

    PubMed

    Guo, Ruiling; Farnsworth, Tracy J; Hermanson, Patrick M

    2016-06-01

    The purposes of this study were to apply a logic model to plan and implement an evidence-based management (EBMgt) educational training program for healthcare administrators and to examine whether a logic model is a useful tool for evaluating the outcomes of the educational program. The logic model was used as a conceptual framework to guide the investigators in developing an EBMgt educational training program and evaluating the outcomes of the program. The major components of the logic model were constructed as inputs, outputs, and outcomes/impacts. The investigators delineated the logic model based on the results of the needs assessment survey. Two 3-hour training workshops were delivered to 30 participants. To assess the outcomes of the EBMgt educational program, pre- and post-tests and self-reflection surveys were conducted. The data were collected and analyzed descriptively and inferentially, using the IBM Statistical Package for the Social Sciences (SPSS) 22.0. A paired sample t-test was performed to compare the differences in participants' EBMgt knowledge and skills prior to and after the training. The assessment results showed that there was a statistically significant difference in participants' EBMgt knowledge and information searching skills before and after the training (p< 0.001). Participants' confidence in using the EBMgt approach for decision-making was significantly increased after the training workshops (p< 0.001). Eighty-three percent of participants indicated that the knowledge and skills they gained through the training program could be used for future management decision-making in their healthcare organizations. The overall evaluation results of the program were positive. It is suggested that the logic model is a useful tool for program planning, implementation, and evaluation, and it also improves the outcomes of the educational program.

  13. Cell-to-Cell Communication Circuits: Quantitative Analysis of Synthetic Logic Gates

    PubMed Central

    Hoffman-Sommer, Marta; Supady, Adriana; Klipp, Edda

    2012-01-01

    One of the goals in the field of synthetic biology is the construction of cellular computation devices that could function in a manner similar to electronic circuits. To this end, attempts are made to create biological systems that function as logic gates. In this work we present a theoretical quantitative analysis of a synthetic cellular logic-gates system, which has been implemented in cells of the yeast Saccharomyces cerevisiae (Regot et al., 2011). It exploits endogenous MAP kinase signaling pathways. The novelty of the system lies in the compartmentalization of the circuit where all basic logic gates are implemented in independent single cells that can then be cultured together to perform complex logic functions. We have constructed kinetic models of the multicellular IDENTITY, NOT, OR, and IMPLIES logic gates, using both deterministic and stochastic frameworks. All necessary model parameters are taken from literature or estimated based on published kinetic data, in such a way that the resulting models correctly capture important dynamic features of the included mitogen-activated protein kinase pathways. We analyze the models in terms of parameter sensitivity and we discuss possible ways of optimizing the system, e.g., by tuning the culture density. We apply a stochastic modeling approach, which simulates the behavior of whole populations of cells and allows us to investigate the noise generated in the system; we find that the gene expression units are the major sources of noise. Finally, the model is used for the design of system modifications: we show how the current system could be transformed to operate on three discrete values. PMID:22934039
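
    A deterministic kinetic model of this kind can be sketched with Hill-function activation and simple Euler integration: an output protein is produced when either of two input signals is high, approximating an OR gate. The parameter values and the max-based OR are illustrative choices, not taken from the cited yeast implementation.

    ```python
    # Toy deterministic ODE model of a two-input genetic OR gate.

    def hill(s, k=1.0, n=2):
        return s ** n / (k ** n + s ** n)

    def simulate_or_gate(input1, input2, t_end=50.0, dt=0.01,
                         v_max=1.0, degradation=0.1):
        protein = 0.0
        for _ in range(int(t_end / dt)):
            production = v_max * max(hill(input1), hill(input2))  # OR via max
            protein += dt * (production - degradation * protein)  # Euler step
        return protein

    for a, b in [(0.0, 0.0), (0.0, 3.0), (3.0, 0.0), (3.0, 3.0)]:
        print(f"inputs ({a}, {b}) -> output ~ {simulate_or_gate(a, b):.2f}")
    ```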

  14. Learning with touchscreen devices: game strategies to improve geometric thinking

    NASA Astrophysics Data System (ADS)

    Soldano, Carlotta; Arzarello, Ferdinando

    2016-03-01

    The aim of this paper is to reflect on the importance of the students' game-strategic thinking during the development of mathematical activities. In particular, we hypothesise that this type of thinking helps students in the construction of logical links between concepts during the "argumentation phase" of the proving process. The theoretical background of our study lies in the works of J. Hintikka, a Finnish logician, who developed a new type of logic, based on game theory, called the logic of inquiry. In order to experiment with this new approach to the teaching and learning of mathematics, we have prepared five game-activities based on geometric theorems in which two players play against each other in a multi-touch dynamic geometric environment (DGE). In this paper, we present the design of the first game-activity and the relationship between it and the logic of inquiry. Then, adopting the theoretical framework of the instrumental genesis by Vérillon and Rabardel (EJPE 10: 77-101, 1995), we will present and analyse significant actions and dialogues developed by students while they are solving the game. We focus on the presence of a particular way of playing the game introduced by the students, the "reflected game", and highlight its functions for the development of the task.

  15. Approaching semantic interoperability in Health Level Seven

    PubMed Central

    Alschuler, Liora

    2010-01-01

    ‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995

  16. The Flow Engine Framework: A Cognitive Model of Optimal Human Experience

    PubMed Central

    Šimleša, Milija; Guegan, Jérôme; Blanchard, Edouard; Tarpin-Bernard, Franck; Buisine, Stéphanie

    2018-01-01

    Flow is a well-known concept in the fields of positive and applied psychology. Examination of a large body of flow literature suggests there is a need for a conceptual model rooted in a cognitive approach to explain how this psychological phenomenon works. In this paper, we propose the Flow Engine Framework, a theoretical model explaining dynamic interactions between rearranged flow components and fundamental cognitive processes. Using an IPO framework (Inputs – Processes – Outputs) including a feedback process, we organize flow characteristics into three logically related categories: inputs (requirements for flow), mediating and moderating cognitive processes (attentional and motivational mechanisms), and outputs (subjective and objective outcomes), describing the flow process. Comparing flow with an engine, inputs are depicted as fuel, core processes as cylinder strokes, and outputs as the power created to provide motion. PMID:29899807

  17. Fuzzy Hybrid Deliberative/Reactive Paradigm (FHDRP)

    NASA Technical Reports Server (NTRS)

    Sarmadi, Hengameh

    2004-01-01

    This work introduces a new concept for incorporating fuzzy sets into the hybrid deliberative/reactive paradigm. After a brief review of basic issues of the hybrid paradigm, the definition of an agent-based fuzzy hybrid paradigm, which enables agents to derive their behavior from quantitative numerical and qualitative knowledge and to carry out their decision-making procedure via a fuzzy rule bank, is discussed. Next, an example provides a more applied platform for the developed approach, and finally an overview of the corresponding agent architecture rounds out the agents' logical framework.

  18. Simulation Approach for Timing Analysis of Genetic Logic Circuits.

    PubMed

    Baig, Hasan; Madsen, Jan

    2017-07-21

    Constructing genetic logic circuits is an application of synthetic biology in which parts of the DNA of a living cell are engineered to perform a dedicated Boolean function triggered by an appropriate concentration of certain proteins or by different genetic components. These logic circuits work in a manner similar to electronic logic circuits, but they are much more stochastic and hence much harder to characterize. In this article, we introduce an approach to analyze the threshold value and timing of genetic logic circuits. We show how this approach can be used to analyze the timing behavior of single and cascaded genetic logic circuits. We further analyze the timing sensitivity of circuits by varying the degradation rates and concentrations. Our approach can be used not only to characterize the timing behavior but also to analyze the timing constraints of cascaded genetic logic circuits, a capability that we believe will be important for design automation in synthetic biology.
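
    The timing analysis can be sketched as a threshold-crossing measurement: after an input switches on, integrate the output protein's kinetics and report when it first exceeds the logic threshold. The rates and threshold below are illustrative, not parameters from the article.

    ```python
    # Time for a gate output to cross its logic threshold (Euler integration).

    def time_to_threshold(production=2.0, degradation=0.05,
                          threshold=20.0, dt=0.01, t_max=200.0):
        protein, t = 0.0, 0.0
        while t < t_max:
            protein += dt * (production - degradation * protein)
            t += dt
            if protein >= threshold:
                return t
        return None  # threshold not reached within t_max

    # Analytically this is ln(2)/degradation here, since the threshold is half
    # of the steady state production/degradation = 40.
    print(f"threshold crossed after ~ {time_to_threshold():.1f} time units")
    ```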

  19. Mathematics and morphogenesis of cities: A geometrical approach

    NASA Astrophysics Data System (ADS)

    Courtat, Thomas; Gloaguen, Catherine; Douady, Stephane

    2011-03-01

    Cities are living organisms. They are out of equilibrium, open systems that never stop developing and sometimes die. The local geography can be compared to a shell constraining its development. In brief, a city’s current layout is a step in a running morphogenesis process. Thus cities display a huge diversity of shapes and none of the traditional models, from random graphs, complex networks theory, or stochastic geometry, takes into account the geometrical, functional, and dynamical aspects of a city in the same framework. We present here a global mathematical model dedicated to cities that permits describing, manipulating, and explaining cities’ overall shape and layout of their street systems. This street-based framework conciliates the topological and geometrical sides of the problem. From the static analysis of several French towns (topology of first and second order, anisotropy, streets scaling) we make the hypothesis that the development of a city follows a logic of division or extension of space. We propose a dynamical model that mimics this logic and that, from simple general rules and a few parameters, succeeds in generating a large diversity of cities and in reproducing the general features the static analysis has pointed out.

  20. COMPOSE: Using temporal patterns for interpreting wearable sensor data with computer interpretable guidelines.

    PubMed

    Urovi, V; Jimenez-Del-Toro, O; Dubosson, F; Ruiz Torres, A; Schumacher, M I

    2017-02-01

    This paper describes a novel temporal logic-based framework for reasoning with continuous data collected from wearable sensors. The work is motivated by the Metabolic Syndrome, a cluster of conditions linked to obesity and unhealthy lifestyle. We assume that, by interpreting the physiological parameters of continuous monitoring, we can identify which patients have a higher risk of Metabolic Syndrome. We define temporal patterns for reasoning with continuous data and specify the coordination mechanisms for combining different sets of clinical guidelines that relate to this condition. The proposed solution is tested with data provided by twenty subjects who used sensors for four days of continuous monitoring. The results are compared to a gold standard. The novelty of the framework lies in extending a temporal logic formalism, namely the Event Calculus, with temporal patterns. These patterns are helpful for specifying the rules for reasoning with continuous data and for combining new knowledge into one consistent outcome that is tailored to the patient's profile. The overall approach opens new possibilities for delivering patient-tailored interventions and educational material before patients present the symptoms of the disease. Copyright © 2016 Elsevier Ltd. All rights reserved.
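
    One of the temporal patterns such a framework expresses can be sketched in plain Python: derive a fluent (e.g., "heart rate elevated") from a sensor stream and fire when it holds continuously for a minimum duration. The pattern, threshold and readings are hypothetical, and this sketch stands in for, rather than implements, the paper's Event Calculus formalism.

    ```python
    # Detect maximal episodes where a predicate holds for >= min_minutes.
    readings = [(0, 72), (5, 95), (10, 101), (15, 104), (20, 99), (25, 76)]  # (minute, bpm)

    def holds_for(readings, predicate, min_minutes):
        intervals, start, last = [], None, None
        for t, value in readings:
            if predicate(value):
                start = t if start is None else start
                last = t
            else:
                if start is not None and last - start >= min_minutes:
                    intervals.append((start, last))
                start, last = None, None
        if start is not None and last - start >= min_minutes:
            intervals.append((start, last))
        return intervals

    print(holds_for(readings, lambda bpm: bpm > 90, min_minutes=10))  # [(5, 20)]
    ```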

  1. The engineering of cybernetic systems

    NASA Astrophysics Data System (ADS)

    Fry, Robert L.

    2002-05-01

    This tutorial develops a logical basis for the engineering of systems that operate cybernetically. The term cybernetic system has a clear quantitative definition. It is a system that dynamically matches acquired information to selected actions relative to a computational issue that defines the essential purpose of the system or machine. This notion requires that information and control be further quantified. The logic of questions and assertions as developed by Cox provides one means of doing this. The design and operation of cybernetic systems can be understood by contrasting these kinds of systems with communication systems and information theory as developed by Shannon. The joint logic of questions and assertions can be seen to underlie and be common to both information theory as applied to the design of discrete communication systems and to a theory of discrete general systems. The joint logic captures a natural complementarity between systems that transmit and receive information and those that acquire and act on it. Specific comparisons and contrasts are made between the source rate and channel capacity of a communication system and the acquisition rate and control capacity of a general system. An overview is provided of the joint logic of questions and assertions and the ties that this logic has to both conventional information theory and to a general theory of systems. I-diagrams, the interrogative complement of Venn diagrams, are described as providing valuable reasoning tools. An initial framework is suggested for the design of cybernetic systems. Two examples are given to illustrate this framework as applied to discrete cybernetic systems. These examples include a predator-prey problem as illustrated through "The Dog Chrysippus Pursuing its Prey," and the derivation of a single-neuron system that operates cybernetically and is biologically plausible. Future areas of research are highlighted which require development for a mature engineering framework.

  2. Using RUFDATA to guide a logic model for a quality assurance process in an undergraduate university program.

    PubMed

    Sherman, Paul David

    2016-04-01

    This article presents a framework to identify key mechanisms for developing a logic model blueprint that can be used for an impending comprehensive evaluation of an undergraduate degree program in a Canadian university. The evaluation is a requirement of a comprehensive quality assurance process mandated by the university. A modified RUFDATA (Saunders, 2000) evaluation model is applied as an initiating framework to assist in decision making to provide a guide for conceptualizing a logic model for the quality assurance process. This article will show how an educational evaluation is strengthened by employing a RUFDATA reflective process in exploring key elements of the evaluation process, and then translating this information into a logic model format that could serve to offer a more focussed pathway for the quality assurance activities. Using preliminary program evaluation data from two key stakeholders of the undergraduate program as well as an audit of the curriculum's course syllabi, a case is made for, (1) the importance of inclusivity of key stakeholders participation in the design of the evaluation process to enrich the authenticity and accuracy of program participants' feedback, and (2) the diversification of data collection methods to ensure that stakeholders' narrative feedback is given ample exposure. It is suggested that the modified RUFDATA/logic model framework be applied to all academic programs at the university undergoing the quality assurance process at the same time so that economies of scale may be realized. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Emulating the logic of monoterpenoid alkaloid biogenesis to access a skeletally diverse chemical library.

    PubMed

    Liu, Song; Scotti, John S; Kozmin, Sergey A

    2013-09-06

    We have developed a synthetic strategy that mimics the diversity-generating power of monoterpenoid indole alkaloid biosynthesis. Our general approach goes beyond diversification of a single natural product-like substructure and enables production of a highly diverse collection of small molecules. The reaction sequence begins with rapid and highly modular assembly of the tetracyclic indoloquinolizidine core, which can be chemoselectively processed into several additional skeletally diverse structural frameworks. The general utility of this approach was demonstrated by parallel synthesis of two representative chemical libraries containing 847 compounds with favorable physicochemical properties to enable its subsequent broad pharmacological evaluation.

  4. Fuzzy and process modelling of contour ridge water dynamics

    NASA Astrophysics Data System (ADS)

    Mhizha, Alexander; Ndiritu, John

    2018-05-01

    Contour ridges are an in-situ rainwater harvesting technology developed initially for soil erosion control but now also widely promoted for rainwater harvesting. The effectiveness of contour ridges depends on geophysical, hydro-climatic and socio-economic factors that are highly varied in time and space. Furthermore, field-scale data on these factors are often unavailable. This, together with the complexity of hydrological processes at the field scale, limits the application of classical distributed process modelling to highly instrumented experimental fields. This paper presents a framework that combines fuzzy logic and a process-based approach for modelling contour ridges for rainwater harvesting where detailed field data are not available. A water balance for a representative contour-ridged field, incorporating the water flow processes across the boundaries, is integrated with fuzzy logic to capture the uncertainties in estimating runoff. The model is tested using data collected during the 2009/2010 and 2010/2011 rainfall seasons from two contour-ridged fields in Zhulube, located in the semi-arid parts of Zimbabwe. The model is found to replicate soil moisture in the root zone reasonably well (NSE = 0.55 to 0.66 and PBIAS = -1.3 to 6.1%). The results show that combining fuzzy logic and process-based approaches can adequately model soil moisture in a contour-ridged field and could help to assess the water dynamics in contour-ridged fields.
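
    The hybrid idea can be sketched as a daily water balance in which the runoff fraction is not a fixed coefficient but a fuzzy quantity derived from rainfall intensity. All numbers below are illustrative; the published model's processes and rule base are more detailed.

    ```python
    # Daily soil-water balance with a fuzzy runoff fraction.

    def fuzzy_runoff_fraction(rain_mm):
        # Membership of "intense rainfall" rises from 0 at 10 mm to 1 at 40 mm.
        intense = min(1.0, max(0.0, (rain_mm - 10.0) / 30.0))
        return 0.05 + 0.40 * intense  # runoff fraction between 0.05 and 0.45

    def update_storage(storage_mm, rain_mm, et_mm=4.0, capacity_mm=120.0):
        runoff = fuzzy_runoff_fraction(rain_mm) * rain_mm
        infiltration = rain_mm - runoff
        storage_mm = max(0.0, min(capacity_mm, storage_mm + infiltration - et_mm))
        return storage_mm, runoff

    storage = 60.0
    for day, rain in enumerate([0.0, 25.0, 50.0, 5.0], start=1):
        storage, runoff = update_storage(storage, rain)
        print(f"day {day}: rain {rain:5.1f} mm, runoff {runoff:5.2f} mm, storage {storage:6.1f} mm")
    ```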

  5. Using a systems orientation and foundational theory to enhance theory-driven human service program evaluations.

    PubMed

    Wasserman, Deborah L

    2010-05-01

    This paper offers a framework for using a systems orientation and "foundational theory" to enhance theory-driven evaluations and logic models. The framework guides the process of identifying and explaining operative relationships and perspectives within human service program systems. Self-Determination Theory exemplifies how a foundational theory can be used to support the framework in a wide range of program evaluations. Two examples illustrate how applications of the framework have improved the evaluators' abilities to observe and explain program effect. In both exemplars improvements involved addressing and organizing into a single logic model heretofore seemingly disparate evaluation issues regarding valuing (by whose values); the role of organizational and program context; and evaluation anxiety and utilization. Copyright 2009 Elsevier Ltd. All rights reserved.

  6. Fuzzy branching temporal logic.

    PubMed

    Moon, Seong-ick; Lee, Kwang H; Lee, Doheon

    2004-04-01

    Intelligent systems require a systematic way to represent and handle temporal information containing uncertainty. In particular, a logical framework is needed that can represent uncertain temporal information and its relationships with logical formulae. Fuzzy linear temporal logic (FLTL), a generalization of propositional linear temporal logic (PLTL) with fuzzy temporal events and fuzzy temporal states defined on a linear time model, was previously proposed for this purpose. However, many systems are best represented by branching time models in which each state can have more than one possible future path. In this paper, fuzzy branching temporal logic (FBTL) is proposed to address this problem. FBTL adopts and generalizes computation tree logic (CTL*), which is a classical branching temporal logic. The temporal model of FBTL is capable of representing fuzzy temporal events and fuzzy temporal states, and the order relation among them is represented as a directed graph. The utility of FBTL is demonstrated using a fuzzy job shop scheduling problem as an example.
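
    The branching-time evaluation can be sketched over a small tree of states: each state carries a fuzzy truth value for an atomic property, and "on some path, eventually p" (EF p) is computed with max over successors and over time. The tree and values are illustrative; FBTL itself is far richer, with fuzzy events and a graded order relation over states.

    ```python
    # Toy fuzzy evaluation of EF p over a finite branching structure.
    tree = {"s0": ["s1", "s2"], "s1": ["s3"], "s2": [], "s3": []}
    truth = {"s0": 0.1, "s1": 0.4, "s2": 0.2, "s3": 0.9}  # fuzzy value of p

    def ef(state):
        """Degree to which p eventually holds on some branch from `state`."""
        return max(truth[state], max((ef(s) for s in tree[state]), default=0.0))

    print(f"EF p at s0 = {ef('s0'):.1f}")  # 0.9, via the branch s0 -> s1 -> s3
    ```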

  7. A New Approach to Teaching Mathematics

    DTIC Science & Technology

    1994-02-01

    We propose a new approach to teaching discrete math: first, teach logic as a powerful and versatile tool for discovering and communicating truths...using logic in other areas of study. Our experience in teaching discrete math at Cornell shows that such success is possible. Keywords: propositional logic, predicate logic, discrete mathematics.

  8. Representative Agricultural Pathways: A Trans-Disciplinary Approach to Agricultural Model Inter-comparison, Improvement, Climate Impact Assessment and Stakeholder Engagement

    NASA Astrophysics Data System (ADS)

    Antle, J. M.; Valdivia, R. O.; Claessens, L.; Nelson, G. C.; Rosenzweig, C.; Ruane, A. C.; Vervoort, J.

    2013-12-01

    The global change research community has recognized that new pathway and scenario concepts are needed to implement impact and vulnerability assessment that is logically consistent across local, regional and global scales. For impact and vulnerability assessment, new socio-economic pathway and scenario concepts are being developed. Representative Agricultural Pathways (RAPs) are designed to extend global pathways to provide the detail needed for global and regional assessment of agricultural systems. In addition, research by the Agricultural Model Inter-comparison and Improvement Project (AgMIP) shows that RAPs provide a powerful way to engage stakeholders in climate-related research throughout the research process and in communication of research results. RAPs are based on the integrated assessment framework developed by AgMIP. This framework shows that both bio-physical and socio-economic drivers are essential components of agricultural pathways and logically precede the definition of adaptation and mitigation scenarios that embody associated capabilities and challenges. This approach is based on a trans-disciplinary process for designing pathways and then translating them into parameter sets for bio-physical and economic models that are components of agricultural integrated assessments of climate impact, adaptation and mitigation. RAPs must be designed to be part of a logically consistent set of drivers and outcomes from global to regional and local. Global RAPs are designed to be consistent with higher-level global socio-economic pathways, but add key agricultural drivers such as agricultural growth trends that are not specified in more general pathways, as illustrated in a recent inter-comparison of global agricultural models. To create pathways at regional or local scales, further detail is needed. At this level, teams of scientists and other experts with knowledge of the agricultural systems and regions work together through a step-wise process. Experiences from AgMIP Regional Teams, and from the project on Regional Approaches to Climate Change in the Pacific Northwest, are used to discuss how the RAPs procedures can be further developed and improved, and how RAPs can help engage stakeholders in climate-related research throughout the research process and in communication of research results.

  9. Eco-logical successes : third edition, September 2012

    DOT National Transportation Integrated Search

    2012-09-01

    Eco-Logical: An Ecosystem Approach to Developing Infrastructure Projects outlines an ecosystem-scale approach to prioritizing, developing, and delivering infrastructure projects. Eco-Logical emphasizes interagency collaboration in order to create inf...

  10. Qualitative models and experimental investigation of chaotic NOR gates and set/reset flip-flops

    NASA Astrophysics Data System (ADS)

    Rahman, Aminur; Jordan, Ian; Blackmore, Denis

    2018-01-01

    It has been observed through experiments and SPICE simulations that logical circuits based upon Chua's circuit exhibit complex dynamical behaviour. This behaviour can be used to design analogues of more complex logic families and some properties can be exploited for electronics applications. Some of these circuits have been modelled as systems of ordinary differential equations. However, as the number of components in newer circuits increases so does the complexity. This renders continuous dynamical systems models impractical and necessitates new modelling techniques. In recent years, some discrete dynamical models have been developed using various simplifying assumptions. To create a robust modelling framework for chaotic logical circuits, we developed both deterministic and stochastic discrete dynamical models, which exploit the natural recurrence behaviour, for two chaotic NOR gates and a chaotic set/reset flip-flop. This work presents a complete applied mathematical investigation of logical circuits. Experiments on our own designs of the above circuits are modelled and the models are rigorously analysed and simulated showing surprisingly close qualitative agreement with the experiments. Furthermore, the models are designed to accommodate dynamics of similarly designed circuits. This will allow researchers to develop ever more complex chaotic logical circuits with a simple modelling framework.

  11. Qualitative models and experimental investigation of chaotic NOR gates and set/reset flip-flops.

    PubMed

    Rahman, Aminur; Jordan, Ian; Blackmore, Denis

    2018-01-01

    It has been observed through experiments and SPICE simulations that logical circuits based upon Chua's circuit exhibit complex dynamical behaviour. This behaviour can be used to design analogues of more complex logic families and some properties can be exploited for electronics applications. Some of these circuits have been modelled as systems of ordinary differential equations. However, as the number of components in newer circuits increases so does the complexity. This renders continuous dynamical systems models impractical and necessitates new modelling techniques. In recent years, some discrete dynamical models have been developed using various simplifying assumptions. To create a robust modelling framework for chaotic logical circuits, we developed both deterministic and stochastic discrete dynamical models, which exploit the natural recurrence behaviour, for two chaotic NOR gates and a chaotic set/reset flip-flop. This work presents a complete applied mathematical investigation of logical circuits. Experiments on our own designs of the above circuits are modelled and the models are rigorously analysed and simulated showing surprisingly close qualitative agreement with the experiments. Furthermore, the models are designed to accommodate dynamics of similarly designed circuits. This will allow researchers to develop ever more complex chaotic logical circuits with a simple modelling framework.
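
    A toy discrete-map gate in the spirit of the chaotic computing literature is sketched below: logic inputs perturb the state of a chaotic logistic map, one iteration is applied, and the state is thresholded to read out a logic value. The constants were chosen by hand so the truth table comes out as NOR; this is not the authors' Chua's-circuit-based model.

    ```python
    # Logistic-map "NOR gate" via state perturbation and thresholding.

    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    def chaotic_nor(a, b, x0=0.5, delta=0.2, threshold=0.9):
        x = x0 + delta * (a + b)   # encode the two logic inputs in the state
        x = logistic(x)            # one iteration of the chaotic map
        return 1 if x > threshold else 0

    for a in (0, 1):
        for b in (0, 1):
            print(f"NOR({a}, {b}) = {chaotic_nor(a, b)}")
    ```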

  12. Teaching to the Test: A Pragmatic Approach to Teaching Logic

    ERIC Educational Resources Information Center

    Vannatta, Seth C.

    2014-01-01

    The proper goal of an introductory logic course, teaching critical thinking, is best achieved by maintaining the principle of continuity between student experiences and the curriculum. To demonstrate this I explain Dewey's naturalistic approach to logic and the process of inquiry, one which presents the elements of traditional logic in the…

  13. Enrollment Logics and Discourses: Toward Developing an Enrollment Knowledge Framework

    ERIC Educational Resources Information Center

    Snowden, Monique L.

    2013-01-01

    This article brings attention to a typology of enrollment knowledge possessed and enacted by contemporary chief enrollment officers. Interview narratives are used to reveal enrollment principles and associated actions--enrollment logics--that form enrollment discourses, which in turn shape the institutionalized presence of strategic enrollment…

  14. An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence

    NASA Technical Reports Server (NTRS)

    Lindley, Craig A.

    1993-01-01

    This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the automation and robotics methodology of the European Space Research and Technology Centre (ESTEC) and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low-level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.

  15. GRADE Evidence to Decision (EtD) frameworks for adoption, adaptation, and de novo development of trustworthy recommendations: GRADE-ADOLOPMENT.

    PubMed

    Schünemann, Holger J; Wiercioch, Wojtek; Brozek, Jan; Etxeandia-Ikobaltzeta, Itziar; Mustafa, Reem A; Manja, Veena; Brignardello-Petersen, Romina; Neumann, Ignacio; Falavigna, Maicon; Alhazzani, Waleed; Santesso, Nancy; Zhang, Yuan; Meerpohl, Jörg J; Morgan, Rebecca L; Rochwerg, Bram; Darzi, Andrea; Rojas, Maria Ximena; Carrasco-Labra, Alonso; Adi, Yaser; AlRayees, Zulfa; Riva, John; Bollig, Claudia; Moore, Ainsley; Yepes-Nuñez, Juan José; Cuello, Carlos; Waziry, Reem; Akl, Elie A

    2017-01-01

    Guideline developers can: (1) adopt existing recommendations from others; (2) adapt existing recommendations to their own context; or (3) create recommendations de novo. Monetary and nonmonetary resources, credibility, maximization of uptake, as well as logical arguments should guide the choice of the approach and processes. To describe a potentially efficient model for guideline production based on adoption, adaptation, and/or de novo development of recommendations utilizing the Grading of Recommendations Assessment, Development and Evaluation (GRADE) Evidence to Decision (EtD) frameworks. We applied the model in a new national guideline program producing 22 practice guidelines. We searched for relevant evidence that informs the direction and strength of a recommendation. We then produced GRADE EtDs for guideline panels to develop recommendations. We produced a total of 80 EtD frameworks in approximately 4 months and 146 EtDs in approximately 6 months in two waves. Use of the EtD frameworks allowed panel members to understand the judgments of others about the criteria that bear on guideline recommendations and then to make their own judgments about those criteria in a systematic approach. The "GRADE-ADOLOPMENT" approach to guideline production combines adoption, adaptation, and, as needed, de novo development of recommendations. If developers of guidelines follow EtD criteria more widely and make their work publicly available, this approach should prove even more useful. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  16. Ecoregions of the conterminous United States: evolution of a hierarchical spatial framework

    USGS Publications Warehouse

    Omernik, James M.; Griffith, Glenn E.

    2014-01-01

    A map of ecological regions of the conterminous United States, first published in 1987, has been greatly refined and expanded into a hierarchical spatial framework in response to user needs, particularly by state resource management agencies. In collaboration with scientists and resource managers from numerous agencies and institutions in the United States, Mexico, and Canada, the framework has been expanded to cover North America, and the original ecoregions (now termed Level III) have been refined, subdivided, and aggregated to identify coarser as well as more detailed spatial units. The most generalized units (Level I) define 10 ecoregions in the conterminous U.S., while the finest-scale units (Level IV) identify 967 ecoregions. In this paper, we explain the logic underpinning the approach, discuss the evolution of the regional mapping process, and provide examples of how the ecoregions were distinguished at each hierarchical level. The variety of applications of the ecoregion framework illustrates its utility in resource assessment and management.

  17. Ecoregions of the Conterminous United States: Evolution of a Hierarchical Spatial Framework

    NASA Astrophysics Data System (ADS)

    Omernik, James M.; Griffith, Glenn E.

    2014-12-01

    A map of ecological regions of the conterminous United States, first published in 1987, has been greatly refined and expanded into a hierarchical spatial framework in response to user needs, particularly by state resource management agencies. In collaboration with scientists and resource managers from numerous agencies and institutions in the United States, Mexico, and Canada, the framework has been expanded to cover North America, and the original ecoregions (now termed Level III) have been refined, subdivided, and aggregated to identify coarser as well as more detailed spatial units. The most generalized units (Level I) define 10 ecoregions in the conterminous U.S., while the finest-scale units (Level IV) identify 967 ecoregions. In this paper, we explain the logic underpinning the approach, discuss the evolution of the regional mapping process, and provide examples of how the ecoregions were distinguished at each hierarchical level. The variety of applications of the ecoregion framework illustrates its utility in resource assessment and management.

  18. Query Expansion and Query Translation as Logical Inference.

    ERIC Educational Resources Information Center

    Nie, Jian-Yun

    2003-01-01

    Examines query expansion during query translation in cross language information retrieval and develops a general framework for inferential information retrieval in two particular contexts: using fuzzy logic and probability theory. Obtains evaluation formulas that are shown to strongly correspond to those used in other information retrieval models.…

  19. Logical Demonomy Among the Ewe in West Africa

    ERIC Educational Resources Information Center

    Dzobo, N. K.

    1974-01-01

    Examines the indigenous pattern of moral behavior among the Ewe, an ethnic and linguistic group in West Africa, and assesses its role in moral education within the African context. The author develops a conceptual framework he calls "logical demonomy" that he uses to define the Ewe system of moral laws. (JT)

  20. Organizational Politics in Schools: Micro, Macro, and Logics of Action.

    ERIC Educational Resources Information Center

    Bacharach, Samuel B.; Mundell, Bryan L.

    1993-01-01

    Develops a framework for analyzing the politics of school organizations, affirming a Weberian perspective as most appropriate. Develops "logic of action" (the implicit relationship between means and goals) as the focal point of organizational politics. Underlines the importance of analyzing interest groups and their strategies. Political…

  1. A computational framework for prime implicants identification in noncoherent dynamic systems.

    PubMed

    Di Maio, Francesco; Baronchelli, Samuele; Zio, Enrico

    2015-01-01

    Dynamic reliability methods aim at complementing the capability of traditional static approaches (e.g., event trees [ETs] and fault trees [FTs]) by accounting for the system dynamic behavior and its interactions with the system state transition process. To this end, the system dynamics is here described by a time-dependent model that includes the dependencies on the stochastic transition events. In this article, we present a novel computational framework for dynamic reliability analysis whose objectives are (i) accounting for discrete stochastic transition events and (ii) identifying the prime implicants (PIs) of the dynamic system. The framework entails adopting a multiple-valued logic (MVL) to consider stochastic transitions at discretized times. PIs are then identified by a differential evolution (DE) algorithm that searches for the optimal MVL solution of a covering problem formulated over MVL accident scenarios. For testing the feasibility of the framework, a dynamic noncoherent system composed of five components that can fail at discretized times has been analyzed, showing the applicability of the framework to practical cases. © 2014 Society for Risk Analysis.
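
    The covering-problem idea can be made concrete with a toy sketch. The following Python fragment (our illustration, not the authors' implementation; the scenarios, MVL encoding, and fitness weights are invented) encodes candidate implicants as real vectors, decodes them to MVL literals with "don't care" entries, and evolves them by differential evolution so that surviving candidates cover the failure scenarios with as few specified literals as possible:

      import random

      N_COMP, N_LEVELS = 5, 3   # five components with states 0..2; literal -1 = "don't care"

      # Toy set of MVL accident scenarios (tuples of component states), assumed for illustration.
      failure_scenarios = [(0, 1, 2, 0, 1), (0, 1, 2, 1, 1), (2, 0, 0, 0, 1)]

      def decode(x):
          # Map a real vector in [0,1]^N_COMP to MVL literals; values below 0.25 mean "don't care".
          return [-1 if v < 0.25 else min(N_LEVELS - 1, int((v - 0.25) / 0.75 * N_LEVELS))
                  for v in x]

      def covers(implicant, scenario):
          return all(lit == -1 or lit == s for lit, s in zip(implicant, scenario))

      def fitness(x):
          imp = decode(x)
          covered = sum(covers(imp, s) for s in failure_scenarios)
          specified = sum(lit != -1 for lit in imp)
          return covered - 0.1 * specified      # prefer broad coverage with short implicants

      def differential_evolution(pop_size=30, gens=200, F=0.8, CR=0.9):
          pop = [[random.random() for _ in range(N_COMP)] for _ in range(pop_size)]
          for _ in range(gens):
              for i, target in enumerate(pop):
                  a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
                  trial = [min(1.0, max(0.0, a[k] + F * (b[k] - c[k])))
                           if random.random() < CR else target[k] for k in range(N_COMP)]
                  if fitness(trial) >= fitness(target):
                      pop[i] = trial
          return decode(max(pop, key=fitness))

      print(differential_evolution())   # e.g. an implicant such as [0, 1, 2, -1, 1]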

  2. A comparative analysis of centralized waiting lists for patients without a primary care provider implemented in six Canadian provinces: study protocol.

    PubMed

    Breton, Mylaine; Green, Michael; Kreindler, Sara; Sutherland, Jason; Jbilou, Jalila; Wong, Sabrina T; Shaw, Jay; Crooks, Valorie A; Contandriopoulos, Damien; Smithman, Mélanie Ann; Brousselle, Astrid

    2017-01-21

    Having a regular primary care provider (i.e., family physician or nurse practitioner) is widely considered to be a prerequisite for obtaining healthcare that is timely, accessible, continuous, comprehensive, and well-coordinated with other parts of the healthcare system. Yet, 4.6 million Canadians, approximately 15% of Canada's population, are unattached; that is, they do not have a regular primary care provider. To address the critical need for attachment, especially for more vulnerable patients, six Canadian provinces have implemented centralized waiting lists for unattached patients. These waiting lists centralize unattached patients' requests for a primary care provider in a given territory and match patients with providers. From the limited information available on each province's centralized waiting list, we know that the way they work varies significantly from province to province. The main objective of this study is to compare the different models of centralized waiting lists for unattached patients implemented in six provinces of Canada to each other and to available scientific knowledge, and to make recommendations on ways to improve their design in an effort to increase attachment of patients to a primary care provider. A logic analysis approach developed in three steps will be used. Step 1: build logic models that describe each province's centralized waiting list through interviews with key stakeholders in each province; step 2: develop a conceptual framework, separate from the provincially informed logic models, that identifies key characteristics of centralized waiting lists for unattached patients and factors influencing their implementation, through a literature review and interviews with experts; step 3: compare the logic models to the conceptual framework to make recommendations to improve centralized waiting lists in different provinces during a pan-Canadian face-to-face exchange with decision-makers, clinicians and researchers. This study is based on an inter-provincial learning exchange approach where we propose to compare centralized waiting lists and analyze variations in strategies used to increase attachment to a regular primary care provider. Fostering inter-provincial healthcare systems connectivity to improve centralized waiting lists' practices across Canada can leverage attachment to a regular provider for timely access to continuous, comprehensive, and coordinated healthcare for all Canadians, particularly those who are vulnerable.

  3. Taming Data to Make Decisions: Using a Spatial Fuzzy Logic Decision Support Framework to Inform Conservation and Land Use Planning

    NASA Astrophysics Data System (ADS)

    Sheehan, T.; Baker, B.; Degagne, R. S.

    2015-12-01

    With the abundance of data sources, analytical methods, and computer models, land managers are faced with the overwhelming task of making sense of a profusion of data of wildly different types. Luckily, fuzzy logic provides a method to work with different types of data using language-based propositions such as "the landscape is undisturbed," and a simple set of logic constructs. Just as many surveys allow different levels of agreement with a proposition, fuzzy logic allows values reflecting different levels of truth for a proposition. Truth levels fall within a continuum ranging from Fully True to Fully False. Hence a fuzzy logic model produces continuous results. The Environmental Evaluation Modeling System (EEMS) is a platform-independent, tree-based, fuzzy logic modeling framework. An EEMS model provides a transparent definition of an evaluation model and is commonly developed as a collaborative effort among managers, scientists, and GIS experts. Managers specify a set of evaluative propositions used to characterize the landscape. Scientists, working with managers, formulate functions that convert raw data values into truth values for the propositions and produce a logic tree to combine results into a single metric used to guide decisions. Managers, scientists, and GIS experts then work together to implement and iteratively tune the logic model and produce final results. We present examples of two successful EEMS projects that provided managers with map-based results suitable for guiding decisions: sensitivity and climate change exposure in Utah and the Colorado Plateau modeled for the Bureau of Land Management; and terrestrial ecological intactness in the Mojave and Sonoran region of southern California modeled for the Desert Renewable Energy Conservation Plan.
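
    The core pattern is easy to state in code. The sketch below is our own minimal illustration of the EEMS idea (the thresholds, input layers, and the [-1, 1] truth-scale conventions are assumptions for this example, not values from the cited projects): raw map-cell values are converted to truth levels for a proposition, then combined up the logic tree with fuzzy AND/OR operators.

      def cvt(value, false_thresh, true_thresh):
          """Linearly convert a raw value to a truth level in [-1, 1] (Fully False .. Fully True)."""
          t = (value - false_thresh) / (true_thresh - false_thresh)
          return max(-1.0, min(1.0, 2.0 * t - 1.0))

      def fuzzy_and(*truths):   # pessimistic: only as true as the weakest branch
          return min(truths)

      def fuzzy_or(*truths):    # optimistic: as true as the strongest branch
          return max(truths)

      # Proposition "the landscape is undisturbed" for one map cell (toy inputs).
      road_density = 0.3    # km/km^2; lower is better, so the scale is inverted
      canopy_cover = 62.0   # percent; higher is better

      undisturbed = fuzzy_and(
          cvt(road_density, false_thresh=2.0, true_thresh=0.0),
          cvt(canopy_cover, false_thresh=10.0, true_thresh=80.0),
      )
      print(f"truth('landscape is undisturbed') = {undisturbed:+.2f}")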

  4. A framework for real-time distributed expert systems: On-orbit spacecraft fault diagnosis, monitoring and control

    NASA Technical Reports Server (NTRS)

    Mullikin, Richard L.

    1987-01-01

    Control of on-orbit operation of a spacecraft requires retention and application of special purpose, often unique, knowledge of equipment and procedures. Real-time distributed expert systems (RTDES) permit a modular approach to a complex application such as on-orbit spacecraft support. One aspect of a human-machine system that lends itself to the application of RTDES is the function of satellite/mission controllers - the next logical step toward the creation of truly autonomous spacecraft systems. This system application is described.

  5. Interaction Pattern Analysis in cMOOCs Based on the Connectivist Interaction and Engagement Framework

    ERIC Educational Resources Information Center

    Wang, Zhijun; Anderson, Terry; Chen, Li; Barbera, Elena

    2017-01-01

    Connectivist learning is interaction-centered learning. A framework describing interaction and cognitive engagement in connectivist learning was constructed using logical reasoning techniques. The framework and analysis was designed to help researchers and learning designers understand and adapt the characteristics and principles of interaction in…

  6. Querying quantitative logic models (Q2LM) to study intracellular signaling networks and cell-cytokine interactions.

    PubMed

    Morris, Melody K; Shriver, Zachary; Sasisekharan, Ram; Lauffenburger, Douglas A

    2012-03-01

    Mathematical models have substantially improved our ability to predict the response of a complex biological system to perturbation, but their use is typically limited by difficulties in specifying model topology and parameter values. Additionally, incorporating entities across different biological scales ranging from molecular to organismal in the same model is not trivial. Here, we present a framework called "querying quantitative logic models" (Q2LM) for building and asking questions of constrained fuzzy logic (cFL) models. cFL is a recently developed modeling formalism that uses logic gates to describe influences among entities, with transfer functions to describe quantitative dependencies. Q2LM does not rely on dedicated data to train the parameters of the transfer functions, and it permits straightforward incorporation of entities at multiple biological scales. The Q2LM framework can be employed to ask questions such as: Which therapeutic perturbations accomplish a designated goal, and under what environmental conditions will these perturbations be effective? We demonstrate the utility of this framework for generating testable hypotheses in two examples: (i) an intracellular signaling network model and (ii) a model for the pharmacokinetics and pharmacodynamics of cell-cytokine interactions; in the latter, we validate hypotheses concerning molecular design of granulocyte colony stimulating factor. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
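
    As a flavor of the formalism, the fragment below sketches the two cFL ingredients the abstract names, using a normalized-Hill transfer function and min/max logic gates; the parameter values and the toy three-node motif are our assumptions, not the published model.

      def hill(x, k=0.5, n=3):
          """Normalized Hill transfer function mapping activity in [0,1] to [0,1] with f(1) = 1."""
          g = lambda v: v**n / (k**n + v**n)
          return g(x) / g(1.0)

      AND, OR = min, max   # conjunctive / disjunctive influence of upstream species

      # Toy motif: ligand -> receptor -> (receptor AND cofactor) -> output
      ligand, cofactor = 0.8, 0.6
      receptor = hill(ligand)
      output = hill(AND(receptor, cofactor))
      print(f"predicted output activity: {output:.3f}")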

  7. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian networks, and possibility theory, in the form of fuzzy logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit, has been lacking. The goal of this research, therefore, is to statistically compare the accuracy and performance of the hybrid approach with pure Bayesian and fuzzy systems and with an inexact Bayesian system approximated using particle filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference over other fusion tools.

  8. A framework for quantification of groundwater dynamics - concepts and hydro(geo-)logical metrics

    NASA Astrophysics Data System (ADS)

    Haaf, Ezra; Heudorfer, Benedikt; Stahl, Kerstin; Barthel, Roland

    2017-04-01

    Fluctuation patterns in groundwater hydrographs are generally assumed to contain information on aquifer characteristics, climate and environmental controls. However, attempts to disentangle this information and map the dominant controls have been few. This is due to the substantial heterogeneity and complexity of groundwater systems, which is reflected in the abundance of morphologies of groundwater time series. To describe the structure and shape of hydrographs, descriptive terms like "slow"/"fast" or "flashy"/"inert" are frequently used, which are subjective, irreproducible and limited. This lack of objective and refined concepts limits approaches for regionalization of hydrogeological characteristics as well as our understanding of the dominant processes controlling groundwater dynamics. Therefore, we propose a novel framework for groundwater hydrograph characterization in an attempt to categorize morphologies explicitly and quantitatively based on perceptual concepts of aspects of the dynamics. This quantitative framework is inspired by the existing and operational eco-hydrological classification frameworks for streamflow. The need for a new framework for groundwater systems is justified by the fundamental differences between the state variable groundwater head and the flow variable streamflow. Conceptually, we extracted exemplars of specific dynamic patterns, attributing descriptive terms for means of systematisation. Metrics, primarily taken from the streamflow literature, were subsequently adapted to groundwater and assigned to the described patterns for means of quantification. In this study, we focused on the particularities of groundwater as a state variable. Furthermore, we investigated the descriptive skill of individual metrics as well as their usefulness for groundwater hydrographs. The ensemble of categorized metrics results in a framework which can be used to describe and quantify groundwater dynamics. It is a promising tool for the setup of a successful similarity classification framework for groundwater hydrographs. However, the overabundance of metrics available calls for a systematic redundancy analysis of the metrics, which we describe in a second study (Heudorfer et al., 2017). Heudorfer, B., Haaf, E., Barthel, R., Stahl, K., 2017. A framework for quantification of groundwater dynamics - redundancy and transferability of hydro(geo-)logical metrics. EGU General Assembly 2017, Vienna, Austria.
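
    To make the idea of hydrograph metrics tangible, the sketch below computes two simple indices on a toy groundwater-head series: a Richards-Baker-style flashiness index (adapted from the streamflow literature, as the abstract describes) and lag-1 autocorrelation as a crude "flashy versus inert" proxy. The metric choice and the data are ours, not the framework's actual catalogue.

      def flashiness(heads):
          """Richards-Baker-style index: summed absolute step-to-step change over summed level."""
          changes = sum(abs(b - a) for a, b in zip(heads, heads[1:]))
          return changes / sum(heads)

      def lag1_autocorr(heads):
          mean = sum(heads) / len(heads)
          num = sum((heads[i] - mean) * (heads[i + 1] - mean) for i in range(len(heads) - 1))
          den = sum((h - mean) ** 2 for h in heads)
          return num / den

      heads = [12.1, 12.3, 12.2, 12.8, 13.1, 12.9, 12.6, 12.4]   # toy weekly heads (m)
      print(f"flashiness = {flashiness(heads):.4f}, lag-1 r = {lag1_autocorr(heads):.3f}")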

  9. Applying Toulmin: Teaching Logical Reasoning and Argumentative Writing

    ERIC Educational Resources Information Center

    Rex, Lesley A.; Thomas, Ebony Elizabeth; Engel, Steven

    2010-01-01

    To learn to write well-reasoned persuasive arguments, students need in situ help thinking through the complexity and complications of an issue, making inferences based on evidence, and hierarchically grouping and logically sequencing ideas. They rely on teachers to make this happen. In this article, the authors explain the framework they used and…

  10. Navigating a Mobile Robot Across Terrain Using Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun; Howard, Ayanna; Bon, Bruce

    2003-01-01

    A strategy for autonomous navigation of a robotic vehicle across hazardous terrain involves the use of a measure of traversability of terrain within a fuzzy-logic conceptual framework. This navigation strategy requires no a priori information about the environment. Fuzzy logic was selected as a basic element of this strategy because it provides a formal methodology for representing and implementing a human driver's heuristic knowledge and operational experience. Within a fuzzy-logic framework, the attributes of human reasoning and decision-making can be formulated by simple IF (antecedent), THEN (consequent) rules coupled with easily understandable and natural linguistic representations. The linguistic values in the rule antecedents convey the imprecision associated with measurements taken by sensors onboard a mobile robot, while the linguistic values in the rule consequents represent the vagueness inherent in the reasoning processes to generate the control actions. The operational strategies of the human expert driver can be transferred, via fuzzy logic, to a robot-navigation strategy in the form of a set of simple conditional statements composed of linguistic variables. These linguistic variables are defined by fuzzy sets in accordance with user-defined membership functions. The main advantages of a fuzzy navigation strategy lie in the ability to extract heuristic rules from human experience and to obviate the need for an analytical model of the robot navigation process.
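
    The rule pattern described above can be sketched in a few lines; the membership functions, linguistic values, and speeds below are invented for illustration and are not the rules used on any actual rover.

      def tri(x, a, b, c):
          """Triangular membership function rising from a, peaking at b, falling to c."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def commanded_speed(roughness, slope):
          # Linguistic values for normalized sensor readings in [0, 1].
          smooth, rough = tri(roughness, -0.5, 0.0, 0.6), tri(roughness, 0.3, 1.0, 1.5)
          flat, steep = tri(slope, -0.5, 0.0, 0.6), tri(slope, 0.3, 1.0, 1.5)
          # IF terrain is smooth AND flat THEN speed is fast; IF rough OR steep THEN speed is slow.
          fast, slow = min(smooth, flat), max(rough, steep)
          # Defuzzify as a weighted average of representative speeds (m/s).
          return (fast * 0.5 + slow * 0.05) / max(fast + slow, 1e-9)

      print(f"commanded speed: {commanded_speed(roughness=0.4, slope=0.2):.2f} m/s")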

  11. Evaluation of a Residential Mental Health Recovery Service in North Queensland.

    PubMed

    Heyeres, Marion; Kinchin, Irina; Whatley, Elise; Brophy, Lisa; Jago, Jon; Wintzloff, Thomas; Morton, Steve; Mosby, Vinitta; Gopalkrishnan, Narayan; Tsey, Komla

    2018-01-01

    Evidence shows that subacute mental health recovery occurs best when a person remains active within the community and fulfils meaningful and satisfying roles of their choosing. Several residential care services that incorporate these values have been established in Australia and overseas. This study describes (a) the development of an evaluation framework for a new subacute residential mental health recovery service in regional Australia and (b) the formative evaluation outcomes. Continuous quality improvement and participatory research approaches informed all stages of the development of the evaluation framework. A program logic was established and subsequently tested for practicability. The resultant logic utilizes the Scottish Recovery Indicator 2 (SRI 2) service development tool, Individual Recovery Plans (IRPs), and the impact assessment of the service on psychiatric inpatient admissions (reported separately). Service strengths included a recovery-focused practice that identifies and addresses the basic needs of residents (consumers). The consumers of the service were encouraged to develop their own goals and self-manage their recovery plans. The staff of the service were identified as working effectively in the context of the recovery process; the staff were seen as supported and valued. Areas for improvement included more opportunities for self-management for residents and more feedback from residents and carers.

  12. On Cognition, Structured Sequence Processing, and Adaptive Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Petersson, Karl Magnus

    2008-11-01

    Cognitive neuroscience approaches the brain as a cognitive system: a system that functionally is conceptualized in terms of information processing. We outline some aspects of this concept and consider a physical system to be an information processing device when a subclass of its physical states can be viewed as representational/cognitive and transitions between these can be conceptualized as a process operating on these states by implementing operations on the corresponding representational structures. We identify a generic and fundamental problem in cognition: sequentially organized structured processing. Structured sequence processing provides the brain, in an essential sense, with its processing logic. In an approach addressing this problem, we illustrate how to integrate levels of analysis within a framework of adaptive dynamical systems. We note that the dynamical system framework lends itself to a description of asynchronous event-driven devices, which is likely to be important in cognition because the brain appears to be an asynchronous processing system. We use the human language faculty and natural language processing as a concrete example throughout.

  13. A fuzzy MCDM framework based on fuzzy measure and fuzzy integral for agile supplier evaluation

    NASA Astrophysics Data System (ADS)

    Dursun, Mehtap

    2017-06-01

    Supply chains need to be agile in order to respond quickly to changes in today's competitive environment. The success of an agile supply chain depends on the firm's ability to select the most appropriate suppliers. This study proposes a multi-criteria decision-making technique for conducting an analysis based on a multi-level hierarchical structure and fuzzy logic for the evaluation of agile suppliers. The ideal and anti-ideal solutions are taken into consideration simultaneously in the developed approach. The proposed decision approach enables the decision-makers to use linguistic terms and thus reduces their cognitive burden in the evaluation process. Furthermore, a hierarchy of evaluation criteria and their related sub-criteria is employed in the presented approach in order to conduct a more effective analysis.
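
    The paper's actual aggregation relies on fuzzy measures and fuzzy integrals, which the toy fragment below does not reproduce; it only sketches the ideal/anti-ideal idea with a TOPSIS-style closeness coefficient over an assumed linguistic scale, invented weights, and invented ratings.

      SCALE = {"poor": 0.1, "fair": 0.3, "good": 0.5, "very good": 0.7, "excellent": 0.9}
      weights = [0.5, 0.3, 0.2]   # criteria: flexibility, speed, quality

      suppliers = {
          "A": ["good", "excellent", "fair"],
          "B": ["very good", "good", "very good"],
      }

      def closeness(ratings):
          xs = [SCALE[r] for r in ratings]
          d_ideal = sum(w * (1.0 - x) ** 2 for w, x in zip(weights, xs)) ** 0.5
          d_anti = sum(w * x ** 2 for w, x in zip(weights, xs)) ** 0.5
          return d_anti / (d_ideal + d_anti)   # higher = closer to ideal, farther from anti-ideal

      for name, ratings in suppliers.items():
          print(f"supplier {name}: closeness = {closeness(ratings):.3f}")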

  14. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on the programming of logic controllers. It is important that the program code of a logic controller execute flawlessly according to the original specification. In the presented approach, we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control-interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal transformation rules ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
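
    The flavor of such a transformation can be sketched as follows; the toy net, the naming scheme, and the emitted C style are our assumptions, and the sketch deliberately ignores details a real generator must handle (e.g., preventing cascaded firings within one scan cycle).

      # Transitions of a control-interpreted Petri net: (name, input places, output places, guard).
      transitions = [
          ("t1", ["p0"], ["p1"], "start_button"),
          ("t2", ["p1"], ["p2"], "sensor_high"),
          ("t3", ["p2"], ["p0"], "1"),   # unconditional once p2 is marked
      ]

      def emit_c(transitions, n_places):
          lines = [f"static unsigned char M[{n_places}] = {{1}};  /* initial marking: p0 */",
                   "void step(void) {"]
          for name, ins, outs, guard in transitions:
              marked = " && ".join(f"M[{p[1:]}]" for p in ins)
              lines.append(f"    if ({marked} && ({guard})) {{  /* fire {name} */")
              lines += [f"        M[{p[1:]}] = 0;" for p in ins]
              lines += [f"        M[{p[1:]}] = 1;" for p in outs]
              lines.append("    }")
          lines.append("}")
          return "\n".join(lines)

      print(emit_c(transitions, n_places=3))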

  15. Program Monitoring with LTL in EAGLE

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2004-01-01

    We briefly present a rule-based framework called EAGLE, shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics (MTL), interval logics, forms of quantified temporal logics, and so on. In this paper we focus on a linear temporal logic (LTL) specialization of EAGLE. For an initial formula of size m, we establish upper bounds of O(m^2 2^m log m) and O(m^4 2^{2m} log^2 m) for the space and time complexity, respectively, of single-step evaluation over an input trace. This is close to the lower bound of O(2^sqrt(m)) previously established for future-time LTL. EAGLE has been successfully used, in both LTL and metric LTL forms, to test a real-time controller of an experimental NASA planetary rover.
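
    A common way to implement such monitors is formula progression (rewriting): each incoming state partially evaluates the formula and leaves a residual obligation for the rest of the trace. The fragment below is a minimal sketch of that general technique in a toy tuple encoding; it is not EAGLE's rule language or its algorithm.

      TRUE, FALSE = ("true",), ("false",)

      def prog(f, state):
          """Progress LTL formula f through one state (a set of true propositions)."""
          op = f[0]
          if op == "prop":  return TRUE if f[1] in state else FALSE
          if op == "nprop": return FALSE if f[1] in state else TRUE
          if op in ("true", "false"): return f
          if op == "and":
              l, r = prog(f[1], state), prog(f[2], state)
              if FALSE in (l, r): return FALSE
              return r if l == TRUE else l if r == TRUE else ("and", l, r)
          if op == "or":
              l, r = prog(f[1], state), prog(f[2], state)
              if TRUE in (l, r): return TRUE
              return r if l == FALSE else l if r == FALSE else ("or", l, r)
          if op == "next": return f[1]
          if op == "always":     return prog(("and", f[1], ("next", f)), state)   # G p = p & X G p
          if op == "eventually": return prog(("or",  f[1], ("next", f)), state)   # F p = p | X F p
          raise ValueError(op)

      # G(request -> F grant), with the implication unfolded as (!request | F grant).
      spec = ("always", ("or", ("nprop", "request"), ("eventually", ("prop", "grant"))))
      residual = spec
      for state in [{"request"}, set(), {"grant"}, set()]:
          residual = prog(residual, state)
      # On a finite trace, a residual other than TRUE/FALSE is an obligation still pending.
      print("violated" if residual == FALSE else
            "satisfied" if residual == TRUE else f"pending: {residual}")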

  16. Fabrication of magnetic tunnel junctions connected through a continuous free layer to enable spin logic devices

    NASA Astrophysics Data System (ADS)

    Wan, Danny; Manfrini, Mauricio; Vaysset, Adrien; Souriau, Laurent; Wouters, Lennaert; Thiam, Arame; Raymenants, Eline; Sayan, Safak; Jussot, Julien; Swerts, Johan; Couet, Sebastien; Rassoul, Nouredine; Babaei Gavan, Khashayar; Paredis, Kristof; Huyghebaert, Cedric; Ercken, Monique; Wilson, Christopher J.; Mocuta, Dan; Radu, Iuliana P.

    2018-04-01

    Magnetic tunnel junctions (MTJs) interconnected via a continuous ferromagnetic free layer were fabricated for spin torque majority gate (STMG) logic. The MTJs are biased independently and show magnetoelectric response under spin transfer torque. The electrical control of these devices paves the way to future spin logic devices based on domain wall (DW) motion. In particular, it is a significant step towards the realization of a majority gate. To our knowledge, this is the first fabrication of a cross-shaped free layer shared by several perpendicular MTJs. The fabrication process can be generalized to any geometry and any number of MTJs. Thus, this framework can be applied to other spin logic concepts based on magnetic interconnect. Moreover, it allows exploration of spin dynamics for logic applications.

  17. Making It Logical: Implementation of Inclusive Education Using a Logic Model Framework

    ERIC Educational Resources Information Center

    Stegemann, Kim Calder; Jaciw, Andrew P.

    2018-01-01

    Educational inclusion of children with special learning needs is a philosophy and movement with an international presence. Though Canada is a leader in educational inclusion, many would claim that our public educational systems have not yet fully realized the dream of inclusive education. As other countries have noted, making full-fledged changes…

  18. Logical Metonymy Resolution in a Words-as-Cues Framework: Evidence from Self-Paced Reading and Probe Recognition

    ERIC Educational Resources Information Center

    Zarcone, Alessandra; Padó, Sebastian; Lenci, Alessandro

    2014-01-01

    Logical metonymy resolution ("begin a book" → "begin reading a book" or "begin writing a book") has traditionally been explained either through complex lexical entries (qualia structures) or through the integration of the implicit event via post-lexical access to world knowledge. We propose that recent work within the…

  19. Methods of Product Evaluation. Guide Number 10. Evaluation Guides Series.

    ERIC Educational Resources Information Center

    St. John, Mark

    In this guide the logic of product evaluation is described in a framework that is meant to be general and adaptable to all kinds of evaluations. Evaluators should consider using the logic and methods of product evaluation when (1) the purpose of the evaluation is to aid evaluators in making a decision about purchases; (2) a comprehensive…

  20. The effectiveness of web-programming module based on scientific approach to train logical thinking ability for students in vocational high school

    NASA Astrophysics Data System (ADS)

    Nashiroh, Putri Khoirin; Kamdi, Waras; Elmunsyah, Hakkun

    2017-09-01

    Web programming is a basic subject in Computer and Informatics Engineering, a program of study in vocational high schools, and its learning activities require logical thinking ability. The purposes of this research were (1) to develop a web-programming module that implements a scientific approach and can improve the logical thinking ability of vocational high school students, and (2) to test the effectiveness of that module in training students' logical thinking ability. The result of this research was a web-programming module that applies a scientific approach to learning activities to improve the logical thinking ability of students in vocational high school. The effectiveness test supports the conclusion that the module was very effective in training logical thinking ability and improving learning results: (1) the students' average posttest result, 79.91, exceeded the minimum criterion value; (2) the average percentage of students' logical thinking scores was 82.98; and (3) the average percentage of students' responses to the web-programming module was 81.86%.

  1. The logic of counterfactual analysis in case-study explanation.

    PubMed

    Mahoney, James; Barrenechea, Rodrigo

    2017-12-19

    In this paper, we develop a set-theoretic and possible worlds approach to counterfactual analysis in case-study explanation. Using this approach, we first consider four kinds of counterfactuals: necessary condition counterfactuals, SUIN condition counterfactuals, sufficient condition counterfactuals, and INUS condition counterfactuals. We explore the distinctive causal claims entailed in each, and conclude that necessary condition and SUIN condition counterfactuals are the most useful types for hypothesis assessment in case-study research. We then turn attention to the development of a rigorous understanding of the 'minimal-rewrite' rule, linking this rule to insights from set theory about the relative importance of necessary conditions. We show why, logically speaking, a comparative analysis of two necessary condition counterfactuals will tend to favour small events and contingent happenings. A third section then presents new tools for specifying the level of generality of the events in a counterfactual. We show why and how the goals of formulating empirically important versus empirically plausible counterfactuals stand in tension with one another. Finally, we use our framework to link counterfactual analysis to causal sequences, which in turn provides advantages for conducting counterfactual projections. © London School of Economics and Political Science 2017.

  2. Applying differential dynamic logic to reconfigurable biological networks.

    PubMed

    Figueiredo, Daniel; Martins, Manuel A; Chaves, Madalena

    2017-09-01

    Qualitative and quantitative modeling frameworks are widely used for analysis of biological regulatory networks, the former giving a preliminary overview of the system's global dynamics and the latter providing more detailed solutions. Another approach is to model biological regulatory networks as hybrid systems, i.e., systems which can display both continuous and discrete dynamic behaviors. Actually, the development of synthetic biology has shown that this is a suitable way to think about biological systems, which can often be constructed as networks with discrete controllers and present hybrid behaviors. In this paper we discuss this approach as a special case of the reconfigurability paradigm, well studied in Computer Science (CS), where there are well-developed computational tools to reason about hybrid systems. We argue that it is worth applying such tools in a biological context. One interesting tool is differential dynamic logic (dL), which has recently been developed by Platzer and applied to many case studies. In this paper we discuss some simple examples of biological regulatory networks to illustrate how dL can be used as an alternative, or also as a complement, to methods already used. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Fuzzy Logic Approaches to Multi-Objective Decision-Making in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1994-01-01

    Fuzzy logic allows for the quantitative representation of multi-objective decision-making problems which have vague or fuzzy objectives and parameters. As such, fuzzy logic approaches are well-suited to situations where alternatives must be assessed by using criteria that are subjective and of unequal importance. This paper presents an overview of fuzzy logic and provides sample applications from the aerospace industry. Applications include an evaluation of vendor proposals, an analysis of future space vehicle options, and the selection of a future space propulsion system. On the basis of the results provided in this study, fuzzy logic provides a unique perspective on the decision-making process, allowing the evaluator to assess the degree to which each option meets the evaluation criteria. Future decision-making should take full advantage of fuzzy logic methods to complement existing approaches in the selection of alternatives.
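
    The kind of scoring this enables can be sketched briefly; the criteria, weights, and degree-of-satisfaction ratings below are invented for illustration, not taken from the paper's case studies.

      criteria_weights = {"thrust": 0.4, "cost": 0.35, "maturity": 0.25}

      # Degree (in [0, 1]) to which each candidate propulsion option meets each criterion.
      options = {
          "chemical": {"thrust": 0.9, "cost": 0.6, "maturity": 0.95},
          "electric": {"thrust": 0.3, "cost": 0.8, "maturity": 0.7},
      }

      for name, degrees in options.items():
          weighted = sum(w * degrees[c] for c, w in criteria_weights.items())
          weakest = min(degrees.values())   # the fuzzy-AND view: the least-satisfied criterion
          print(f"{name:9s} weighted score = {weighted:.2f}, weakest criterion = {weakest:.2f}")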

  4. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  5. Treatment options following single-rooted tooth removal: a literature review and proposed hierarchy of treatment selection.

    PubMed

    Fugazzotto, Paul A

    2005-05-01

    Alveolar bone changes following tooth extraction have been well documented and have given rise to a number of treatment approaches. Included in these approaches are placement of various grafting materials, immediate implant placement, and a combination of both. A review of all pertinent literature discussing regenerative therapy at the time of tooth extraction or immediate implant placement with or without concomitant regenerative therapy was carried out. A clinically-based hierarchy of treatment selection following extraction of single rooted teeth is proposed, based upon the available literature and clinical experience. The role of patient phenotype is considered. Utilization of the proposed hierarchy of treatment selection affords a logical framework within which to predictably treat a variety of patients.

  6. A Little Logic Goes a Long Way: Basing Experiment on Semantic Theory in the Cognitive Science of Conditional Reasoning

    ERIC Educational Resources Information Center

    Stenning, Keith; van Lambalgen, Michiel

    2004-01-01

    Modern logic provides accounts of both interpretation and derivation which work together to provide abstract frameworks for modelling the sensitivity of human reasoning to task, context and content. Cognitive theories have underplayed the importance of interpretative processes. We illustrate, using Wason's [Q. J. Exp. Psychol. 20 (1968) 273]…

  7. Organized Cognition: Theoretical Framework for Future C2 Research and Implementation

    DTIC Science & Technology

    2011-06-01

  8. On the Correct Formulation of the Law of the External Photoelectric Effect

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2017-01-01

    The critical and correct scientific analysis of the generally accepted theory of the external photoelectric effect is proposed. The methodological basis for the analysis is the unity of formal logic and of rational dialectics. It is shown that Einstein's formulation of the law of the photoelectric effect is not free from the following objection. The terms of Einstein's formula characterize the quantitative determinacy (i.e., energy) which belongs and is related to the different material objects: "photon", "electron in metal", and "electron not in metal". This signifies that Einstein's formula represents violation of the formal-logical laws of identity and absence (lack) of contradiction. The correct mathematical formulation of the law of the external photoelectric effect within the framework of the system approach is proposed. The correct formulation represents the proportion by relative increments of the energy of the incident photon and the energy of the emitted electron. The proportion describes the linear relationship between the energy of the incident photon and the energy of the emitted electron.
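
    For reference, the textbook relation the abstract takes issue with is the standard energy balance (the abstract does not spell out the author's alternative "proportion" explicitly, so only the conventional form is reproduced here):

      \[
        h\nu = W + E_{k,\max}
      \]

    where h\nu is the energy of the incident photon, W the work function of the metal, and E_{k,\max} the maximum kinetic energy of the emitted electron.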

  9. Effective algorithm for solving complex problems of production control and of material flows control of industrial enterprise

    NASA Astrophysics Data System (ADS)

    Mezentsev, Yu A.; Baranova, N. V.

    2018-05-01

    A universal economic and mathematical model designed for determining optimal management strategies for the production and logistics subsystems (and subsystem components) of an enterprise is considered. This declared universality allows both production components, including limitations on the ways of converting raw materials and components into sold goods, and resource and logical restrictions on input and output material flows to be taken into account at the system level. The presented model and the generated control problems are developed within a unified approach that allows logical conditions of any complexity to be implemented and the corresponding formal optimization tasks to be defined. The conceptual meaning of the criteria and limitations used is explained. The generated mixed-programming tasks are shown to belong to the class NP. An approximate polynomial algorithm for solving the posed mixed-programming optimization tasks of realistic dimension and high computational complexity is proposed. Results of testing the algorithm on tasks across a wide range of dimensions are presented.

  10. Second-order Cosmological Perturbations Engendered by Point-like Masses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brilenkov, Ruslan; Eingorn, Maxim, E-mail: ruslan.brilenkov@gmail.com, E-mail: maxim.eingorn@gmail.com

    2017-08-20

    In the ΛCDM framework, presenting nonrelativistic matter inhomogeneities as discrete massive particles, we develop the second-order cosmological perturbation theory. Our approach relies on the weak gravitational field limit. The derived equations for the second-order scalar, vector, and tensor metric corrections are suitable at arbitrary distances, including regions with nonlinear contrasts of the matter density. We thoroughly verify fulfillment of all Einstein equations, as well as self-consistency of order assignments. In addition, we obtain logically consistent results in the Minkowski background limit. Feasible investigations of cosmological back-reaction manifestations by means of relativistic simulations are also outlined.

  11. Framework for a clinical information system.

    PubMed

    Van De Velde, R; Lansiers, R; Antonissen, G

    2002-01-01

    The design and implementation of a Clinical Information System architecture is presented. This architecture has been developed and implemented based on components, following a strong underlying conceptual and technological model. Common Object Request Broker Architecture (CORBA) and n-tier technology, featuring centralised and departmental clinical information systems as the back-end store for all clinical data, are used. Servers located in the "middle" tier apply the clinical (business) model and application rules. The main characteristics are the focus on modelling and the reuse of both data and business logic. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for this approach.

  12. Knowledge representation in fuzzy logic

    NASA Technical Reports Server (NTRS)

    Zadeh, Lotfi A.

    1989-01-01

    The author presents a summary of the basic concepts and techniques underlying the application of fuzzy logic to knowledge representation. He then describes a number of examples relating to its use as a computational system for dealing with uncertainty and imprecision in the context of knowledge, meaning, and inference. It is noted that one of the basic aims of fuzzy logic is to provide a computational framework for knowledge representation and inference in an environment of uncertainty and imprecision. In such environments, fuzzy logic is effective when the solutions need not be precise and/or it is acceptable for a conclusion to have a dispositional rather than categorical validity. The importance of fuzzy logic derives from the fact that there are many real-world applications which fit these conditions, especially in the realm of knowledge-based systems for decision-making and control.

  13. EAGLE Monitors by Collecting Facts and Generating Obligations

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing a range of finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. A monitor for an EAGLE formula checks if a finite trace of states satisfies the given formula. We present, in detail, an algorithm for the synthesis of monitors for EAGLE. The algorithm is implemented as a Java application and involves novel techniques for rule definition, manipulation and execution. Monitoring is achieved on a state-by-state basis, avoiding any need to store the input trace of states. Our initial experiments have been successful as EAGLE detected a previously unknown bug while testing a planetary rover controller.

  14. Probabilistic segmentation and intensity estimation for microarray images.

    PubMed

    Gottardo, Raphael; Besag, Julian; Stephens, Matthew; Murua, Alejandro

    2006-01-01

    We describe a probabilistic approach to simultaneous image segmentation and intensity estimation for complementary DNA microarray experiments. The approach overcomes several limitations of existing methods. In particular, it (a) uses a flexible Markov random field approach to segmentation that allows for a wider range of spot shapes than existing methods, including relatively common 'doughnut-shaped' spots; (b) models the image directly as background plus hybridization intensity, and estimates the two quantities simultaneously, avoiding the common logical error that estimates of foreground may be less than those of the corresponding background if the two are estimated separately; and (c) uses a probabilistic modeling approach to simultaneously perform segmentation and intensity estimation, and to compute spot quality measures. We describe two approaches to parameter estimation: a fast algorithm, based on the expectation-maximization and the iterated conditional modes algorithms, and a fully Bayesian framework. These approaches produce comparable results, and both appear to offer some advantages over other methods. We use an HIV experiment to compare our approach to two commercial software products: Spot and Arrayvision.

  15. New fundamental evidence of non-classical structure in the combination of natural concepts.

    PubMed

    Aerts, D; Sozzo, S; Veloz, T

    2016-01-13

    We recently performed cognitive experiments on conjunctions and negations of two concepts with the aim of investigating the combination problem of concepts. Our experiments confirmed the deviations (conceptual vagueness, underextension, overextension etc.) from the rules of classical (fuzzy) logic and probability theory observed by several scholars in concept theory, while our data were successfully modelled in a quantum-theoretic framework developed by ourselves. In this paper, we isolate a new, very stable and systematic pattern of violation of classicality that occurs in concept combinations. In addition, the strength and regularity of this non-classical effect leads us to believe that it occurs at a more fundamental level than the deviations observed up to now. It is our opinion that we have identified a deep non-classical mechanism determining not only how concepts are combined but, rather, how they are formed. We show that this effect can be faithfully modelled in a two-sector Fock space structure, and that it can be exactly explained by assuming that human thought is the superposition of two processes, a 'logical reasoning', guided by 'logic', and a 'conceptual reasoning', guided by 'emergence', and that the latter generally prevails over the former. All these findings provide new fundamental support to our quantum-theoretic approach to human cognition. © 2015 The Author(s).

  16. Fuzzy logic-based assessment for mapping potential infiltration areas in low-gradient watersheds.

    PubMed

    Quiroz Londoño, Orlando Mauricio; Romanelli, Asunción; Lima, María Lourdes; Massone, Héctor Enrique; Martínez, Daniel Emilio

    2016-07-01

    This paper gives an account of the design of a logic-based approach for identifying potential infiltration areas in low-gradient watersheds based on remote sensing data. This methodological framework is applied in a sector of the Pampa Plain, Argentina, which has a high level of agricultural activity and large demands for groundwater supplies. Potential infiltration sites are assessed as a function of two primary topics: hydrologic and soil conditions. The model shows the state of each evaluated subwatershed with respect to its potential contribution to infiltration, based mainly on easily measurable and commonly used parameters: drainage density, geomorphologic units, soil media, land cover, slope and aspect (slope orientation). Mapped outputs from the logic model displayed 42% very low-low, 16% moderate, and 41% high-very high contribution to potential infiltration in the whole watershed. Subwatersheds in the upper and lower sections were identified as areas with high to very high potential infiltration according to the following media features: low drainage density (<1.5 km/km(2)), arable land and pastures as the main land-cover categories, sandy clay loam to loam - clay loam soils, and the geomorphological units named poorly drained plain, channelized drainage plain, and dunes and beaches. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. The role of integrated scientific approach facing the changing ocean policy. The case of the Mediterranean

    NASA Astrophysics Data System (ADS)

    Vallega, Adalberto

    1999-08-01

    In the mid-1980s, the debate about the role of oceanography vis-à-vis the evolving demand for ocean research was initiated in the framework of the Intergovernmental Oceanographic Commission (IOC) of UNESCO. That discussion was basically triggered by the need to meet the demand for research generated by the United Nations Conference on the Human Environment (1972). More recently, also as a consequence of inputs from the United Nations Conference on Environment and Development (UNCED, 1992), discussion of the role of oceanography in the framework of co-operation between physical and social disciplines was initiated, focusing on the prospect of building up an integrated ocean science. The prospect of ocean science as designed by IOC (1984) was aimed only at integrating the branches of oceanography. To discuss how that design could be implemented on the basis of progress in epistemology and ocean policy, the subject is examined at three levels: i) the epistemological level, where the option between positivism-based and constructivism-based epistemologies has arisen; ii) the logical level, where the option concerns disjunctive versus conjunctive logic; and iii) the methodological level, where the option regards the analytical-deductive and the inductive-axiomatic methods. The thesis is advanced that, to meet the demand for management-oriented research, the pathway including constructivist epistemology, conjunctive logic and inductive-axiomatic methods could usefully be adopted as the cement of inter-disciplinarity. The second part of the paper is concerned with the Mediterranean and with how holism-referred and management-aimed investigations might be conducted by applying the above conceptual approach: i) presenting the individual emerging subject areas on which the demand for management patterns is expected to focus in the medium and long run; ii) illustrating the major aspects of the individual subject areas to be investigated; iii) deducing what leading role might be assigned to oceanography in building up inter-disciplinary approaches; and iv) identifying which disciplines might be stimulated to co-operate. The subject areas considered with reference to the Mediterranean include: i) integrated coastal management; ii) deep-ocean coastal uses, with special consideration of living-resources management; and iii) the protection of biodiversity. Ocean Geographical Information Systems (OGIS), data management, and education and training are presented as intersecting research areas calling for inter-disciplinary approaches. In conclusion, a breakdown of questions on which discussion might be concentrated is provided.

  18. Informing the Gestalt: An Ethical Framework for Allocating Scarce Federal Public Health and Medical Resources to States During Disasters

    PubMed Central

    Knebel, Ann R.; Sharpe, Virginia A.; Danis, Marion; Toomey, Lauren M.; Knickerbocker, Deborah K.

    2017-01-01

    During catastrophic disasters, government leaders must decide how to efficiently and effectively allocate scarce public health and medical resources. The literature about triage decision making at the individual patient level is substantial, and the National Response Framework provides guidance about the distribution of responsibilities between federal and state governments. However, little has been written about the decision-making process of federal leaders in disaster situations when resources are not sufficient to meet the needs of several states simultaneously. We offer an ethical framework and logic model for decision making in such circumstances. We adapted medical triage and the federalism principle to the decision-making process for allocating scarce federal public health and medical resources. We believe that the logic model provides a values-based framework that can inform the gestalt during the iterative decision process used by federal leaders as they allocate scarce resources to states during catastrophic disasters. PMID:24612854

  19. Expanding a First-Order Logic Mitigation Framework to Handle Multimorbid Patient Preferences

    PubMed Central

    Michalowski, Martin; Wilk, Szymon; Rosu, Daniela; Kezadri, Mounira; Michalowski, Wojtek; Carrier, Marc

    2015-01-01

    The increasing prevalence of multimorbidity is a challenge for physicians who have to manage a constantly growing number of patients with simultaneous diseases. Adding to this challenge is the need to incorporate patient preferences as key components of the care process, thanks in part to the emergence of personalized and participatory medicine. In our previous work we proposed a framework employing first order logic to represent clinical practice guidelines (CPGs) and to mitigate possible adverse interactions when concurrently applying multiple CPGs to a multimorbid patient. In this paper, we describe extensions to our methodological framework that (1) broaden our definition of revision operators to support required and desired types of revisions defined in secondary knowledge sources, and (2) expand the mitigation algorithm to apply revisions based on their type. We illustrate the capabilities of the expanded framework using a clinical case study of a multimorbid patient with stable cardiac artery disease who suffers a sudden onset of deep vein thrombosis. PMID:26958226

  20. A Framework for Hierarchical Perception-Action Learning Utilizing Fuzzy Reasoning.

    PubMed

    Windridge, David; Felsberg, Michael; Shaukat, Affan

    2013-02-01

    Perception-action (P-A) learning is an approach to cognitive system building that seeks to reduce the complexity associated with conventional environment-representation/action-planning approaches. Instead, actions are directly mapped onto the perceptual transitions that they bring about, eliminating the need for intermediate representation and significantly reducing training requirements. We here set out a very general learning framework for cognitive systems in which online learning of the P-A mapping may be conducted within a symbolic processing context, so that complex contextual reasoning can influence the P-A mapping. In utilizing a variational calculus approach to define a suitable objective function, the P-A mapping can be treated as an online learning problem via gradient descent using partial derivatives. Our central theoretical result is to demonstrate top-down modulation of low-level perceptual confidences via the Jacobian of the higher levels of a subsumptive P-A hierarchy. Thus, the separation of the Jacobian as a multiplying factor between levels within the objective function naturally enables the integration of abstract symbolic manipulation in the form of fuzzy deductive logic into the P-A mapping learning. We experimentally demonstrate that the resulting framework achieves significantly better accuracy than using P-A learning without top-down modulation. We also demonstrate that it permits novel forms of context-dependent multilevel P-A mapping, applying the mechanism in the context of an intelligent driver assistance system.

  1. Causal learning and inference as a rational process: the new synthesis.

    PubMed

    Holyoak, Keith J; Cheng, Patricia W

    2011-01-01

    Over the past decade, an active line of research within the field of human causal learning and inference has converged on a general representational framework: causal models integrated with Bayesian probabilistic inference. We describe this new synthesis, which views causal learning and inference as a fundamentally rational process, and review a sample of the empirical findings that support the causal framework over associative alternatives. Causal events, like all events in the distal world as opposed to our proximal perceptual input, are inherently unobservable. A central assumption of the causal approach is that humans (and potentially nonhuman animals) have been designed in such a way as to infer the most invariant causal relations for achieving their goals based on observed events. In contrast, the associative approach assumes that learners only acquire associations among important observed events, omitting the representation of the distal relations. By incorporating Bayesian inference over distributions of causal strength and causal structures, along with noisy-logical (i.e., causal) functions for integrating the influences of multiple causes on a single effect, human judgments about causal strength and structure can be predicted accurately for relatively simple causal structures. Dynamic models of learning based on the causal framework can explain patterns of acquisition observed with serial presentation of contingency data and are consistent with available neuroimaging data. The approach has been extended to a diverse range of inductive tasks, including category-based and analogical inferences.
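
    The "noisy-logical" strength estimate mentioned here has a compact closed form for a generative cause (Cheng's causal power); the contingency numbers in the snippet below are a made-up example.

      def causal_power(p_e_given_c, p_e_given_not_c):
          """Generative causal power: (P(e|c) - P(e|~c)) / (1 - P(e|~c))."""
          delta_p = p_e_given_c - p_e_given_not_c
          return delta_p / (1.0 - p_e_given_not_c)

      # 18 of 24 exposed cases show the effect; 6 of 24 unexposed cases do.
      print(f"causal power = {causal_power(18/24, 6/24):.2f}")   # -> 0.67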

  2. An Assessment of Agency Theory as a Framework for the Government-University Relationship

    ERIC Educational Resources Information Center

    Kivisto, Jussi

    2008-01-01

    The aim of this paper is to use agency theory as the theoretical framework for an examination of the government-university relationship and to assess the main strengths and weaknesses of the theory in this context. Because of its logically consistent framework, agency theory is able to manifest many of the complexities and difficulties that…

  3. Improving the human readability of Arden Syntax medical logic modules using a concept-oriented terminology and object-oriented programming expressions.

    PubMed

    Choi, Jeeyae; Bakken, Suzanne; Lussier, Yves A; Mendonça, Eneida A

    2006-01-01

    Medical logic modules are a procedural representation for sharing task-specific knowledge for decision support systems. Based on the premise that clinicians may perceive object-oriented expressions as easier to read than procedural rules in Arden Syntax-based medical logic modules, we developed a method for improving the readability of medical logic modules. Two approaches were applied: exploiting the concept-oriented features of the Medical Entities Dictionary and building an executable Java program to replace Arden Syntax procedural expressions. The usability evaluation showed that 66% of participants successfully mapped all Arden Syntax rules to Java methods. These findings suggest that these approaches can play an essential role in the creation of human readable medical logic modules and can potentially increase the number of clinical experts who are able to participate in the creation of medical logic modules. Although our approaches are broadly applicable, we specifically discuss the relevance to concept-oriented nursing terminologies and automated processing of task-specific nursing knowledge.

  4. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. Likely and by default are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  5. Integrating fuzzy logic, optimization, and GIS for ecological impact assessments.

    PubMed

    Bojórquez-Tapia, Luis A; Juárez, Lourdes; Cruz-Bello, Gustavo

    2002-09-01

    Appraisal of ecological impacts has been problematic because the behavior of ecological systems and their responses to human intervention are far from fully understood. While it has been relatively easy to itemize the potential ecological impacts, it has been difficult to arrive at accurate predictions of how these impacts affect populations, communities, or ecosystems. Furthermore, the spatial heterogeneity of ecological systems has been overlooked because its examination is practically impossible through matrix techniques, the most commonly used impact assessment approach. Besides, the public has become increasingly aware of the importance of the EIA in decision-making, and thus the interpretation of impact significance is complicated further by the different value judgments of stakeholders. Moreover, impact assessments are carried out with a minimum of data, high uncertainty, and poor conceptual understanding. Hence, the evaluation of ecological impacts entails the integration of subjective and often conflicting judgments from a variety of experts and stakeholders. The purpose of this paper is to present an environmental impact assessment approach based on the integration of fuzzy logic, geographical information systems, and optimization techniques. This approach enables environmental analysts to deal with the intrinsic imprecision and ambiguity associated with the judgments of experts and stakeholders, the description of ecological systems, and the prediction of ecological impacts. The application of this approach is illustrated through an example, which shows how consensus about impact mitigation can be attained within a conflict resolution framework.
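
    To make the fuzzy-logic ingredient concrete, here is a minimal sketch (invented membership functions and scales, not the authors' model) in which impact significance is inferred from fuzzy "low"/"high" memberships of impact magnitude and ecosystem fragility.

```python
# Illustrative sketch: invented memberships on a 0-1 scale.
def low(x):   # membership in "low" (left shoulder function)
    return max(0.0, min(1.0, (0.5 - x) / 0.5))

def high(x):  # membership in "high" (right shoulder function)
    return max(0.0, min(1.0, (x - 0.5) / 0.5))

def significance(magnitude, fragility):
    """Two fuzzy rules: low & low -> low significance; high & high -> high."""
    rule_low = min(low(magnitude), low(fragility))
    rule_high = min(high(magnitude), high(fragility))
    total = rule_low + rule_high
    # weighted-average defuzzification (low maps to 0, high maps to 1)
    return 0.5 if total == 0 else rule_high / total

print(round(significance(0.8, 0.7), 2))  # high significance for a large impact on fragile ground
```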

  6. Integrating Fuzzy Logic, Optimization, and GIS for Ecological Impact Assessments

    NASA Astrophysics Data System (ADS)

    Bojórquez-Tapia, Luis A.; Juárez, Lourdes; Cruz-Bello, Gustavo

    2002-09-01

    Appraisal of ecological impacts has been problematic because the behavior of ecological systems and their responses to human intervention are far from fully understood. While it has been relatively easy to itemize the potential ecological impacts, it has been difficult to arrive at accurate predictions of how these impacts affect populations, communities, or ecosystems. Furthermore, the spatial heterogeneity of ecological systems has been overlooked because its examination is practically impossible through matrix techniques, the most commonly used impact assessment approach. Besides, the public has become increasingly aware of the importance of the EIA in decision-making, and thus the interpretation of impact significance is complicated further by the different value judgments of stakeholders. Moreover, impact assessments are carried out with a minimum of data, high uncertainty, and poor conceptual understanding. Hence, the evaluation of ecological impacts entails the integration of subjective and often conflicting judgments from a variety of experts and stakeholders. The purpose of this paper is to present an environmental impact assessment approach based on the integration of fuzzy logic, geographical information systems, and optimization techniques. This approach enables environmental analysts to deal with the intrinsic imprecision and ambiguity associated with the judgments of experts and stakeholders, the description of ecological systems, and the prediction of ecological impacts. The application of this approach is illustrated through an example, which shows how consensus about impact mitigation can be attained within a conflict resolution framework.

  7. Fallacies and fantasies: the theoretical underpinnings of the Coexistence Approach for palaeoclimate reconstruction

    NASA Astrophysics Data System (ADS)

    Grimm, G. W.; Potts, A. J.

    2015-12-01

    The Coexistence Approach has been used to infer palaeoclimates for many Eurasian fossil plant assemblages. However, the theory that underpins the method has never been examined in detail. Here we discuss acknowledged and implicit assumptions, and assess the statistical nature and pseudo-logic of the method. We also compare the Coexistence Approach theory with the active field of species distribution modelling. We argue that the assumptions will inevitably be violated to some degree and that the method has no means to identify and quantify these violations. The lack of a statistical framework makes the method highly vulnerable to the vagaries of statistical outliers and exotic elements. In addition, we find numerous logical inconsistencies, such as how climate shifts are quantified (the use of a "center value" of a coexistence interval) and the ability to reconstruct "extinct" climates from modern plant distributions. Given the problems that have surfaced in species distribution modelling, accurate and precise quantitative reconstructions of palaeoclimates (or even climate shifts) using the nearest-living-relative principle and rectilinear niches (the basis of the method) will not be possible. The Coexistence Approach can be summarised as an exercise that shoe-horns a plant fossil assemblage into coexistence and then naively assumes that this must be the climate. Given the theoretical issues, and methodological issues highlighted elsewhere, we suggest that the method be discontinued and that all past reconstructions be disregarded and revisited using less fallacious methods.

  8. Semantic framework for mapping object-oriented model to semantic web languages

    PubMed Central

    Ježek, Petr; Mouček, Roman

    2015-01-01

    The article deals with and discusses two main approaches to building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on the one hand, and the use of formal representations of ontologies known from knowledge representation, such as description logics or semantic web languages, on the other hand. Although knowledge engineering offers languages supporting richer semantic means of expression and technologically advanced approaches, conventional data structures and repositories are still popular among developers, administrators and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed using them. As one of the possible solutions, this semantics can be added into the structures of the programming language that accesses and processes the underlying data. To support this idea we introduced a software prototype that enables its users to add semantically richer expressions into Java object-oriented code. This approach does not burden users with additional demands on the programming environment, since reflective Java annotations were used as an entry point for these expressions. Moreover, the additional semantics need not be written by the programmer directly in the code, but can be collected from non-programmers using a graphical user interface. The mapping that allows the transformation of the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. This approach was validated by the integration of the Semantic Framework in the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework. PMID:25762923

  9. Semantic framework for mapping object-oriented model to semantic web languages.

    PubMed

    Ježek, Petr; Mouček, Roman

    2015-01-01

    The article deals with and discusses two main approaches to building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on the one hand, and the use of formal representations of ontologies known from knowledge representation, such as description logics or semantic web languages, on the other hand. Although knowledge engineering offers languages supporting richer semantic means of expression and technologically advanced approaches, conventional data structures and repositories are still popular among developers, administrators and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed using them. As one of the possible solutions, this semantics can be added into the structures of the programming language that accesses and processes the underlying data. To support this idea we introduced a software prototype that enables its users to add semantically richer expressions into Java object-oriented code. This approach does not burden users with additional demands on the programming environment, since reflective Java annotations were used as an entry point for these expressions. Moreover, the additional semantics need not be written by the programmer directly in the code, but can be collected from non-programmers using a graphical user interface. The mapping that allows the transformation of the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. This approach was validated by the integration of the Semantic Framework in the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework.

  10. Psychological first aid following trauma: implementation and evaluation framework for high-risk organizations.

    PubMed

    Forbes, David; Lewis, Virginia; Varker, Tracey; Phelps, Andrea; O'Donnell, Meaghan; Wade, Darryl J; Ruzek, Josef I; Watson, Patricia; Bryant, Richard A; Creamer, Mark

    2011-01-01

    International clinical practice guidelines for the management of psychological trauma recommend Psychological First Aid (PFA) as an early intervention for survivors of potentially traumatic events. These recommendations are consensus-based, and there is little published evidence assessing the effectiveness of PFA. This is not surprising given the nature of the intervention and the complicating factors involved in any evaluation of PFA. There is, nevertheless, an urgent need for stronger evidence evaluating its effectiveness. The current paper posits that the implementation and evaluation of PFA within high-risk organizational settings is an ideal place to start. The paper provides a framework for a phasic approach to implementing PFA within such settings and presents a model for evaluating its effectiveness using a logic- or theory-based approach which considers both pre-event and post-event factors. Phases 1 and 2 of the PFA model are pre-event actions, and phases 3 and 4 are post-event actions. It is hoped that by using the Phased PFA model and evaluation method proposed in this paper, future researchers will begin to undertake the important task of building the evidence about the most effective approach to providing PFA in high-risk organizational and community disaster settings.

  11. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy

    PubMed Central

    Knijnenburg, Theo A.; Klau, Gunnar W.; Iorio, Francesco; Garnett, Mathew J.; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F. A.

    2016-01-01

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present ‘Logic Optimization for Binary Input to Continuous Output’ (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models. PMID:27876821
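
    A toy version of the logic-model search conveys the flavor (the actual LOBICO method solves an optimized combinatorial formulation; this sketch merely enumerates four candidate models over invented data):

```python
# Enumerate tiny logic models over two binary mutation features and keep the
# one with least squared error against a continuous drug-response value.

X = [(0, 0), (0, 1), (1, 0), (1, 1)]   # binary mutation features per cell line
y = [0.1, 0.1, 0.1, 0.9]               # continuous drug response (invented)

models = {
    "f1":        lambda a, b: a,
    "f2":        lambda a, b: b,
    "f1 AND f2": lambda a, b: a and b,
    "f1 OR f2":  lambda a, b: a or b,
}

def sse(model):
    """Squared error of the 0/1 logic prediction against the continuous output."""
    return sum((float(model(a, b)) - t) ** 2 for (a, b), t in zip(X, y))

best = min(models, key=lambda name: sse(models[name]))
print(best, round(sse(models[best]), 3))  # the AND combination beats single features
```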

  12. A motion-constraint logic for moving-base simulators based on variable filter parameters

    NASA Technical Reports Server (NTRS)

    Miller, G. K., Jr.

    1974-01-01

    A motion-constraint logic for moving-base simulators has been developed that is a modification to the linear second-order filters generally employed in conventional constraints. In the modified constraint logic, the filter parameters are not constant but vary with the instantaneous motion-base position to increase the constraint as the system approaches the positional limits. With the modified constraint logic, accelerations larger than originally expected are limited while conventional linear filters would result in automatic shutdown of the motion base. In addition, the modified washout logic has frequency-response characteristics that are an improvement over conventional linear filters with braking for low-frequency pilot inputs. During simulated landing approaches of an externally blown flap short take-off and landing (STOL) transport using decoupled longitudinal controls, the pilots were unable to detect much difference between the modified constraint logic and the logic based on linear filters with braking.
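
    The variable-parameter idea can be sketched as follows (assumed quadratic gain schedule and toy numbers, not the NASA constraint logic): the filter's natural frequency rises as the platform approaches its travel limit, so the restoring term stiffens exactly where the constraint must be strongest.

```python
# Second-order constraint filter whose natural frequency grows near the limit.

dt, x_lim = 0.01, 1.0          # time step [s], travel limit [m]
x = v = 0.0                    # platform position and velocity
zeta = 0.7                     # damping ratio

def omega_n(x):
    """Assumed gain schedule: natural frequency grows as |x| nears the limit."""
    return 1.0 + 4.0 * (abs(x) / x_lim) ** 2

log = []
for k in range(500):
    a_cmd = 2.0 if k < 100 else 0.0            # commanded acceleration pulse
    wn = omega_n(x)
    # x'' + 2*zeta*wn*x' + wn^2*x = a_cmd  ->  solve for acceleration
    a = a_cmd - 2.0 * zeta * wn * v - wn**2 * x
    v += a * dt
    x += v * dt
    log.append(x)

print(round(max(log), 3))      # excursion stays bounded by the stiffening constraint
```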

  13. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy.

    PubMed

    Knijnenburg, Theo A; Klau, Gunnar W; Iorio, Francesco; Garnett, Mathew J; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F A

    2016-11-23

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present 'Logic Optimization for Binary Input to Continuous Output' (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models.

  14. Depicting the logic of three evaluation theories.

    PubMed

    Hansen, Mark; Alkin, Marvin C; Wallace, Tanner Lebaron

    2013-06-01

    Here, we describe the development of logic models depicting three theories of evaluation practice: Practical Participatory (Cousins & Whitmore, 1998), Values-engaged (Greene, 2005a, 2005b), and Emergent Realist (Mark et al., 1998). We begin with a discussion of evaluation theory and the particular theories that were chosen for our analysis. We then outline the steps involved in constructing the models. The theoretical prescriptions and claims represented here follow a logic model template developed at the University of Wisconsin-Extension (Taylor-Powell & Henert, 2008), which also closely aligns with Mark's (2008) framework for research on evaluation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity

    PubMed Central

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-01-01

    Recent experimental breakthroughs have finally made it possible to implement in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can work for this scope: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single and double ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity). PMID:25976626
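
    As a toy illustration of a stochastic two-input gate (invented occupancy and steepness parameters; the paper derives gate behavior from the full statistical-mechanical MWC treatment), the sketch below produces graded, noisy AND-like outputs rather than clean Boolean ones for intermediate ligand concentrations.

```python
import math

def occupancy(conc, K):
    """Fraction of sites bound at ligand concentration conc (dissociation constant K)."""
    return conc / (conc + K)

def noisy_and(c1, c2, K1=1.0, K2=1.0, steepness=8.0):
    """Steep logistic of joint occupancy: a smooth, noise-tolerant AND."""
    joint = occupancy(c1, K1) * occupancy(c2, K2)
    return 1.0 / (1.0 + math.exp(-steepness * (joint - 0.5)))

for c1, c2 in [(0.1, 0.1), (10, 0.1), (10, 10)]:
    print((c1, c2), round(noisy_and(c1, c2), 2))   # only (high, high) switches on
```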

  16. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity.

    PubMed

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-05-15

    Recent experimental breakthroughs have finally made it possible to implement in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can work for this scope: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single and double ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity).

  17. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-05-01

    Recent experimental breakthroughs have finally made it possible to implement in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can work for this scope: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single and double ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity).

  18. Logic. Geometry Module for Use in a Mathematics Laboratory Setting.

    ERIC Educational Resources Information Center

    Brotherton, Sheila; And Others

    Within this single module there are two approaches to this brief survey of logic. Since most geometry textbooks fail to give an adequate discussion of logic, a "textbook" treatment of the subject has been included. This is found as explanations interspersed in the exercises and these can be used as a textbook approach. However, also included is an…

  19. Improvements to Earthquake Location with a Fuzzy Logic Approach

    NASA Astrophysics Data System (ADS)

    Gökalp, Hüseyin

    2018-01-01

    In this study, improvements to the earthquake location method were investigated using a fuzzy logic approach proposed by Lin and Sanford (Bull Seismol Soc Am 91:82-93, 2001). The method has certain advantages over inverse methods in that it handles uncertainties in arrival times and reading errors. Adopting this approach, epicentral locations were determined from a fuzzy logic space that accounts for the uncertainties in the velocity models. To map the arrival-time uncertainties into the fuzzy logic space, a trapezoidal membership function was constructed directly from the travel-time difference between two stations for the P- and S-arrival times, rather than from P- and S-wave velocity models, eliminating the need for information about the velocity structure of the study area. The results showed that this method worked most effectively when earthquakes occurred away from the network or when the arrival-time data contained phase reading errors. To resolve the problems related to determining the epicentral locations of the events, a forward modeling method like the grid search technique was used, applying different logical operations (i.e., intersection, union, and their combination) within the fuzzy logic approach. Event locations were based on the fuzzy logic outputs evaluated over a gridded region. Determining locations by defuzzifying only the grid points with a membership value of 1, obtained by normalizing the maximum fuzzy output values, yielded more reliable epicentral locations for the earthquakes than the other approaches. In addition, throughout the process, the center-of-gravity method was used as the defuzzification operation.
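
    The grid-based fuzzy intersection is straightforward to sketch (fabricated station geometry and uncertainties; not the study's data or exact membership construction): each station pair contributes a trapezoidal membership band around its observed arrival-time difference, and the bands are intersected over a grid.

```python
import numpy as np

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true_epi = np.array([4.0, 6.0])
v = 3.0                                      # assumed wave speed [km/s]

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside [a, d], 1 on [b, c], linear flanks."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

# synthetic arrival-time differences between station pairs (from the true epicentre)
pairs = [(0, 1), (0, 2), (1, 2)]
obs = [np.linalg.norm(true_epi - stations[i]) / v
       - np.linalg.norm(true_epi - stations[j]) / v for i, j in pairs]

gx, gy = np.meshgrid(np.linspace(0, 10, 101), np.linspace(0, 10, 101))
membership = np.ones_like(gx)
for (i, j), t_obs in zip(pairs, obs):
    t = (np.hypot(gx - stations[i][0], gy - stations[i][1])
         - np.hypot(gx - stations[j][0], gy - stations[j][1])) / v
    # trapezoid centred on the observed difference; width = reading uncertainty
    membership = np.minimum(membership, trapezoid(t, t_obs - 0.3, t_obs - 0.1,
                                                  t_obs + 0.1, t_obs + 0.3))

iy, ix = np.unravel_index(np.argmax(membership), membership.shape)
print(gx[iy, ix], gy[iy, ix])               # near the true epicentre (4, 6)
```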

  20. A Logic-Based Psychotherapy Approach to Treating Patients Which Focuses on Faultless Logical Functioning: A Case Study Method

    PubMed Central

    Almeida, Fernando; Moreira, Diana

    2017-01-01

    Many clinical patients present to mental health clinics with depressive symptoms, anxiety, psychosomatic complaints, and sleeping problems. These symptoms may originate from marital problems, conflictual interpersonal relationships, problems in securing work, and housing issues, among many others. These issues underlie the difficulties that patients face in maintaining faultless logical reasoning (FLR) and faultless logical functioning (FLF). FLR implies correctly assessing premises, rules, and conclusions; FLF implies assessing not only FLR, but also the circumstances, life experience, personality, and events that validate a conclusion. Almost always, the symptomatology is accompanied by intense emotional changes. Clinical experience shows that a logic-based psychotherapy (LBP) approach is not practiced, and that therapists resort to psychopharmacotherapy or other types of psychotherapeutic approaches that are not focused on logical reasoning and, especially, logical functioning. Because of this, patients do not learn to overcome their reasoning and functioning errors. The aim of this work was to investigate how LBP works to improve the patients' ability to think and function in a faultless logical way; for this purpose, we describe the case studies of three patients. With this psychotherapeutic approach, patients gain knowledge that can then be applied not only to the issues that led them to the consultation, but also to other problems they have experienced, thus creating a learning experience and helping to prevent such patients from becoming involved in similar problematic situations. This highlights that LBP is a way of treating symptoms that interfere on some level with daily functioning. This psychotherapeutic approach is relevant for improving patients' quality of life, and it fills a gap in the literature by describing original case analyses. PMID:29312088

  1. Developing a monitoring and evaluation framework to integrate and formalize the informal waste and recycling sector: the case of the Philippine National Framework Plan.

    PubMed

    Serrona, Kevin Roy B; Yu, Jeongsoo; Aguinaldo, Emelita; Florece, Leonardo M

    2014-09-01

    The Philippines has been making inroads in solid waste management with the enactment and implementation of the Republic Act 9003 or the Ecological Waste Management Act of 2000. Said legislation has had tremendous influence in terms of how the national and local government units confront the challenges of waste management in urban and rural areas using the reduce, reuse, recycle and recovery framework or 4Rs. One of the sectors needing assistance is the informal waste sector whose aspiration is legal recognition of their rank and integration of their waste recovery activities in mainstream waste management. To realize this, the Philippine National Solid Waste Management Commission initiated the formulation of the National Framework Plan for the Informal Waste Sector, which stipulates approaches, strategies and methodologies to concretely involve the said sector in different spheres of local waste management, such as collection, recycling and disposal. What needs to be fleshed out is the monitoring and evaluation component in order to gauge qualitative and quantitative achievements vis-a-vis the Framework Plan. In the process of providing an enabling environment for the informal waste sector, progress has to be monitored and verified qualitatively and quantitatively and measured against activities, outputs, objectives and goals. Using the Framework Plan as the reference, this article developed monitoring and evaluation indicators using the logical framework approach in project management. The primary objective is to institutionalize monitoring and evaluation, not just in informal waste sector plans, but in any waste management initiatives to ensure that envisaged goals are achieved. © The Author(s) 2014.
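
    As a schematic of what such logframe-derived monitoring and evaluation indicators look like in machine-readable form (entirely hypothetical entries, for illustration only), each logframe level carries objectively verifiable indicators and means of verification so progress can be checked level by level.

```python
# Hypothetical logical-framework matrix for an informal waste sector programme.
logframe = {
    "goal": {
        "statement": "Informal waste sector integrated into mainstream waste management",
        "indicators": ["% of registered informal waste workers"],
        "verification": ["national registry reports"],
    },
    "purpose": {
        "statement": "Legal recognition of informal waste recovery activities",
        "indicators": ["number of local ordinances recognising the sector"],
        "verification": ["municipal records"],
    },
    "outputs": {
        "statement": "Waste workers organised and trained",
        "indicators": ["number of cooperatives formed", "number of workers trained"],
        "verification": ["training attendance sheets"],
    },
}

for level, row in logframe.items():
    print(level.upper(), "-", row["statement"])
    for indicator in row["indicators"]:
        print("   indicator:", indicator)
```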

  2. A conceptual review of decision making in social dilemmas: applying a logic of appropriateness.

    PubMed

    Weber, J Mark; Kopelman, Shirli; Messick, David M

    2004-01-01

    Despite decades of experimental social dilemma research, "theoretical integration has proven elusive" (Smithson & Foddy, 1999, p. 14). To advance a theory of decision making in social dilemmas, this article provides a conceptual review of the literature that applies a "logic of appropriateness" (March, 1994) framework. The appropriateness framework suggests that people making decisions ask themselves (explicitly or implicitly), "What does a person like me do in a situation like this?" This question identifies 3 significant factors: recognition and classification of the kind of situation encountered, the identity of the individual making the decision, and the application of rules or heuristics in guiding behavioral choice. In contrast with dominant rational choice models, the proposed appropriateness framework accommodates the inherently social nature of social dilemmas and the role of rule- and heuristic-based processing. Implications for the interpretation of past findings and the direction of future research are discussed.

  3. A new approach of active compliance control via fuzzy logic control for multifingered robot hand

    NASA Astrophysics Data System (ADS)

    Jamil, M. F. A.; Jalani, J.; Ahmad, A.

    2016-07-01

    Safety is a vital issue in Human-Robot Interaction (HRI). In order to guarantee safety in HRI, model-reference impedance control can be a very useful approach for introducing compliant control. In particular, this paper establishes a fuzzy logic compliance control (i.e. active compliance control) to reduce impact and forces during physical interaction between humans/objects and robots. Exploiting a virtual mass-spring-damper system allows us to determine a desired compliance level by understanding the behavior of the model-reference impedance control. The performance of the fuzzy logic compliance control is tested in simulation for a robotic hand known as the RED Hand. The results show that fuzzy logic is a feasible control approach, particularly for controlling position and providing compliant control. In addition, the fuzzy logic control allows us to simplify the controller design process (i.e. avoid complex computation) when dealing with nonlinearities and uncertainties.
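
    The virtual mass-spring-damper reference is easy to sketch numerically (toy gains and a force pulse; not the RED Hand controller): a sensed contact force drives a compliant position command, which is what makes the finger "give way" on impact.

```python
# Virtual impedance reference: M*x'' + B*x' + K*x = f_ext, integrated forward.
M, B, K = 0.5, 8.0, 40.0        # virtual mass, damping, stiffness (toy values)
dt = 0.001
x = v = 0.0                     # deviation from the nominal finger position

trajectory = []
for step in range(2000):
    f_ext = 5.0 if step < 500 else 0.0     # external contact force pulse [N]
    a = (f_ext - B * v - K * x) / M        # solve the model for acceleration
    v += a * dt
    x += v * dt
    trajectory.append(x)

print(round(max(trajectory), 4))  # compliant deflection; x returns to 0 after release
```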

  4. Research on teacher education programs: logic model approach.

    PubMed

    Newton, Xiaoxia A; Poon, Rebecca C; Nunes, Nicole L; Stone, Elisa M

    2013-02-01

    Teacher education programs in the United States face increasing pressure to demonstrate their effectiveness through pupils' learning gains in classrooms where program graduates teach. The link between teacher candidates' learning in teacher education programs and pupils' learning in K-12 classrooms implicit in the policy discourse suggests a one-to-one correspondence. However, the logical steps leading from what teacher candidates have learned in their programs to what they are doing in classrooms that may contribute to their pupils' learning are anything but straightforward. In this paper, we argue that the logic model approach from scholarship on evaluation can enhance research on teacher education by making explicit the logical links between program processes and intended outcomes. We demonstrate the usefulness of the logic model approach through our own work on designing a longitudinal study that focuses on examining the process and impact of an undergraduate mathematics and science teacher education program. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. The Illness Narratives of Health Managers: Developing an Analytical Framework

    ERIC Educational Resources Information Center

    Exworthy, Mark

    2011-01-01

    This paper examines the personal experience of illness and healthcare by health managers through their illness narratives. By synthesising a wider literature of illness narratives and health management, an analytical framework is presented, which considers the impact of illness narratives, comprising the logic of illness narratives, the actors…

  6. Framework for Transforming Departmental Culture to Support Educational Innovation

    ERIC Educational Resources Information Center

    Corbo, Joel C.; Reinholz, Daniel L.; Dancy, Melissa H.; Deetz, Stanley; Finkelstein, Noah

    2016-01-01

    This paper provides a research-based framework for promoting institutional change in higher education. To date, most educational change efforts have focused on relatively narrow subsets of the university system (e.g., faculty teaching practices or administrative policies) and have been largely driven by implicit change logics; both of these…

  7. Intelligent microchip networks: an agent-on-chip synthesis framework for the design of smart and robust sensor networks

    NASA Astrophysics Data System (ADS)

    Bosse, Stefan

    2013-05-01

    Sensorial materials consisting of high-density, miniaturized, and embedded sensor networks require new robust and reliable data processing and communication approaches. Structural health monitoring is one major field of application for sensorial materials. Each sensor node provides some kind of sensor, electronics, data processing, and communication with a strong focus on microchip-level implementation to meet the goals of miniaturization and low-power energy environments, a prerequisite for autonomous behaviour and operation. Reliability requires robustness of the entire system in the presence of node, link, data processing, and communication failures. Interaction between nodes is required to manage and distribute information. One common interaction model is the mobile agent. An agent approach provides stronger autonomy than a traditional object or remote-procedure-call based approach. Agents can decide for themselves which actions are performed, and they are capable of flexible behaviour, reacting to the environment and other agents, providing some degree of robustness. Traditionally, multi-agent systems are abstract programming models which are implemented in software and executed on program-controlled computer architectures. This approach does not scale well to the microchip level, requires fully equipped computers and communication structures, and the hardware architecture does not consider and reflect the requirements for agent processing and interaction. We propose and demonstrate a novel design paradigm for reliable distributed data processing systems and a synthesis methodology and framework for multi-agent systems implementable entirely at the microchip level with resource- and power-constrained digital logic, supporting Agent-on-Chip (AoC) architectures. The agent behaviour and mobility are fully integrated on the microchip using pipelined communicating processes implemented with finite-state machines and register-transfer logic. The agent behaviour, interaction (communication), and mobility features are modelled and specified on a machine-independent abstract programming level using a state-based agent behaviour language (APL). With this APL, a high-level agent compiler is able to synthesize a hardware model (RTL, VHDL), a software model (C, ML), or a simulation model (XML) suitable for simulating a multi-agent system using the SeSAm simulator framework. Agent communication is provided by a simple tuple-space database implemented on the node level, providing fault-tolerant access to global data. A novel synthesis development kit (SynDK) based on a graph-structured database approach is introduced to support the rapid development of compilers and synthesis tools, used for example for the design and implementation of the APL compiler.

  8. Interpretation of IEEE-854 floating-point standard and definition in the HOL system

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.

    1995-01-01

    The ANSI/IEEE Standard 854-1987 for floating-point arithmetic is interpreted by converting the lexical descriptions in the standard into mathematical conditional descriptions organized in tables. The standard is represented in higher-order logic within the framework of the HOL (Higher Order Logic) system. The paper is divided into two parts, with the first part covering the interpretation and the second part the description in HOL.

  9. Agent Based Modeling and Simulation Framework for Supply Chain Risk Management

    DTIC Science & Technology

    2012-03-01

    …(Christopher and Peck 2004); macroeconomic, policy, competition, and resource (Ghoshal 1987); value chain, operational, event, and recurring (Shi 2004)… clustering algorithms in agent logic to protect company privacy (da Silva et al. 2006), aggregation of domain context in agent data analysis logic (Xiang…) …Operational Availability (OA) for FMC and PMC. Mission Capable (MICAP) Hours is the measure of total time (in a month) consumable or reparable…

  10. Guidance for modeling causes and effects in environmental problem solving

    USGS Publications Warehouse

    Armour, Carl L.; Williamson, Samuel C.

    1988-01-01

    Environmental problems are difficult to solve because their causes and effects are not easily understood. When attempts are made to analyze causes and effects, the principal challenge is organization of information into a framework that is logical, technically defensible, and easy to understand and communicate. When decisionmakers attempt to solve complex problems before an adequate cause and effect analysis is performed, there are serious risks. These risks include: greater reliance on subjective reasoning, lessened chance for scoping an effective problem solving approach, impaired recognition of the need for supplemental information to attain understanding, increased chance for making unsound decisions, and lessened chance for gaining approval and financial support for a program. Cause and effect relationships can be modeled. This type of modeling has been applied to various environmental problems, including cumulative impact assessment (Dames and Moore 1981; Meehan and Weber 1985; Williamson et al. 1987; Raley et al. 1988) and evaluation of effects of quarrying (Sheate 1986). This guidance for field users was written because of the current interest in documenting cause-effect logic as a part of ecological problem solving. Principal literature sources relating to the modeling approach are: Riggs and Inouye (1975a, b), Erickson (1981), and United States Office of Personnel Management (1986).

  11. A systematic grounded approach to the development of complex interventions: the Australian WorkHealth Program--arthritis as a case study.

    PubMed

    Reavley, Nicola; Livingston, Jenni; Buchbinder, Rachelle; Bennell, Kim; Stecki, Chris; Osborne, Richard Harry

    2010-02-01

    Despite demands for evidence-based research and practice, little attention has been given to systematic approaches to the development of complex interventions to tackle workplace health problems. This paper outlines an approach to the initial stages of a workplace program development which integrates health promotion and disease management. The approach commences with systematic and genuine processes of obtaining information from key stakeholders with broad experience of these interventions. This information is constructed into a program framework in which practice-based and research-informed elements are both valued. We used this approach to develop a workplace education program to reduce the onset and impact of a common chronic disease - osteoarthritis. To gain information systematically at a national level, a structured concept mapping workshop with 47 participants from across Australia was undertaken. Participants were selected to maximise the whole-of-workplace perspective and included health education providers, academics, clinicians and policymakers. Participants generated statements in response to a seeding statement: Thinking as broadly as possible, what changes in education and support should occur in the workplace to help in the prevention and management of arthritis? Participants grouped the resulting statements into conceptually coherent groups and a computer program was used to generate a 'cluster map' along with a list of statements sorted according to cluster membership. In combination with research-based evidence, the concept map informed the development of a program logic model incorporating the program's guiding principles, possible service providers, services, training modes, program elements and the causal processes by which participants might benefit. The program logic model components were further validated through research findings from diverse fields, including health education, coaching, organisational learning, workplace interventions, workforce development and osteoarthritis disability prevention. In summary, wide and genuine consultation, concept mapping, and evidence-based program logic development were integrated to develop a whole-of-system complex intervention whose potential effectiveness and assimilation into the workplace were optimised. Copyright 2009 Elsevier Ltd. All rights reserved.

  12. A novel way of integrating rule-based knowledge into a web ontology language framework.

    PubMed

    Gamberger, Dragan; Krstaçić, Goran; Jović, Alan

    2013-01-01

    Web ontology language (OWL), used in combination with the Protégé visual interface, is a modern standard for development and maintenance of ontologies and a powerful tool for knowledge presentation. In this work, we describe a novel possibility to use OWL also for the conceptualization of knowledge presented by a set of rules. In this approach, rules are represented as a hierarchy of actionable classes with necessary and sufficient conditions defined by the description logic formalism. The advantages are that: the set of the rules is not an unordered set anymore, the concepts defined in descriptive ontologies can be used directly in the bodies of rules, and Protégé presents an intuitive tool for editing the set of rules. Standard ontology reasoning processes are not applicable in this framework, but experiments conducted on the rule sets have demonstrated that the reasoning problems can be successfully solved.
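
    The rule-as-class idea can be mimicked in plain code (a loose sketch in Python rather than OWL/description logic, with invented clinical thresholds): a rule becomes an "actionable class" whose necessary-and-sufficient membership test doubles as the rule's firing condition.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    age: int
    systolic_bp: int

class HighRiskPatient:
    """Actionable class: membership defined by necessary and sufficient conditions."""
    @staticmethod
    def contains(p: Patient) -> bool:
        # invented thresholds, for illustration only
        return p.age > 65 and p.systolic_bp > 160

p = Patient(age=70, systolic_bp=170)
if HighRiskPatient.contains(p):      # classification plays the role of rule firing
    print("apply high-risk protocol")
```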

  13. Cell Fate Reprogramming by Control of Intracellular Network Dynamics

    PubMed Central

    Zañudo, Jorge G. T.; Albert, Réka

    2015-01-01

    Identifying control strategies for biological networks is paramount for practical applications that involve reprogramming a cell’s fate, such as disease therapeutics and stem cell reprogramming. Here we develop a novel network control framework that integrates the structural and functional information available for intracellular networks to predict control targets. Formulated in a logical dynamic scheme, our approach drives any initial state to the target state with 100% effectiveness and needs to be applied only transiently for the network to reach and stay in the desired state. We illustrate our method’s potential to find intervention targets for cancer treatment and cell differentiation by applying it to a leukemia signaling network and to the network controlling the differentiation of helper T cells. We find that the predicted control targets are effective in a broad dynamic framework. Moreover, several of the predicted interventions are supported by experiments. PMID:25849586
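
    A toy Boolean network shows the control principle (an invented three-node network, not the paper's leukemia or T-cell models): transiently pinning one node drives every initial state into the target attractor.

```python
from itertools import product

def step(state, pin_c=None):
    a, b, c = state
    return (b or not c,                               # A* = B OR NOT C
            a,                                        # B* = A
            (not a) if pin_c is None else pin_c)      # C* = NOT A, unless pinned

def settle(state, pin_c=None, steps=20):
    for _ in range(steps):
        state = step(state, pin_c)
    return state

# without control, trajectories end in several attractors (two fixed points
# plus a 2-cycle), so the end state depends on the initial condition
print({settle(s) for s in product([False, True], repeat=3)})
# pinning C = OFF drives every initial state to the (1, 1, 0) attractor
print({settle(s, pin_c=False) for s in product([False, True], repeat=3)})
```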

  14. Cyberspatial mechanics.

    PubMed

    Bayne, Jay S

    2008-06-01

    In support of a generalization of systems theory, this paper introduces a new approach in modeling complex distributed systems. It offers an analytic framework for describing the behavior of interactive cyberphysical systems (CPSs), which are networked stationary or mobile information systems responsible for the real-time governance of physical processes whose behaviors unfold in cyberspace. The framework is predicated on a cyberspace-time reference model comprising three spatial dimensions plus time. The spatial domains include geospatial, infospatial, and sociospatial references, the latter describing relationships among sovereign enterprises (rational agents) that choose voluntarily to organize and interoperate for individual and mutual benefit through geospatial (physical) and infospatial (logical) transactions. Of particular relevance to CPSs are notions of timeliness and value, particularly as they relate to the real-time governance of physical processes and engagements with other cooperating CPSs. Our overarching interest, as with celestial mechanics, is in the formation and evolution of clusters of cyberspatial objects and the federated systems they form.

  15. Interventions developed with the Intervention Mapping protocol in the field of cancer: A systematic review.

    PubMed

    Lamort-Bouché, Marion; Sarnin, Philippe; Kok, Gerjo; Rouat, Sabrina; Péron, Julien; Letrilliart, Laurent; Fassier, Jean-Baptiste

    2018-04-01

    The Intervention Mapping (IM) protocol provides a structured framework to develop, implement, and evaluate complex interventions. The main objective of this review was to identify and describe the content of the interventions developed in the field of cancer with the IM protocol. Secondary objectives were to assess their fidelity to the IM protocol and to review their theoretical frameworks. Medline, Web of Science, PsycINFO, PASCAL, FRANCIS, and BDSP databases were searched. All titles and abstracts were reviewed. A standardized extraction form was developed. All included studies were reviewed by 2 reviewers blinded to each other. Sixteen studies were identified, and these reported 15 interventions. The objectives were to increase cancer screening participation (n = 7), early consultation (n = 1), and aftercare/quality of life among cancer survivors (n = 7). Six reported a complete participatory planning group, and 7 described a complete logic model of the problem. Ten studies described a complete logic model of change. The main theoretical frameworks used were the theory of planned behaviour (n = 8), the transtheoretical model (n = 6), the health belief model (n = 6), and the social cognitive theory (n = 6). The environment was rarely integrated in the interventions (n = 4). Five interventions were reported as effective. Culturally relevant interventions were developed with the IM protocol that were effective to increase cancer screening and reduce social disparities, particularly when they were developed through a participative approach and integrated the environment. Stakeholders' involvement and the role of the environment were heterogeneously integrated in the interventions. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Cybernetic systems based on inductive logic

    NASA Astrophysics Data System (ADS)

    Fry, Robert L.

    2001-05-01

    Recent work in the area of inductive logic suggests that cybernetics might be quantified and reduced to engineering practice. If so, then there are considerable implications for engineering, science, and other fields. This paper attempts to capture the essential ideas of cybernetics cast in the light of inductive logic. The described inductive logic extends conventional logic by adding a conjugate logical domain of questions to the logical domain of assertions intrinsic to Boolean Algebra with which most are familiar. This was first posited and developed by Richard Cox. Interestingly enough, these two logical domains, one of questions and the other of assertions, only exist relative to one another, with each possessing natural measures of entropy and probability, respectively. Examples are given that highlight the utility of cybernetic approaches to neuroscience, algorithm design, system engineering, and the design and understanding of defensive and offensive systems. For example, the application of cybernetic approaches to defense systems suggests that these systems possess a wavefunction which, as in quantum mechanics, collapses when we "look" through the eyes of the system sensors such as radars and optical sensors.

  17. A Data Model Framework for the Characterization of a Satellite Data Handling Software

    NASA Astrophysics Data System (ADS)

    Camatto, Gianluigi; Tipaldi, Massimo; Bothmer, Wolfgang; Ferraguto, Massimo; Bruenjes, Bernhard

    2014-08-01

    This paper describes an approach for modelling the characterization and configuration data yielded when developing a Satellite Data Handling Software (DHSW). The model can then be used as an input for the preparation of the logical and physical representation of the Satellite Reference Database (SRDB) contents and the related SW suite, an essential product that allows transferring information between the different system stakeholders as well as producing part of the DHSW documentation and artefacts. Special attention is given to the shaping of the general Parameter concept, which is shared by a number of different entities within a Space System.

  18. The outcome competency framework for practitioners in infection prevention and control: use of the outcome logic model for evaluation.

    PubMed

    Burnett, E; Curran, E; Loveday, H P; Kiernan, M A; Tannahill, M

    2014-01-01

    Healthcare is delivered in a dynamic environment with frequent changes in populations, methods, equipment and settings. Infection prevention and control practitioners (IPCPs) must ensure that they are competent in addressing the challenges they face and are equipped to develop infection prevention and control (IPC) services in line with a changing world of healthcare provision. A multifaceted Framework was developed to assist IPCPs to enhance competence at an individual, team and organisational level to enable quality performance and improved quality of care. However, if these aspirations are to be met, it is vital that competency frameworks are fit for purpose or they risk being ignored. The aim of this unique study was to evaluate short and medium term outcomes as set out in the Outcome Logic Model to assist with the evaluation of the impact and success of the Framework. This study found that while the Framework is being used effectively in some areas, it is not being used as much or in the ways that were anticipated. The findings will enable future work on revision, communication and dissemination, and will provide intelligence to those initiating education and training in the utilisation of the competences.

  19. The outcome competency framework for practitioners in infection prevention and control: use of the outcome logic model for evaluation

    PubMed Central

    Curran, E; Loveday, HP; Kiernan, MA; Tannahill, M

    2013-01-01

    Healthcare is delivered in a dynamic environment with frequent changes in populations, methods, equipment and settings. Infection prevention and control practitioners (IPCPs) must ensure that they are competent in addressing the challenges they face and are equipped to develop infection prevention and control (IPC) services in line with a changing world of healthcare provision. A multifaceted Framework was developed to assist IPCPs to enhance competence at an individual, team and organisational level to enable quality performance and improved quality of care. However, if these aspirations are to be met, it is vital that competency frameworks are fit for purpose or they risk being ignored. The aim of this unique study was to evaluate short and medium term outcomes as set out in the Outcome Logic Model to assist with the evaluation of the impact and success of the Framework. This study found that while the Framework is being used effectively in some areas, it is not being used as much or in the ways that were anticipated. The findings will enable future work on revision, communication and dissemination, and will provide intelligence to those initiating education and training in the utilisation of the competences. PMID:28989348

  20. Intelligent manipulation technique for multi-branch robotic systems

    NASA Technical Reports Server (NTRS)

    Chen, Alexander Y. K.; Chen, Eugene Y. S.

    1990-01-01

    New analytical development in kinematics planning is reported. The INtelligent KInematics Planner (INKIP) consists of the kinematics spline theory and the adaptive logic annealing process. Also, a novel framework of robot learning mechanism is introduced. The FUzzy LOgic Self Organized Neural Networks (FULOSONN) integrates fuzzy logic in commands, control, searching, and reasoning, the embedded expert system for nominal robotics knowledge implementation, and the self organized neural networks for the dynamic knowledge evolutionary process. Progress on the mechanical construction of SRA Advanced Robotic System (SRAARS) and the real time robot vision system is also reported. A decision was made to incorporate the Local Area Network (LAN) technology in the overall communication system.

  1. Cognitive pathways and historical research.

    PubMed

    Sutherland, J A

    1997-01-01

    The nursing literature is replete with articles detailing the logical reasoning processes required by the individual scientist to implement the rigors of research and theory development. Much less attention has been focused on creative and critical thinking as modes for deriving explanations, inferences, and conclusions essential to science as a product. Historical research, as a particular kind of qualitative research, is dependent on and compatible with such mental strategies as logical, creative, and critical thinking. These strategies depict an intellectual framework for the scientist examining archival data and offer a structure for such inquiry. A model for analyzing historical data delineating the cognitive pathways of logical reasoning, creative processing, and critical thinking is proposed.

  2. Neural networks and logical reasoning systems: a translation table.

    PubMed

    Martins, J; Mendes, R V

    2001-04-01

    A correspondence is established between the basic elements of logic reasoning systems (knowledge bases, rules, inference and queries) and the structure and dynamical evolution laws of neural networks. The correspondence is pictured as a translation dictionary which might allow one to go back and forth between symbolic and network formulations, a desirable step in learning-oriented systems and multicomputer networks. In the framework of Horn clause logics, it is found that atomic propositions with n arguments correspond to nodes with nth-order synapses, rules to synaptic intensity constraints, forward chaining to synaptic dynamics, and queries either to simple node activation or to a query tensor dynamics.
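
    The translation can be illustrated with a minimal forward chainer (a toy knowledge base; synaptic weights and thresholds are reduced here to an "all body atoms active" test): propositions become nodes, Horn rules become connections, and forward chaining becomes iterated activation until a fixed point.

```python
facts = {"bird": 1, "penguin": 1, "flies": 0, "swims": 0}
# Horn rules as (body, head): the head fires when all body nodes are active
rules = [({"bird"}, "flies"), ({"penguin"}, "swims")]

def forward_chain(facts, rules):
    changed = True
    while changed:                      # iterate the "dynamics" to a fixed point
        changed = False
        for body, head in rules:
            if all(facts.get(p) for p in body) and not facts.get(head):
                facts[head] = 1         # node activation = derived atom
                changed = True
    return facts

print(forward_chain(facts, rules))      # both 'flies' and 'swims' become active
```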

  3. Reasoning on Weighted Delegatable Authorizations

    NASA Astrophysics Data System (ADS)

    Ruan, Chun; Varadharajan, Vijay

    This paper studies logic-based methods for representing and evaluating complex access control policies needed by modern database applications. In our framework, authorization and delegation rules are specified in a Weighted Delegatable Authorization Program (WDAP), which is an extended logic program. We show how extended logic programs can be used to specify complex security policies that support weighted administrative privilege delegation, weighted positive and negative authorizations, and weighted authorization propagation. We also propose a conflict resolution method that enables flexible delegation control by considering priorities of authorization grantors and weights of authorizations. A number of rules are provided to achieve delegation depth control, conflict resolution, and authorization and delegation propagation.
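
    A minimal sketch of weight-based conflict resolution in this spirit (the tie-breaking order, grantor priorities and all names are illustrative assumptions, not the WDAP semantics):

    ```python
    # Minimal sketch: resolve conflicting positive/negative authorizations by
    # grantor priority first, then by authorization weight (assumed ordering).

    from dataclasses import dataclass

    @dataclass
    class Auth:
        subject: str
        obj: str
        sign: int              # +1 grant, -1 deny
        weight: float          # authorization weight
        grantor_priority: int  # priority of the grantor

    def decide(auths, subject, obj):
        relevant = [a for a in auths if a.subject == subject and a.obj == obj]
        if not relevant:
            return "deny"      # closed-world default
        # Higher grantor priority wins; weight breaks ties at equal priority.
        best = max(relevant, key=lambda a: (a.grantor_priority, a.weight))
        return "grant" if best.sign > 0 else "deny"

    auths = [
        Auth("alice", "table1", +1, weight=0.6, grantor_priority=1),
        Auth("alice", "table1", -1, weight=0.9, grantor_priority=2),
    ]
    print(decide(auths, "alice", "table1"))  # deny: higher-priority grantor
    ```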

  4. An Institutional Perspective on Accountable Care Organizations.

    PubMed

    Goodrick, Elizabeth; Reay, Trish

    2016-12-01

    We employ aspects of institutional theory to explore how Accountable Care Organizations (ACOs) can effectively manage the multiplicity of ideas and pressures within which they are embedded and consequently better serve patients and their communities. More specifically, we draw on the concept of institutional logics to highlight the importance of understanding the conflicting principles upon which ACOs were founded. Based on previous research conducted both inside and outside health care settings, we argue that ACOs can combine attention to these principles (or institutional logics) in different ways; the options fall on a continuum from (a) segregating the effects of multiple logics by compartmentalizing responses to them to (b) fully hybridizing the different logics. We suggest that the most productive path for ACOs is to situate their approach between the two extremes of "segregating" and "fully hybridizing." This strategic approach allows ACOs to develop effective responses that combine logics without fully integrating them. We identify three ways that ACOs can embrace institutional complexity short of fully hybridizing disparate logics: (1) reinterpreting practices to make them compatible with other logics; (2) engaging in strategies that take advantage of existing synergy between conflicting logics; and (3) creating opportunities for people on the front line to develop innovative ways of working that combine multiple logics. © The Author(s) 2016.

  5. Can composite digital monitoring biomarkers come of age? A framework for utilization.

    PubMed

    Kovalchick, Christopher; Sirkar, Rhea; Regele, Oliver B; Kourtis, Lampros C; Schiller, Marie; Wolpert, Howard; Alden, Rhett G; Jones, Graham B; Wright, Justin M

    2017-12-01

    The application of digital monitoring biomarkers in health, wellness and disease management is reviewed. Harnessing the near limitless capacity of these approaches in the managed healthcare continuum will benefit from a systems-based architecture which presents data quality, quantity, and ease of capture within a decision-making dashboard. A framework was developed which stratifies key components and advances the concept of contextualized biomarkers. The framework codifies how direct, indirect, composite, and contextualized composite data can drive innovation for the application of digital biomarkers in healthcare. The de novo framework implies consideration of physiological, behavioral, and environmental factors in the context of biomarker capture and analysis. Application in disease and wellness is highlighted, and incorporation in clinical feedback loops and closed-loop systems is illustrated. The study of contextualized biomarkers has the potential to offer rich and insightful data for clinical decision making. Moreover, advancement of the field will benefit from innovation at the intersection of medicine, engineering, and science. Technological developments in this dynamic field will thus fuel its logical evolution guided by inputs from patients, physicians, healthcare providers, end-payors, actuaries, medical device manufacturers, and drug companies.

  6. Frameworks for evaluating health research capacity strengthening: a qualitative study

    PubMed Central

    2013-01-01

    Background Health research capacity strengthening (RCS) projects are often complex and hard to evaluate. In order to inform health RCS evaluation efforts, we aimed to describe and compare key characteristics of existing health RCS evaluation frameworks: their process of development, purpose, target users, structure, content and coverage of important evaluation issues. A secondary objective was to explore what use had been made of the ESSENCE framework, which attempts to address one such issue: harmonising the evaluation requirements of different funders. Methods We identified and analysed health RCS evaluation frameworks published by seven funding agencies between 2004 and 2012, using a mixed methods approach involving structured qualitative analyses of documents, a stakeholder survey and consultations with key contacts in health RCS funding agencies. Results The frameworks were intended for use predominantly by the organisations themselves, and most were oriented primarily towards funders’ internal organisational performance requirements. The frameworks made limited reference to theories that specifically concern RCS. Generic devices, such as logical frameworks, were typically used to document activities, outputs and outcomes, but with little emphasis on exploring underlying assumptions or contextual constraints. Usage of the ESSENCE framework appeared limited. Conclusions We believe that there is scope for improving frameworks through the incorporation of more accessible information about how to do evaluation in practice; greater involvement of stakeholders, following evaluation capacity building principles; greater emphasis on explaining underlying rationales of frameworks; and structuring frameworks so that they separate generic and project-specific aspects of health RCS evaluation. The third and fourth of these improvements might assist harmonisation. PMID:24330628

  7. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model

    ERIC Educational Resources Information Center

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  8. Mediation Analysis in a Latent Growth Curve Modeling Framework

    ERIC Educational Resources Information Center

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  9. Conceptualising the effectiveness of impact assessment processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chanchitpricha, Chaunjit, E-mail: chaunjit@g.sut.ac.th; Bond, Alan, E-mail: alan.bond@uea.ac.uk; Unit for Environmental Sciences and Management School of Geo and Spatial Sciences, Internal Box 375, North West University

    2013-11-15

    This paper aims at conceptualising the effectiveness of impact assessment processes through the development of a literature-based framework of criteria to measure impact assessment effectiveness. Four categories of effectiveness were established: procedural, substantive, transactive and normative, each containing a number of criteria; no studies have previously brought together all four of these categories into such a comprehensive, criteria-based framework and undertaken systematic evaluation of practice. The criteria can be mapped within a cycle (or cycles) of evaluation, based on the ‘logic model’, at the stages of input, process, output and outcome to enable the identification of connections between the criteria across the categories of effectiveness. This framework is considered to have potential application in measuring the effectiveness of many impact assessment processes, including strategic environmental assessment (SEA), environmental impact assessment (EIA), social impact assessment (SIA) and health impact assessment (HIA). -- Highlights: • Conceptualising effectiveness of impact assessment processes. • Identification of factors influencing effectiveness of impact assessment processes. • Development of criteria within a framework for evaluating IA effectiveness. • Applying the logic model to examine connections between effectiveness criteria.

  10. Eco-logical successes : second edition, January 2012

    DOT National Transportation Integrated Search

    2012-01-01

    In 2006, leaders from eight Federal agencies signed the interagency document Eco-Logical: An Ecosystem Approach to Developing Infrastructure Projects. Eco-Logical is a document that outlines a shared vision of how to develop infrastructure projects in...

  11. Biosensors with Built-In Biomolecular Logic Gates for Practical Applications

    PubMed Central

    Lai, Yu-Hsuan; Sun, Sin-Cih; Chuang, Min-Chieh

    2014-01-01

    Molecular logic gates, designs constructed with biological and chemical molecules, have emerged as an alternative computing approach to silicon-based logic operations. These molecular computers are capable of receiving and integrating multiple stimuli of biochemical significance to generate a definitive output, opening a new research avenue to advanced diagnostics and therapeutics which demand handling of complex factors and precise control. In molecularly gated devices, Boolean logic computations can be activated by specific inputs and accurately processed via bio-recognition, bio-catalysis, and selective chemical reactions. In this review, we survey recent advances of the molecular logic approaches to practical applications of biosensors, including designs constructed with proteins, enzymes, nucleic acids, nanomaterials, and organic compounds, as well as the research avenues for future development of digitally operating “sense and act” schemes that logically process biochemical signals through networked circuits to implement intelligent control systems. PMID:25587423
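
    For instance, a two-input biochemical AND gate reduces, at the signal-processing level, to threshold detection on two input concentrations (a minimal sketch; the thresholds and units are invented, not values from the review):

    ```python
    # Minimal sketch: a biomolecular AND gate abstracted as two concentration
    # thresholds (e.g., two analytes recognized by bio-recognition elements).

    def and_gate(conc_a, conc_b, thr_a=1.0, thr_b=1.0):
        """Output 1 only if both biochemical inputs exceed their thresholds."""
        return int(conc_a >= thr_a and conc_b >= thr_b)

    for a, b in [(0.2, 0.1), (1.5, 0.3), (1.5, 2.0)]:
        print(a, b, "->", and_gate(a, b))
    ```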

  12. Microelectromechanical reprogrammable logic device.

    PubMed

    Hafiz, M A A; Kosuru, L; Younis, M I

    2016-03-29

    In modern computing, the Boolean logic operations are set by interconnect schemes between the transistors. As miniaturization at the component level to enhance computational power rapidly approaches physical limits, alternative computing methods are being vigorously pursued. One desired aspect of future computing approaches is provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator operated at room temperature and under modest vacuum conditions, reprogrammable by the a.c.-driving frequency. The device is fabricated using a complementary metal oxide semiconductor-compatible mass fabrication process, suitable for on-chip integration, and promises an alternative electromechanical computing scheme.
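
    A minimal numerical sketch of the reprogramming principle (the Lorentzian response and all parameter values are invented; in the device the resonance is shifted electrothermally by the logic inputs):

    ```python
    # Minimal sketch: logic inputs A and B shift a resonator's resonance
    # frequency; the output is "1" when the fixed a.c. drive frequency hits
    # the resulting resonance. Changing only the drive frequency re-targets
    # which input combination resonates, reprogramming the gate.

    F0, SHIFT, LINEWIDTH = 100_000.0, 500.0, 120.0  # Hz, assumed values

    def resonance(a, b):
        # Each asserted input heats the beam and lowers resonance by SHIFT.
        return F0 - SHIFT * (a + b)

    def output(a, b, drive):
        # Lorentzian amplitude; "1" if the drive sits within the linewidth.
        detune = drive - resonance(a, b)
        amp = 1.0 / (1.0 + (2.0 * detune / LINEWIDTH) ** 2)
        return int(amp > 0.5)

    for drive, name in [(F0 - 2 * SHIFT, "AND"), (F0, "NOR")]:
        table = {(a, b): output(a, b, drive) for a in (0, 1) for b in (0, 1)}
        print(name, table)
    ```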

  13. Microelectromechanical reprogrammable logic device

    PubMed Central

    Hafiz, M. A. A.; Kosuru, L.; Younis, M. I.

    2016-01-01

    In modern computing, the Boolean logic operations are set by interconnect schemes between the transistors. As miniaturization at the component level to enhance computational power rapidly approaches physical limits, alternative computing methods are being vigorously pursued. One desired aspect of future computing approaches is provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator operated at room temperature and under modest vacuum conditions, reprogrammable by the a.c.-driving frequency. The device is fabricated using a complementary metal oxide semiconductor-compatible mass fabrication process, suitable for on-chip integration, and promises an alternative electromechanical computing scheme. PMID:27021295

  14. Knowledge acquisition in the fuzzy knowledge representation framework of a medical consultation system.

    PubMed

    Boegl, Karl; Adlassnig, Klaus-Peter; Hayashi, Yoichi; Rothenfluh, Thomas E; Leitich, Harald

    2004-01-01

    This paper describes the fuzzy knowledge representation framework of the medical computer consultation system MedFrame/CADIAG-IV as well as the specific knowledge acquisition techniques that have been developed to support the definition of knowledge concepts and inference rules. As in its predecessor system CADIAG-II, fuzzy medical knowledge bases are used to model the uncertainty and the vagueness of medical concepts, and fuzzy logic reasoning mechanisms provide the basic inference processes. The elicitation and acquisition of medical knowledge from domain experts has often been described as the most difficult and time-consuming task in knowledge-based system development in medicine. It comes as no surprise that this is even more so when unfamiliar representations like fuzzy membership functions are to be acquired. From previous projects we have learned that a user-centered approach is mandatory in complex and ill-defined knowledge domains such as internal medicine. This paper describes the knowledge acquisition framework that has been developed in order to make the three main tasks easier and more accessible: (a) defining medical concepts; (b) providing appropriate interpretations for patient data; and (c) constructing inferential knowledge in a fuzzy knowledge representation framework. Special emphasis is placed on the motivations for some system design and data modeling decisions. The theoretical framework has been implemented in a software package, the Knowledge Base Builder Toolkit. The conception and the design of this system reflect the need for a user-centered, intuitive, and easy-to-handle tool. First results from pilot studies have shown that our approach can be successfully implemented in the context of a complex fuzzy theoretical framework. As a result, this critical aspect of knowledge-based system development can be accomplished more easily.

  15. A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models

    DTIC Science & Technology

    2007-11-01

    [Only fragments of this report's abstract and acronym glossary survive extraction.] Acronym glossary (partial): Description Logic; SOA, Service Oriented Architecture; SPARQL, Simple Protocol And RDF Query Language; SQL, Standard Query Language; SROM, Stability and... Surviving text fragments: one model can be related to another by providing a more expressive ontological structure for one of the models (e.g., semantic networks can be mapped to first-order logical structures); Pellet is an open-source reasoner that works with OWL-DL, accepts the SPARQL protocol and RDF query language (SPARQL), and provides a Java API.

  16. Using fuzzy logic to determine the vulnerability of marine species to climate change.

    PubMed

    Jones, Miranda C; Cheung, William W L

    2018-02-01

    Marine species are being impacted by climate change and ocean acidification, although their level of vulnerability varies due to differences in species' sensitivity, adaptive capacity and exposure to climate hazards. Due to limited data on the biological and ecological attributes of many marine species, as well as inherent uncertainties in the assessment process, climate change vulnerability assessments in the marine environment frequently focus on a limited number of taxa or geographic ranges. As climate change is already impacting marine biodiversity and fisheries, there is an urgent need to expand vulnerability assessment to cover a large number of species and areas. Here, we develop a modelling approach to synthesize data on species-specific estimates of exposure, and ecological and biological traits to undertake an assessment of vulnerability (sensitivity and adaptive capacity) and risk of impacts (combining exposure to hazards and vulnerability) of climate change (including ocean acidification) for global marine fishes and invertebrates. We use a fuzzy logic approach to accommodate the variability in data availability and uncertainties associated with inferring vulnerability levels from climate projections and species' traits. Applying the approach to estimate the relative vulnerability and risk of impacts of climate change in 1074 exploited marine species globally, we estimated their index of vulnerability and risk of impacts to be on average 52 ± 19 SD and 66 ± 11 SD, scaling from 1 to 100, with 100 being the most vulnerable and highest risk, respectively, under the 'business-as-usual' greenhouse gas emission scenario (Representative Concentration Pathway 8.5). We identified 157 species to be highly vulnerable while 294 species are identified as being at high risk of impacts. Species that are most vulnerable tend to be large-bodied endemic species. This study suggests that the fuzzy logic framework can help estimate climate vulnerabilities and risks of exploited marine species using publicly and readily available information. © 2017 John Wiley & Sons Ltd.
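
    A minimal sketch of the kind of fuzzy inference involved (the membership functions, rules and defuzzified 1-100 scale are invented placeholders, not the paper's calibrated system):

    ```python
    # Minimal sketch: Mamdani-style fuzzy rules mapping sensitivity and
    # exposure (0-1 inputs) to a vulnerability index on a 1-100 scale.

    def tri(x, a, b, c):
        """Triangular membership function on [a, c] peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def vulnerability(sensitivity, exposure):
        s_high = tri(sensitivity, 0.4, 1.0, 1.6)   # "high sensitivity"
        e_high = tri(exposure, 0.4, 1.0, 1.6)      # "high exposure"
        s_low, e_low = 1.0 - s_high, 1.0 - e_high
        # Rules: min for AND; assumed output class centroids 20/50/85.
        rules = [
            (min(s_high, e_high), 85.0),   # high sens AND high exp -> high
            (min(s_high, e_low), 50.0),
            (min(s_low, e_high), 50.0),
            (min(s_low, e_low), 20.0),
        ]
        num = sum(w * v for w, v in rules)
        den = sum(w for w, v in rules)
        return num / den if den else 0.0   # centroid defuzzification

    print(round(vulnerability(0.9, 0.8), 1))   # leans toward the "high" end
    ```

    A virtue of this formulation, echoed in the abstract, is that sparse or uncertain trait data only soften the memberships rather than breaking the assessment.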

  17. Reverse engineering of logic-based differential equation models using a mixed-integer dynamic optimization approach

    PubMed Central

    Henriques, David; Rocha, Miguel; Saez-Rodriguez, Julio; Banga, Julio R.

    2015-01-01

    Motivation: Systems biology models can be used to test new hypotheses formulated on the basis of previous knowledge or new experimental data that contradict a previously existing model. New hypotheses often come in the shape of a set of possible regulatory mechanisms. This search is usually not limited to finding a single regulation link, but rather a combination of links, subject to great uncertainty or a lack of information about the kinetic parameters. Results: In this work, we combine a logic-based formalism, to describe all the possible regulatory structures for a given dynamic model of a pathway, with mixed-integer dynamic optimization (MIDO). This framework aims to simultaneously identify the regulatory structure (represented by binary parameters) and the real-valued parameters that are consistent with the available experimental data, resulting in a logic-based differential equation model. The alternative to this would be to perform real-valued parameter estimation for each possible model structure, which is not tractable for models of the size presented in this work. The performance of the method presented here is illustrated with several case studies: a synthetic pathway problem of signaling regulation, a two-component signal transduction pathway in bacterial homeostasis, and a signaling network in liver cancer cells. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: julio@iim.csic.es or saezrodriguez@ebi.ac.uk PMID:26002881

  18. Reverse engineering of logic-based differential equation models using a mixed-integer dynamic optimization approach.

    PubMed

    Henriques, David; Rocha, Miguel; Saez-Rodriguez, Julio; Banga, Julio R

    2015-09-15

    Systems biology models can be used to test new hypotheses formulated on the basis of previous knowledge or new experimental data that contradict a previously existing model. New hypotheses often come in the shape of a set of possible regulatory mechanisms. This search is usually not limited to finding a single regulation link, but rather a combination of links, subject to great uncertainty or a lack of information about the kinetic parameters. In this work, we combine a logic-based formalism, to describe all the possible regulatory structures for a given dynamic model of a pathway, with mixed-integer dynamic optimization (MIDO). This framework aims to simultaneously identify the regulatory structure (represented by binary parameters) and the real-valued parameters that are consistent with the available experimental data, resulting in a logic-based differential equation model. The alternative to this would be to perform real-valued parameter estimation for each possible model structure, which is not tractable for models of the size presented in this work. The performance of the method presented here is illustrated with several case studies: a synthetic pathway problem of signaling regulation, a two-component signal transduction pathway in bacterial homeostasis, and a signaling network in liver cancer cells. Supplementary data are available at Bioinformatics online. julio@iim.csic.es or saezrodriguez@ebi.ac.uk. © The Author 2015. Published by Oxford University Press.
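
    A minimal sketch of a logic-based ODE with binary structure parameters (an invented one-gene example; a MIDO solver would search jointly over the binary link switches and the real-valued parameters, which this sketch does not attempt):

    ```python
    # Minimal sketch: binary parameters w1, w2 switch candidate regulatory
    # links on or off inside a logic-based ODE dx/dt = (f - x)/tau, where f
    # combines included links with a fuzzy OR. Forward Euler integration.

    def hill(u, k=0.5, n=4):
        return u**n / (k**n + u**n)        # Hill activation function

    def simulate(w1, w2, tau=1.0, dt=0.01, steps=500):
        a, b, x = 1.0, 0.2, 0.0            # upstream signals and state
        for _ in range(steps):
            f = max(w1 * hill(a), w2 * hill(b))   # OR-gated logic input
            x += dt * (f - x) / tau
        return x

    # Enumerating the four candidate structures (feasible only for toy sizes;
    # hence the paper's mixed-integer optimization for realistic models):
    for w1, w2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
        print((w1, w2), round(simulate(w1, w2), 3))
    ```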

  19. Methodology for the specification of communication activities within the framework of a multi-layered architecture: Toward the definition of a knowledge base

    NASA Astrophysics Data System (ADS)

    Amyay, Omar

    A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue in the design of computer networks. The aim is to obtain an operational specification of the protocol-service couple of a given layer. A planned sequence of synthesis and verification steps constitutes a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and verification of the specification originating from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectories are proposed according to the style of the initial specification of the protocol-service couple: an operational type, from the service-supplier viewpoint, and a knowledge-property-oriented type, from the service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic and epistemic logic. The originality of the second specification trajectory and the use of epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge-base system implementing the proposed method. It is structured in three levels, representing knowledge of the domain, the reasoning characterizing synthesis and verification activities, and the planning of the steps of a specification trajectory.

  20. How to develop a theory-driven evaluation design? Lessons learned from an adolescent sexual and reproductive health programme in West Africa.

    PubMed

    Van Belle, Sara B; Marchal, Bruno; Dubourg, Dominique; Kegels, Guy

    2010-11-30

    This paper presents the development of a study design built on the principles of theory-driven evaluation. The theory-driven evaluation approach was used to evaluate an adolescent sexual and reproductive health intervention in Mali, Burkina Faso and Cameroon to improve continuity of care through the creation of networks of social and health care providers. Based on our experience and the existing literature, we developed a six-step framework for the design of theory-driven evaluations, which we applied in the ex-post evaluation of the networking component of the intervention. The protocol was drafted with the input of the intervention designer. The programme theory, the central element of theory-driven evaluation, was constructed on the basis of semi-structured interviews with designers, implementers and beneficiaries and an analysis of the intervention's logical framework. The six-step framework proved useful as it allowed for a systematic development of the protocol. We describe the challenges at each step. We found that there is little practical guidance in the existing literature, and also a mix up of terminology of theory-driven evaluation approaches. There is a need for empirical methodological development in order to refine the tools to be used in theory driven evaluation. We conclude that ex-post evaluations of programmes can be based on such an approach if the required information on context and mechanisms is collected during the programme.

  1. How to develop a theory-driven evaluation design? Lessons learned from an adolescent sexual and reproductive health programme in West Africa

    PubMed Central

    2010-01-01

    Background This paper presents the development of a study design built on the principles of theory-driven evaluation. The theory-driven evaluation approach was used to evaluate an adolescent sexual and reproductive health intervention in Mali, Burkina Faso and Cameroon to improve continuity of care through the creation of networks of social and health care providers. Methods/design Based on our experience and the existing literature, we developed a six-step framework for the design of theory-driven evaluations, which we applied in the ex-post evaluation of the networking component of the intervention. The protocol was drafted with the input of the intervention designer. The programme theory, the central element of theory-driven evaluation, was constructed on the basis of semi-structured interviews with designers, implementers and beneficiaries and an analysis of the intervention's logical framework. Discussion The six-step framework proved useful as it allowed for a systematic development of the protocol. We describe the challenges at each step. We found that there is little practical guidance in the existing literature, and also a mix up of terminology of theory-driven evaluation approaches. There is a need for empirical methodological development in order to refine the tools to be used in theory driven evaluation. We conclude that ex-post evaluations of programmes can be based on such an approach if the required information on context and mechanisms is collected during the programme. PMID:21118510

  2. Using a logical information model-driven design process in healthcare.

    PubMed

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  3. Center-TRACON Automation System (CTAS) En Route Trajectory Predictor Requirements and Capabilities

    NASA Technical Reports Server (NTRS)

    Vivona, Robert; Cate, Karen Tung

    2013-01-01

    This requirements framework document is designed to support the capture of requirements and capabilities for state-of-the-art trajectory predictors (TPs). This framework has been developed to assist TP experts in capturing a clear, consistent, and cross-comparable set of requirements and capabilities. The goal is to capture capabilities (types of trajectories that can be built), functional requirements (including inputs and outputs), non-functional requirements (including prediction accuracy and computational performance), approaches for constraint relaxation, and input uncertainties. The sections of this framework are based on the Common Trajectory Predictor structure developed by the FAA/Eurocontrol Cooperative R&D Action Plan 16 Committee on Common Trajectory Prediction. It is assumed that the reader is familiar with the Common TP Structure. This initial draft is intended as a first-cut capture of the En Route TS capabilities and requirements. As such, it contains many annotations indicating possible logic errors in the CTAS code or in the description provided. The intent is to work out the details of these annotations with NASA and to update this document at a later time.

  4. INITIATE: An Intelligent Adaptive Alert Environment.

    PubMed

    Jafarpour, Borna; Abidi, Samina Raza; Ahmad, Ahmad Marwan; Abidi, Syed Sibte Raza

    2015-01-01

    Exposure to a large volume of alerts generated by medical Alert Generating Systems (AGS), such as drug-drug interaction software or clinical decision support systems, overwhelms users and causes alert fatigue. Effects of alert fatigue include ignoring crucial alerts and longer response times. A common approach to avoid alert fatigue is to devise mechanisms in AGS to stop them from generating alerts that are deemed irrelevant. In this paper, we present a novel framework called INITIATE: an INtellIgent adapTIve AlerT Environment to avoid alert fatigue by managing alerts generated by one or more AGS. We have identified and categorized the lifecycles of different alerts and have developed alert management logic according to those lifecycles. Our framework incorporates an ontology that represents the alert management strategy and an alert management engine that executes this strategy. Our alert management framework offers the following features: (1) Adaptability based on users' feedback; (2) Personalization and aggregation of messages; and (3) Connection to Electronic Medical Records by implementing an HL7 Clinical Document Architecture parser.

  5. Fallacies and fantasies: the theoretical underpinnings of the Coexistence Approach for palaeoclimate reconstruction

    NASA Astrophysics Data System (ADS)

    Grimm, Guido W.; Potts, Alastair J.

    2016-03-01

    The Coexistence Approach has been used to infer palaeoclimates for many Eurasian fossil plant assemblages. However, the theory that underpins the method has never been examined in detail. Here we discuss acknowledged and implicit assumptions and assess the statistical nature and pseudo-logic of the method. We also compare the Coexistence Approach theory with the active field of species distribution modelling. We argue that the assumptions will inevitably be violated to some degree and that the method lacks any substantive means to identify or quantify these violations. The absence of a statistical framework makes the method highly vulnerable to the vagaries of statistical outliers and exotic elements. In addition, we find numerous logical inconsistencies, such as how climate shifts are quantified (the use of a "centre value" of a coexistence interval) and the ability to reconstruct "extinct" climates from modern plant distributions. Given the problems that have surfaced in species distribution modelling, accurate and precise quantitative reconstructions of palaeoclimates (or even climate shifts) using the nearest-living-relative principle and rectilinear niches (the basis of the method) will not be possible. The Coexistence Approach can be summarised as an exercise that shoehorns a plant fossil assemblage into coexistence and then assumes that this must be the climate. Given the theoretical issues and methodological issues highlighted elsewhere, we suggest that the method be discontinued and that all past reconstructions be disregarded and revisited using less fallacious methods. We outline six steps for (further) validation of available and future taxon-based methods and advocate developing (semi-quantitative) methods that prioritise robustness over precision.

  6. A Logic Model for Evaluating the Academic Health Department.

    PubMed

    Erwin, Paul Campbell; McNeely, Clea S; Grubaugh, Julie H; Valentine, Jennifer; Miller, Mark D; Buchanan, Martha

    2016-01-01

    Academic Health Departments (AHDs) are collaborative partnerships between academic programs and practice settings. While case studies have informed our understanding of the development and activities of AHDs, there has been no formal published evaluation of AHDs, either singularly or collectively. Developing a framework for evaluating AHDs has potential to further aid our understanding of how these relationships may matter. In this article, we present a general theory of change, in the form of a logic model, for how AHDs impact public health at the community level. We then present a specific example of how the logic model has been customized for a specific AHD. Finally, we end with potential research questions on the AHD based on these concepts. We conclude that logic models are valuable tools, which can be used to assess the value and ultimate impact of the AHD.

  7. Markov Task Network: A Framework for Service Composition under Uncertainty in Cyber-Physical Systems.

    PubMed

    Mohammed, Abdul-Wahid; Xu, Yang; Hu, Haixiao; Agyemang, Brighter

    2016-09-21

    In novel collaborative systems, cooperative entities collaborate services to achieve local and global objectives. With the growing pervasiveness of cyber-physical systems, however, such collaboration is hampered by differences in the operations of the cyber and physical objects, and the need for the dynamic formation of collaborative functionality given high-level system goals has become a practical concern. In this paper, we propose a cross-layer automation and management model for cyber-physical systems. This model represents the dynamic formation of collaborative services pursuing laid-down system goals as an ontology-oriented hierarchical task network. Ontological intelligence provides the semantic technology of this model, and through semantic reasoning, primitive tasks can be dynamically composed from high-level system goals. In dealing with uncertainty, we further propose a novel bridge between hierarchical task networks and Markov logic networks, called the Markov task network. This leverages the efficient inference algorithms of Markov logic networks to reduce both computational and inferential loads in task decomposition. From the results of our experiments, high-precision service composition under uncertainty can be achieved using this approach.
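
    A minimal sketch of the flavor of this coupling (the tasks, methods and weights are invented; genuine Markov logic inference grounds weighted first-order formulas rather than scoring a fixed list of alternatives):

    ```python
    # Minimal sketch: an HTN decomposition step where the choice among
    # candidate methods is scored MLN-style, P(method) ∝ exp(weight + bonus
    # from evidence about the environment).

    import math

    METHODS = {  # goal -> candidate decompositions with assumed weights
        "deliver_report": [
            (1.2, ["sense", "aggregate", "upload_wifi"]),
            (0.4, ["sense", "aggregate", "upload_cellular"]),
        ],
    }

    def choose(goal, evidence_bonus):
        cands = METHODS[goal]
        scores = [math.exp(w + evidence_bonus.get(tuple(m), 0.0))
                  for w, m in cands]
        z = sum(scores)                       # normalizing constant
        probs = [s / z for s in scores]
        return max(zip(probs, cands), key=lambda t: t[0])

    # Evidence that wifi is down boosts the cellular decomposition:
    bonus = {("sense", "aggregate", "upload_cellular"): 2.0}
    prob, (w, plan) = choose("deliver_report", bonus)
    print(plan, round(prob, 2))
    ```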

  8. Paraconsistent Reasoning for OWL 2

    NASA Astrophysics Data System (ADS)

    Ma, Yue; Hitzler, Pascal

    A four-valued description logic has been proposed for reasoning with inconsistent description-logic knowledge bases. This approach has the distinct advantage that it can be implemented by invoking classical reasoners, keeping the same complexity as under the classical semantics. However, this approach has so far only been studied for the basic description logic ALC. In this paper, we further study how to extend the four-valued semantics to the more expressive description logic SROIQ, which underlies the forthcoming revision of the Web Ontology Language, OWL 2, and also investigate how it fares when adapted to tractable description logics including EL++, DL-Lite, and Horn-DLs. We define the four-valued semantics along the same lines as for ALC and show that we can retain most of the desired properties.
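
    A minimal sketch of the Belnap-style four-valued truth functions underlying such semantics (the pair encoding is the standard one; the worked examples are ours):

    ```python
    # Minimal sketch: a four-valued truth value is a pair (for, against),
    # giving t=(1,0), f=(0,1), B=(1,1) "both", N=(0,0) "neither".

    VALS = {"t": (1, 0), "f": (0, 1), "B": (1, 1), "N": (0, 0)}
    NAME = {v: k for k, v in VALS.items()}

    def neg(a):
        return (a[1], a[0])                  # swap evidence for/against

    def conj(a, b):
        return (a[0] & b[0], a[1] | b[1])    # AND: both for; either against

    def disj(a, b):
        return (a[0] | b[0], a[1] & b[1])    # OR, dually

    # Inconsistent evidence stays contained instead of exploding:
    B, t = VALS["B"], VALS["t"]
    print(NAME[conj(B, t)])        # B: the contradiction does not infect t
    print(NAME[conj(B, neg(B))])   # B, not f: no ex falso quodlibet
    ```

    This containment of contradictions is what lets a paraconsistent reasoner keep drawing useful conclusions from an inconsistent ontology.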

  9. The Use of a Predictive Habitat Model and a Fuzzy Logic Approach for Marine Management and Planning

    PubMed Central

    Hattab, Tarek; Ben Rais Lasram, Frida; Albouy, Camille; Sammari, Chérif; Romdhane, Mohamed Salah; Cury, Philippe; Leprieur, Fabien; Le Loc’h, François

    2013-01-01

    Bottom trawl survey data are commonly used as a sampling technique to assess the spatial distribution of commercial species. However, this sampling technique does not always correctly detect a species even when it is present, and this can create significant limitations when fitting species distribution models. In this study, we aim to test the relevance of a mixed methodological approach that combines presence-only and presence-absence distribution models. We illustrate this approach using bottom trawl survey data to model the spatial distributions of 27 commercially targeted marine species. We use an environmentally- and geographically-weighted method to simulate pseudo-absence data. The species distributions are modelled using regression kriging, a technique that explicitly incorporates spatial dependence into predictions. Model outputs are then used to identify areas that met the conservation targets for the deployment of artificial anti-trawling reefs. To achieve this, we propose the use of a fuzzy logic framework that accounts for the uncertainty associated with different model predictions. For each species, the predictive accuracy of the model is classified as ‘high’. A better result is observed when a large number of occurrences are used to develop the model. The map resulting from the fuzzy overlay shows that three main areas have a high level of agreement with the conservation criteria. These results align with expert opinion, confirming the relevance of the proposed methodology in this study. PMID:24146867

  10. Futures of elderly care in Iran: A protocol with scenario approach.

    PubMed

    Goharinezhad, Salime; Maleki, Mohammadreza; Baradaran, Hamid Reza; Ravaghi, Hamid

    2016-01-01

    Background: The number of people aged 60 and older is increasing faster than other age groups worldwide. Iran will experience a sharp aging population increase in the next decades, and this will pose new challenges to the healthcare system. Since providing high-quality aged-care services will be a major concern of policymakers, the question arises: what types of aged-care services should be organized in the coming 10 years? This protocol has been designed to develop a set of scenarios for the future of elderly care in Iran. Methods: In this study, the intuitive logics approach and the Global Business Network (GBN) model were used to develop scenarios for elderly care in Iran. In terms of perspective, the scenarios in this approach are normative, qualitative with respect to methodology, and deductive in constructing the process of scenarios. The three phases of the GBN model are as follows: 1) Orientation: identifying strategic levels, stakeholders, participants and time horizon; 2) Exploration: identifying the driving forces and key uncertainties; 3) Synthesis: defining the scenario logics and constructing the scenario storyline. Results: Presently, two phases are completed and the results will be published in mid-2016. Conclusion: This study delivers a comprehensive framework for taking appropriate actions in providing care for the elderly in the future. Moreover, policymakers should specify and provide the full range of services for the elderly, and in doing so, the scenarios and key findings of this study could be of valuable help.

  11. Learning a Markov Logic network for supervised gene regulatory network inference

    PubMed Central

    2013-01-01

    Background Gene regulatory network inference remains a challenging problem in systems biology despite the numerous approaches that have been proposed. When substantial knowledge on a gene regulatory network is already available, supervised network inference is appropriate. Such a method builds a binary classifier able to assign a class (Regulation/No regulation) to an ordered pair of genes. Once learnt, the pairwise classifier can be used to predict new regulations. In this work, we explore the framework of Markov Logic Networks (MLN) that combine features of probabilistic graphical models with the expressivity of first-order logic rules. Results We propose to learn a Markov Logic network, i.e. a set of weighted rules that conclude on the predicate “regulates”, starting from a known gene regulatory network involved in the switch proliferation/differentiation of keratinocyte cells, a set of experimental transcriptomic data and various descriptions of genes all encoded into first-order logic. As training data are unbalanced, we use asymmetric bagging to learn a set of MLNs. The prediction of a new regulation can then be obtained by averaging predictions of individual MLNs. As a side contribution, we propose three in silico tests to assess the performance of any pairwise classifier in various network inference tasks on real datasets. A first test consists of measuring the average performance on a balanced edge prediction problem; a second one deals with the ability of the classifier, once enhanced by asymmetric bagging, to update a given network. Finally, our main result concerns a third test that measures the ability of the method to predict regulations with a new set of genes. As expected, MLN, when provided with only numerical discretized gene expression data, does not perform as well as a pairwise SVM in terms of AUPR. However, when a more complete description of gene properties is provided by heterogeneous sources, MLN achieves the same performance as a black-box model such as a pairwise SVM while providing relevant insights on the predictions. Conclusions The numerical studies show that MLN achieves very good predictive performance while opening the door to some interpretability of the decisions. Besides the ability to suggest new regulations, such an approach allows one to cross-validate experimental data with existing knowledge. PMID:24028533

  12. Learning a Markov Logic network for supervised gene regulatory network inference.

    PubMed

    Brouard, Céline; Vrain, Christel; Dubois, Julie; Castel, David; Debily, Marie-Anne; d'Alché-Buc, Florence

    2013-09-12

    Gene regulatory network inference remains a challenging problem in systems biology despite the numerous approaches that have been proposed. When substantial knowledge on a gene regulatory network is already available, supervised network inference is appropriate. Such a method builds a binary classifier able to assign a class (Regulation/No regulation) to an ordered pair of genes. Once learnt, the pairwise classifier can be used to predict new regulations. In this work, we explore the framework of Markov Logic Networks (MLN) that combine features of probabilistic graphical models with the expressivity of first-order logic rules. We propose to learn a Markov Logic network, i.e. a set of weighted rules that conclude on the predicate "regulates", starting from a known gene regulatory network involved in the switch proliferation/differentiation of keratinocyte cells, a set of experimental transcriptomic data and various descriptions of genes all encoded into first-order logic. As training data are unbalanced, we use asymmetric bagging to learn a set of MLNs. The prediction of a new regulation can then be obtained by averaging predictions of individual MLNs. As a side contribution, we propose three in silico tests to assess the performance of any pairwise classifier in various network inference tasks on real datasets. A first test consists of measuring the average performance on a balanced edge prediction problem; a second one deals with the ability of the classifier, once enhanced by asymmetric bagging, to update a given network. Finally, our main result concerns a third test that measures the ability of the method to predict regulations with a new set of genes. As expected, MLN, when provided with only numerical discretized gene expression data, does not perform as well as a pairwise SVM in terms of AUPR. However, when a more complete description of gene properties is provided by heterogeneous sources, MLN achieves the same performance as a black-box model such as a pairwise SVM while providing relevant insights on the predictions. The numerical studies show that MLN achieves very good predictive performance while opening the door to some interpretability of the decisions. Besides the ability to suggest new regulations, such an approach allows one to cross-validate experimental data with existing knowledge.
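
    A minimal sketch of the asymmetric-bagging step (generic; a trivial invented scorer stands in for an MLN, and the feature names are made up):

    ```python
    # Minimal sketch: keep all minority "Regulation" pairs in every bag,
    # resample the majority "No regulation" pairs to match, train one model
    # per bag, and average the predicted scores across models.

    import random
    random.seed(0)

    def train(bag):
        # Stand-in "model": fraction of query features seen among positives.
        pos_feats = set().union(*[x for x, y in bag if y == 1])
        return lambda x: len(x & pos_feats) / (len(x) or 1)

    minority = [({"tf_motif", "coexpr"}, 1), ({"coexpr"}, 1)]   # rare class
    majority = [({"tf_motif"}, 0), (set(), 0), ({"other"}, 0), (set(), 0)]

    models = []
    for _ in range(10):                            # 10 asymmetric bags
        bag = minority + random.sample(majority, len(minority))
        models.append(train(bag))

    query = {"tf_motif", "coexpr"}                 # candidate gene pair
    score = sum(m(query) for m in models) / len(models)   # averaged vote
    print(round(score, 2))
    ```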

  13. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches, which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time-dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it was found that constructing decision tables for software subroutines is very time-consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
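
    A minimal sketch of that suggested shortcut (the cubic below stands in for a flight-software subroutine; the tolerance and finite-difference derivative are our choices, not the paper's):

    ```python
    # Minimal sketch: instead of enumerating a decision table for a
    # subroutine y = g(x), solve g(x) = y_target "in reverse" with
    # Newton-Raphson, using a numerical derivative.

    def g(x):                      # stand-in for a software subroutine
        return x**3 + 2.0 * x - 5.0

    def solve_reverse(y_target, x0=1.0, tol=1e-10, max_iter=50, h=1e-6):
        x = x0
        for _ in range(max_iter):
            f = g(x) - y_target
            if abs(f) < tol:
                break
            df = (g(x + h) - g(x - h)) / (2.0 * h)   # central difference
            x -= f / df                               # Newton-Raphson step
        return x

    x = solve_reverse(10.0)
    print(x, g(x))   # converges in a few steps, as the abstract notes
    ```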

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, N.; Koller, D.; Halpern, J.Y.

    Conditional logics play an important role in recent attempts to investigate default reasoning. This paper investigates first-order conditional logic. We show that, as for first-order probabilistic logic, it is important not to confound statistical conditionals over the domain (such as "most birds fly") with subjective conditionals over possible worlds (such as "I believe that Tweety is unlikely to fly"). We then address the issue of ascribing semantics to first-order conditional logic. As in the propositional case, there are many possible semantics. To study the problem in a coherent way, we use plausibility structures. These provide us with a general framework in which many of the standard approaches can be embedded. We show that while these standard approaches are all the same at the propositional level, they are significantly different in the context of a first-order language. We show that plausibilities provide the most natural extension of conditional logic to the first-order case: we provide a sound and complete axiomatization that contains only the KLM properties and standard axioms of first-order modal logic. We show that most of the other approaches have additional properties, which result in an inappropriate treatment of an infinitary version of the lottery paradox.

  15. The Cinematic Narrator: The Logic and Pragmatics of Impersonal Narration.

    ERIC Educational Resources Information Center

    Burgoyne, Robert

    1990-01-01

    Describes "impersonal narration," an approach that defends the concept of the cinematic narrator as a logical and pragmatic necessity. Compares this approach with existing theories of the cinematic narrator, addressing disagreements in the field of film narrative theory. (MM)

  16. Bit storage and bit flip operations in an electromechanical oscillator.

    PubMed

    Mahboob, I; Yamaguchi, H

    2008-05-01

    The Parametron was first proposed as a logic-processing system almost 50 years ago. In this approach the two stable phases of an excited harmonic oscillator provide the basis for logic operations. Computer architectures based on LC oscillators were developed for this approach, but high power consumption and difficulties with integration meant that the Parametron was rendered obsolete by the transistor. Here we propose an approach to mechanical logic based on nanoelectromechanical systems that is a variation on the Parametron architecture and, as a first step towards a possible nanomechanical computer, we demonstrate both bit storage and bit flip operations.
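
    A minimal sketch of Parametron-style majority logic at the level of abstraction used in the abstract (bits as the two stable phases of the excited oscillator; the coupling model is a deliberate simplification, not a device model):

    ```python
    # Minimal sketch: a bit is one of the two stable phases (0 or pi) of a
    # parametrically excited oscillator; a gate output locks to the phase of
    # the majority of its weakly coupled inputs.

    import math

    def phase(bit):
        return 0.0 if bit else math.pi      # bit 1 -> phase 0, bit 0 -> pi

    def majority(b1, b2, b3):
        # Sum the three input oscillations; the resultant's sign selects
        # which stable phase the output oscillator locks to.
        resultant = sum(math.cos(phase(b)) for b in (b1, b2, b3))
        return 1 if resultant > 0 else 0

    # AND and OR follow by pinning one input: AND(a,b) = maj(a,b,0),
    # OR(a,b) = maj(a,b,1) -- the classic Parametron majority-logic trick.
    print([majority(a, b, 0) for a in (0, 1) for b in (0, 1)])  # AND table
    print([majority(a, b, 1) for a in (0, 1) for b in (0, 1)])  # OR table
    ```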

  17. A comparison of fuzzy logic and cluster renewal approaches for heat transfer modeling in a 1296 t/h CFB boiler with low level of flue gas recirculation

    NASA Astrophysics Data System (ADS)

    Błaszczuk, Artur; Krzywański, Jarosław

    2017-03-01

    The interrelation between fuzzy logic and cluster renewal approaches for heat transfer modeling in a circulating fluidized bed (CFB) has been established based on local furnace data. The furnace data were measured in a 1296 t/h CFB boiler with a low level of flue gas recirculation. In the present study, the bed temperature and suspension density were treated as experimental variables along the furnace height. The measured bed temperature and suspension density varied in the ranges of 1131-1156 K and 1.93-6.32 kg/m3, respectively. Using the heat transfer coefficient for a commercial CFB combustor, two empirical heat transfer correlations were developed in terms of important operating parameters, including bed temperature and suspension density. The fuzzy logic results were found to be in good agreement with the corresponding experimental heat transfer data obtained with the cluster renewal approach. The predicted bed-to-wall heat transfer coefficient covered ranges of 109-241 W/(m2K) and 111-240 W/(m2K) for the fuzzy logic and cluster renewal approaches, respectively. The divergence in calculated heat flux recovery along the furnace height between the two approaches did not exceed ±2%.

  18. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, combined with (2) Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to process them automatically and generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils those properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.
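
    A minimal sketch of a pattern-based property catalogue of this kind (the templates follow the widely used Dwyer-style specification patterns; the clinical atoms are invented, not taken from the paper):

    ```python
    # Minimal sketch: map named requirement patterns to LTL templates so
    # that non-experts can instantiate formal properties by filling atoms.

    PATTERNS = {
        "universality": "G ({p})",             # p holds globally
        "absence":      "G (!{p})",            # p never holds
        "response":     "G ({p} -> F {q})",    # p is always followed by q
        "precedence":   "!{q} W {p}",          # q only after p (weak until)
    }

    def instantiate(pattern, **atoms):
        return PATTERNS[pattern].format(**atoms)

    # E.g., "every abnormal glucose reading must eventually trigger a retest":
    prop = instantiate("response", p="abnormal_glucose", q="retest_ordered")
    print(prop)   # G (abnormal_glucose -> F retest_ordered)
    ```

    The instantiated formula, together with the statechart-derived model, is what a model checker would then consume.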

  19. Causal Mathematical Logic as a guiding framework for the prediction of "Intelligence Signals" in brain simulations

    NASA Astrophysics Data System (ADS)

    Lanzalaco, Felix; Pissanetzky, Sergio

    2013-12-01

    A recent theory of physical information based on the fundamental principles of causality and thermodynamics has proposed that a large number of observable life and intelligence signals can be described in terms of the Causal Mathematical Logic (CML), which is proposed to encode the natural principles of intelligence across any physical domain and substrate. We attempt to expound the current definition of CML, the "Action functional", as a theory in terms of its ability to provide superior explanatory power for the current neuroscientific data we use to measure the mammalian brain's "intelligence" processes at their most general biophysical level. Brain simulation projects define their success partly in terms of the emergence of "non-explicitly programmed" complex biophysical signals such as self-oscillation and spreading cortical waves. Here we propose to extend the causal theory to predict and guide the understanding of these more complex emergent "intelligence signals". To achieve this, we review whether causal logic is consistent with, and can explain and predict, the function of complete perceptual processes associated with intelligence. Primarily these are defined as the range of Event-Related Potentials (ERP), which include their primary subcomponents: Event-Related Desynchronization (ERD) and Event-Related Synchronization (ERS). This approach aims at a universal and predictive logic for neurosimulation and AGI. The result of this investigation is a general "Information Engine" model derived from translation of the ERD and ERS. The CML algorithm, run in terms of action cost, predicts ERP signal contents and is consistent with the fundamental laws of thermodynamics. A working substrate-independent natural information logic would be a major asset: an information theory consistent with fundamental physics can function as an AGI, can also operate within genetic information space, and provides a roadmap to understanding the live biophysical operation of the phenotype.

  20. Replacing positivism in medical geography.

    PubMed

    Bennett, David

    2005-06-01

    Revisiting debates about philosophical approaches in medical geography suggests that logical positivism may have been prematurely discarded. An analysis of authoritative texts in medical geography and their sources in human geography shows that logical positivism has been conflated with Comtean positivism, science, empiricism, quantification, science politics, scientism and so on, to produce the "standard version" of the all-purpose pejorative "positivism", which it is easy to dismiss as an evil. It is argued that the standard version fails to do justice to logical positivism, being constructed on sources which are at some distance from the logical positivist movement itself. An alternative approach is then developed, an historically and geographically situated interpretation of logical positivism as a deliberately and knowingly constructed oppositional epistemology within an oppressive and anti-scientific culture predicated on idealist intuitionism. Contrasting the standard version with this alternative reading of logical positivism suggests that much may have been lost in human, and thus, medical geography, by throwing out the logical positivist baby with the "positivism" bath water. It is concluded that continuing to unpack the standard version of logical positivism may identify benefits from a more nuanced appreciation of logical positivism, but it is premature to take these to the level of detailed impacts on the kinds of medical geographies that could be done or the ways of doing them.

  1. Logic for Physicists

    NASA Astrophysics Data System (ADS)

    Pereyra, Nicolas A.

    2018-06-01

    This book gives a rigorous yet 'physics-focused' introduction to mathematical logic that is geared towards natural science majors. We present the science major with a robust introduction to logic, focusing on the specific knowledge and skills that will unavoidably be needed in calculus topics and natural science topics in general (rather than taking a philosophical-math-fundamental oriented approach that is commonly found in mathematical logic textbooks).

  2. Teaching Excellence Framework (TEF): Re-Examining Its Logic and Considering Possible Systemic and Institutional Outcomes

    ERIC Educational Resources Information Center

    Rudd, Tim

    2017-01-01

    This paper offers conceptual and theoretical insights relating to the Teaching Excellence Framework (TEF), highlighting a range of potential systemic and institutional outcomes and issues. The paper is organised around three key areas of discussion that are often under-explored in debates. Firstly, after considering the TEF in the wider context of…

  3. Joining the Club: The Ideology of Quality and Business School Badging

    ERIC Educational Resources Information Center

    Bell, Emma; Taylor, Scott

    2005-01-01

    The ideology of quality and the frameworks used to measure it can profoundly affect academic identity. This article explores the role of quality frameworks in UK business schools, focusing on the way that individuals confront the logic of accreditation when they are subject to its discipline. By defining business schools as an institutional field,…

  4. The BMW Model: A New Framework for Teaching Monetary Economics

    ERIC Educational Resources Information Center

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2006-01-01

    Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…

  5. On some recent definitions and analysis frameworks for risk, vulnerability, and resilience.

    PubMed

    Aven, Terje

    2011-04-01

    Recently, considerable attention has been paid to a systems-based approach to risk, vulnerability, and resilience analysis. It is argued that risk, vulnerability, and resilience are inherently and fundamentally functions of the states of the system and its environment. Vulnerability is defined as the manifestation of the inherent states of the system that can be subjected to a natural hazard or be exploited to adversely affect that system, whereas resilience is defined as the ability of the system to withstand a major disruption within acceptable degradation parameters and to recover within an acceptable time and with acceptable composite costs and risks. Risk, on the other hand, is probability based, defined by the probability and severity of adverse effects (i.e., the consequences). In this article, we look more closely into this approach. It is observed that the key concepts are inconsistent in the sense that the uncertainty (probability) dimension is included in the risk definition but not in those of vulnerability and resilience. In the article, we question the rationale for this inconsistency. The suggested approach is compared with an alternative framework that provides a logically defined structure for risk, vulnerability, and resilience, in which all three concepts incorporate the uncertainty (probability) dimension. © 2010 Society for Risk Analysis.

  6. Identifying Interacting Genetic Variations by Fish-Swarm Logic Regression

    PubMed Central

    Yang, Aiyuan; Yan, Chunxia; Zhu, Feng; Zhao, Zhongmeng; Cao, Zhi

    2013-01-01

    Understanding associations between genotypes and complex traits is a fundamental problem in human genetics. A major open problem in mapping phenotypes is that of identifying a set of interacting genetic variants, which might contribute to complex traits. Logic regression (LR) is a powerful multivariate association tool. Several LR-based approaches have been successfully applied to different datasets. However, these approaches are not adequate with regard to accuracy and efficiency. In this paper, we propose a new LR-based approach, called fish-swarm logic regression (FSLR), which improves the logic regression process by incorporating swarm optimization. In our approach, a school of fish agents is run in parallel. Each fish agent holds a regression model, while the school searches for better models through various preset behaviors. The swarm algorithm improves accuracy and efficiency by speeding up convergence and preventing it from dropping into local optima. We apply our approach to a real screening dataset and a series of simulation scenarios. Compared to three existing LR-based approaches, our approach outperforms them by having lower type I and type II error rates, identifying more preset causal sites, and running at faster speeds. PMID:23984382
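
    The record does not include the authors' algorithmic details, but the core idea (a swarm of agents, each carrying a candidate logic-regression model, improving by local mutation and attraction toward the best model found so far) can be sketched as follows. All names, operators and rates here are illustrative assumptions, not the published FSLR method.

        # A minimal sketch of swarm-guided logic regression, assuming binary
        # predictors X (n_samples x n_sites) and a binary outcome y. Models
        # are conjunctions of (possibly negated) sites; "fish" improve by
        # local mutation and by copying literals from the current best model.
        import random

        def predict(model, x):
            # model: list of (site_index, negated) literals joined by AND
            return all((not x[i]) if neg else x[i] for i, neg in model)

        def fitness(model, X, y):
            # fraction of correctly classified samples
            return sum(predict(model, x) == t for x, t in zip(X, y)) / len(y)

        def mutate(model, n_sites):
            m = list(model)
            op = random.random()
            if op < 0.4 and len(m) > 1:         # drop a literal
                m.pop(random.randrange(len(m)))
            elif op < 0.7:                       # flip a negation
                i = random.randrange(len(m))
                m[i] = (m[i][0], not m[i][1])
            else:                                # add a new literal
                m.append((random.randrange(n_sites), random.random() < 0.5))
            return m

        def fslr(X, y, n_fish=20, iters=200):
            n_sites = len(X[0])
            school = [[(random.randrange(n_sites), False)] for _ in range(n_fish)]
            best = max(school, key=lambda m: fitness(m, X, y))
            for _ in range(iters):
                for k, fish in enumerate(school):
                    trial = mutate(fish, n_sites)
                    if random.random() < 0.2:    # swim toward the school's best
                        trial = trial + [random.choice(best)]
                    if fitness(trial, X, y) >= fitness(fish, X, y):
                        school[k] = trial
                best = max(school + [best], key=lambda m: fitness(m, X, y))
            return best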

  7. Furthering the quality agenda in Aboriginal community controlled health services: understanding the relationship between accreditation, continuous quality improvement and national key performance indicator reporting.

    PubMed

    Sibthorpe, Beverly; Gardner, Karen; McAullay, Daniel

    2016-01-01

    A rapidly expanding interest in quality in the Aboriginal-community-controlled health sector has led to widespread uptake of accreditation using more than one set of standards, a proliferation of continuous quality improvement programs and the introduction of key performance indicators. As yet, there has been no overarching logic that shows how they relate to each other, with consequent confusion within and outside the sector. We map the three approaches to the Framework for Performance Assessment in Primary Health Care, demonstrating their key differences and complementarity. There needs to be greater attention in both policy and practice to the purposes and alignment of the three approaches if they are to embed a system-wide focus that supports quality improvement at the service level.

  8. The shuttle main engine: A first look

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1996-01-01

    Anyone entering the Space Shuttle Main Engine (SSME) team attends a two-week course to become familiar with the design and workings of the engine. This course provides intensive coverage of the individual hardware items and their functions. Some individuals, particularly those involved with software maintenance and development, have felt overwhelmed by this volume of material and by their lack of a logical framework in which to place it. To provide this logical framework, it was decided that a brief self-taught introduction to the overall operation of the SSME should be designed. To aid new team members with an interest in the software, this new course should also explain the structure and functioning of the controller and its software. This paper describes that introductory course.

  9. Nonlinear interferometry approach to photonic sequential logic

    NASA Astrophysics Data System (ADS)

    Mabuchi, Hideo

    2011-10-01

    Motivated by rapidly advancing capabilities for extensive nanoscale patterning of optical materials, I propose an approach to implementing photonic sequential logic that exploits circuit-scale phase coherence for efficient realizations of fundamental components such as a NAND-gate-with-fanout and a bistable latch. Kerr-nonlinear optical resonators are utilized in combination with interference effects to drive the binary logic. Quantum-optical input-output models are characterized numerically using design parameters that yield attojoule-scale energy separation between the latch states.

  10. Two-step complete polarization logic Bell-state analysis.

    PubMed

    Sheng, Yu-Bo; Zhou, Lan

    2015-08-26

    The Bell state plays a significant role in fundamental tests of quantum mechanics, such as the nonlocality of the quantum world. Bell-state analysis is of vital importance in quantum communication. Existing Bell-state analysis protocols usually focus on Bell-state encoding in the physical qubit directly. In this paper, we describe an alternative approach to realize near-complete logic Bell-state analysis for the polarized concatenated Greenberger-Horne-Zeilinger (C-GHZ) state with two logic qubits. We show that the logic Bell state can be distinguished in two steps with the help of the parity-check measurement (PCM) constructed from the cross-Kerr nonlinearity. This approach can also be used to distinguish an arbitrary C-GHZ state with N logic qubits. As recent theoretical and experimental work has shown that the C-GHZ state is robust in practical noisy environments, this protocol may be useful in future long-distance quantum communication based on logic-qubit entanglement.

  11. Explorations of Tenth-Grade STS[E] Curricula across Three Provincial Political Landscapes

    NASA Astrophysics Data System (ADS)

    Phillips, Christina Ann

    This thesis focuses on explorations of science, technology, society and the environment (i.e., STS[E]) outcomes/expectations in tenth-grade level science curricula across three Canadian provinces (i.e., Alberta, Manitoba & Ontario) with distinctive provincial political environments at the time of curriculum construction and/or implementation. Document analysis, discourse analysis and a range of theoretical frameworks (i.e., Levinson, 2010; Pedretti & Nazir, 2011; Krathwohl, 2002) were used to aid in explorations of STS[E] curriculum segments and discourses in each provincial region. More detailed analysis and thematic exploration are presented for each unit associated with climate change, as some interesting patterns emerged following initial analysis. My findings are presented as three comparative case studies and represent a small and original contribution to the large body of scholarly research devoted to studies of STS[E] education, where each province represents a unique case that has been explored regarding some aspects of the STS[E] curriculum outcomes/expectations and general political culture, as well as some other theoretical factors. Findings from this study indicate that Alberta's STS[E] outcomes may be related to Levinson's (2010) 'deliberative' citizenship focus. The following currents from Pedretti and Nazir (2011) appear to be emphasized: logical reasoning, historical, application & design, and socio-cultural aligned outcomes, when STS[E] is considered as an entity separate from the Alberta curriculum combination of STS and Knowledge. Ontario's STS[E] expectations may align with Levinson's (2010) 'deliberative' or, in some select cases, a 'deliberative'/'praxis' framework category, with some emphasis related to logical reasoning and socio-cultural awareness (Pedretti & Nazir, 2011) in their STS[E] curriculum. The Manitoba STS[E] outcomes may be aligned with a more 'deliberative' approach, with some associations that could intersect with the framework categories of 'praxis' or possibly 'dissent and conflict' (Levinson, 2010) and the logical reasoning, socio-cultural and socio-ecojustice currents (Pedretti & Nazir, 2011). General provincial political culture seems to play a limited role in the STS[E] outcomes/expectations, as the provinces studied here all tend to align with Levinson's (2010) deliberative citizenship stance (i.e., to varying degrees), with some caveats as explored throughout these cases. A chapter on cross-case analysis follows the three central cases and focuses on the following categories that emerged from this research: STS[E] ontology; STS[E] & citizenship; and socio-economic thematic explorations. The final chapter of this thesis focuses on some additional factors and theoretical explorations that may shape STS[E] curricula, such as cultural-geographic considerations, educational-political interactions during curriculum construction processes and possible influences from academic scientists. This chapter also provides some recommendations for curriculum development as aligned with case study approaches and provides insights regarding possibilities for future research.

  12. A Modular Approach to Arithmetic and Logic Unit Design on a Reconfigurable Hardware Platform for Educational Purpose

    NASA Astrophysics Data System (ADS)

    Oztekin, Halit; Temurtas, Feyzullah; Gulbag, Ali

    The Arithmetic and Logic Unit (ALU) design is one of the important topics in the Computer Architecture and Organization course in Computer and Electrical Engineering departments. Existing ALU designs used as educational tools are non-modular in nature. As programmable logic technology has developed rapidly, it is feasible to implement ALU designs based on Field Programmable Gate Arrays (FPGAs) in this course. In this paper, we have adopted a modular approach to FPGA-based ALU design. All the modules in the ALU design are realized using schematic structures on Altera's Cyclone II development board. Under this model, the ALU content is divided into four distinct modules: an arithmetic unit excluding multiplication and division operations, a logic unit, a multiplication unit and a division unit. Users can easily design an ALU of any size, since the approach is modular in nature. This approach was then applied to the microcomputer architecture design named BZK.SAU.FPGA10.0, replacing its existing ALU unit.
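
    As a rough illustration of the four-module decomposition described above, here is a minimal Python sketch; the opcodes, 8-bit width and dispatch scheme are illustrative assumptions, not the schematic Cyclone II design or the BZK.SAU.FPGA10.0 encoding.

        # Sketch of a four-module ALU: arithmetic (no mul/div), logic,
        # multiplication, division, with a top-level dispatcher.
        MASK = 0xFF  # illustrative 8-bit datapath

        def arithmetic_unit(op, a, b):
            return {"ADD": (a + b) & MASK, "SUB": (a - b) & MASK}[op]

        def logic_unit(op, a, b):
            return {"AND": a & b, "OR": a | b, "XOR": a ^ b, "NOT": ~a & MASK}[op]

        def multiply_unit(a, b):
            return (a * b) & MASK

        def divide_unit(a, b):
            return (a // b, a % b)  # quotient and remainder

        def alu(op, a, b=0):
            # dispatch to the module that implements the requested operation
            if op in ("ADD", "SUB"):
                return arithmetic_unit(op, a, b)
            if op in ("AND", "OR", "XOR", "NOT"):
                return logic_unit(op, a, b)
            if op == "MUL":
                return multiply_unit(a, b)
            if op == "DIV":
                return divide_unit(a, b)
            raise ValueError(f"unknown opcode {op}")

        assert alu("ADD", 200, 100) == 44  # 300 mod 256: 8-bit wrap-around

    The modularity is the point of the dispatch: any module (say, a wider multiplier) can be swapped out without touching the others.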

  13. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews.

    PubMed

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

    Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to 'think' conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured in only a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, which were almost always used solely to depict pictorially the way in which the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Logic models have the potential to be an integral aid throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles for the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions.

  14. Predictive genomics: a cancer hallmark network framework for predicting tumor clinical phenotypes using genome sequencing data.

    PubMed

    Wang, Edwin; Zaman, Naif; Mcgee, Shauna; Milanese, Jean-Sébastien; Masoudi-Nejad, Ali; O'Connor-McCourt, Maureen

    2015-02-01

    Tumor genome sequencing leads to documenting thousands of DNA mutations and other genomic alterations. At present, these data cannot be analyzed adequately to aid in the understanding of tumorigenesis and its evolution. Moreover, we have little insight into how to use these data to predict clinical phenotypes and tumor progression to better design patient treatment. To meet these challenges, we discuss a cancer hallmark network framework for modeling genome sequencing data to predict cancer clonal evolution and associated clinical phenotypes. The framework includes: (1) cancer hallmarks that can be represented by a few molecular/signaling networks. 'Network operational signatures', which represent gene regulatory logics/strengths, enable quantification of state transitions and measures of hallmark traits. Thus, sets of genomic alterations which are associated with network operational signatures could be linked to the state/measure of hallmark traits. The network operational signature transforms genotypic data (i.e., genomic alterations) into regulatory phenotypic profiles (i.e., regulatory logics/strengths), then into cellular phenotypic profiles (i.e., hallmark traits), which lead to clinical phenotypic profiles (i.e., a collection of hallmark traits). Furthermore, the framework considers regulatory logics of the hallmark networks under tumor evolutionary dynamics and therefore also includes: (2) a self-promoting positive feedback loop, dominated by a genomic instability network and a cell survival/proliferation network, that is the main driver of tumor clonal evolution, with the surrounding tumor stroma and the host immune system shaping the evolutionary paths; (3) cell motility initiating metastasis as a byproduct of the activity of this self-promoting loop during tumorigenesis; (4) an emerging hallmark network which triggers genome duplication and dominates a feed-forward loop that in turn could act as a rate-limiting step for tumor formation; and (5) mutations and other genomic alterations that have specific patterns and tissue specificity, driven by aging and other cancer-inducing agents. This framework represents the logics of complex cancer biology as a myriad of phenotypic complexities governed by a limited set of underlying organizing principles. It therefore adds to our understanding of tumor evolution and tumorigenesis and, moreover, to the potential for predicting tumors' evolutionary paths and clinical phenotypes. Strategies for using this framework in conjunction with genome sequencing data in an attempt to predict personalized drug targets, drug resistance, and metastasis for cancer patients, as well as cancer risks for healthy individuals, are discussed. Accurate prediction of cancer clonal evolution and clinical phenotypes will have substantial impact on timely diagnosis, personalized treatment and personalized prevention of cancer. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  15. Syllogistic reasoning in fuzzy logic and its application to usuality and reasoning with dispositions

    NASA Technical Reports Server (NTRS)

    Zadeh, L. A.

    1985-01-01

    A fuzzy syllogism in fuzzy logic is defined to be an inference schema in which the major premise, the minor premise and the conclusion are propositions containing fuzzy quantifiers. A basic fuzzy syllogism in fuzzy logic is the intersection/product syllogism. Several other basic syllogisms are developed that may be employed as rules of combination of evidence in expert systems. Among these is the consequent conjunction syllogism. Furthermore, it is shown that syllogistic reasoning in fuzzy logic provides a basis for reasoning with dispositions; that is, with propositions that are preponderantly but not necessarily always true. It is also shown that the concept of dispositionality is closely related to the notion of usuality and serves as a basis for what might be called a theory of usuality - a theory which may eventually provide a computational framework for commonsense reasoning.
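
    As a worked schematic, the intersection/product syllogism referred to above can be written (in a standard rendering of Zadeh's rule, with \otimes denoting the product of fuzzy numbers):

        \frac{Q_1\ A\text{'s are } B\text{'s} \qquad Q_2\ (A \text{ and } B)\text{'s are } C\text{'s}}
             {(Q_1 \otimes Q_2)\ A\text{'s are } (B \text{ and } C)\text{'s}}

    With Q_1 = Q_2 = "most", this licenses, for example, inferring "most^2 students are young and single" from "most students are young" and "most young students are single", where most^2 is the product of the fuzzy number "most" with itself.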

  16. [A logical framework derived from philosophy of language for analysis of the terms of traditional Chinese medicine and an example for analysis of "kidney essence"].

    PubMed

    Huang, Jian-hua; Li, Wen-wei; Bian, Qin; Shen, Zi-yin

    2011-09-01

    The true meanings of the terms of traditional Chinese medicine (TCM) need to be analyzed on a logical basis. It is not suitable to use a new term to interpret an old term of TCM, or to arbitrarily map a special TCM term onto particular substances of modern medicine. In the philosophy of language, language has a logical structure which reflects the structure of the world; that is to say, language is the picture of the world in a logical sense. Using this idea, the authors collected the ancient literature on "kidney essence" and extracted each necessary condition for "kidney essence". All necessary conditions together formed a sufficient condition to define the term "kidney essence". It is expected that this example can show the effectiveness of the philosophy of language in the analysis of the terms of TCM.

  17. Medical privacy protection based on granular computing.

    PubMed

    Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng

    2004-10-01

    Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains when receiving the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and for the previous bin-size and logical approaches. The two new criteria are implemented in a prototype system, Cellsecu 2.0. A preliminary system performance evaluation has been conducted and is reviewed.

  18. Extended GTST-MLD for aerospace system safety analysis.

    PubMed

    Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo

    2012-06-01

    The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be addressed. This article introduces a method for identifying aerospace system hazard interactions during the design stage, based on extended GTST-MLD (goal tree-success tree-master logic diagram). GTST-MLD is a functional modeling framework with a simple architecture. Ontology is used to extend the ability of GTST-MLD to describe system interactions by adding system design knowledge and past accident experience. At the levels of functionality and equipment, respectively, this approach can help the technician detect potential hazard interactions. Finally, a case study is used to illustrate the method. © 2011 Society for Risk Analysis.

  19. Use of program logic models in the Southern Rural Access Program evaluation.

    PubMed

    Pathman, Donald; Thaker, Samruddhi; Ricketts, Thomas C; Albright, Jennifer B

    2003-01-01

    The Southern Rural Access Program (SRAP) evaluation team used program logic models to clarify grantees' activities, objectives, and timelines. This information was used to benchmark data from grantees' progress reports to assess the program's successes. This article presents a brief background on the use of program logic models--essentially charts or diagrams specifying a program's planned activities, objectives, and goals--for evaluating and managing a program. It discusses the structure of the logic models chosen for the SRAP and how the model concept was introduced to the grantees to promote acceptance and use of the models. The article describes how the models helped clarify the program's objectives and helped lead agencies plan and manage the many program initiatives and subcontractors in their states. Models also provided a framework for grantees to report their progress to the National Program Office and evaluators and promoted the evaluators' visibility and acceptance by the grantees. Program logics, however, increased grantees' reporting requirements and demanded substantial time of the evaluators. Program logic models, on balance, proved their merit in the SRAP through their contributions to its management and evaluation and by providing a better understanding of the program's initiatives, successes, and potential impact.

  20. The Quantification of Consistent Subjective Logic Tree Branch Weights for PSHA

    NASA Astrophysics Data System (ADS)

    Runge, A. K.; Scherbaum, F.

    2012-04-01

    The development of quantitative models for the rate of exceedance of seismically generated ground-motion parameters is the target of probabilistic seismic hazard analysis (PSHA). In regions of low to moderate seismicity, the selection and evaluation of source and/or ground-motion models is often a major challenge to hazard analysts and is affected by large epistemic uncertainties. In PSHA this type of uncertainty is commonly treated within a logic tree framework in which the branch weights express an expert's degree-of-belief values in the corresponding set of models. For the calculation of the distribution of hazard curves, these branch weights are subsequently used as subjective probabilities. However, the quality of the results depends strongly on the "quality" of the expert knowledge. A major challenge for experts in this context is to provide weight estimates which are logically consistent (in the sense of Kolmogorov's axioms) and to be aware of, and deal with, the multitude of heuristics and biases which affect human judgment under uncertainty. For example, people tend to give smaller weights to each branch of a logic tree the more branches it has, starting with equal weights for all branches and then adjusting this uniform distribution based on their beliefs about how the branches differ. This effect is known as pruning bias [1]. A similar unwanted effect, which may even wrongly suggest robustness of the corresponding hazard estimates, appears in cases where all models are first judged according to some numerical quality-measure approach and the resulting weights are subsequently normalized to sum to one [2]. To address these problems, we have developed interactive graphical tools for the determination of logic tree branch weights in the form of logically consistent subjective probabilities, based on the concepts suggested in Curtis and Wood (2004) [3]. Instead of determining the set of weights for all the models in a single step, the computer-driven elicitation process is performed as a sequence of evaluations of relative weights for small subsets of models which are presented to the analyst. From these, the distribution of logic tree weights for the whole model set is determined as the solution of an optimization problem. The model subset presented to the analyst in each step is designed to maximize the expected information. The result of this process is a set of logically consistent weights, together with a measure of confidence determined from the amount of conflicting information provided by the expert during the relative weighting process.
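
    The elicitation-plus-optimization step can be illustrated with a minimal sketch: pairwise relative judgments are reconciled by least squares on log-weights and then normalized, which yields a logically consistent (nonnegative, summing to one) weight set. This is only an illustration of the general idea under simplifying assumptions, not the authors' tool.

        # Turn pairwise relative weight judgments into one consistent,
        # normalized set of logic-tree branch weights. Judgments are
        # (i, j, r) triples meaning "model i deserves r times the weight
        # of model j"; we solve least squares on log-weights.
        import numpy as np

        def consistent_weights(n_models, judgments):
            rows, rhs = [], []
            for i, j, r in judgments:
                row = np.zeros(n_models)
                row[i], row[j] = 1.0, -1.0   # log w_i - log w_j = log r
                rows.append(row)
                rhs.append(np.log(r))
            A, b = np.array(rows), np.array(rhs)
            log_w, *_ = np.linalg.lstsq(A, b, rcond=None)
            w = np.exp(log_w)
            return w / w.sum()               # sums to 1: usable as probabilities

        # Three ground-motion models; the expert judges pairs, not all at once.
        w = consistent_weights(3, [(0, 1, 2.0), (1, 2, 1.5), (0, 2, 3.0)])
        print(w)  # -> approx. [0.545, 0.273, 0.182]; conflicting judgments
                  #    would show up as least-squares residual error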

  1. Boolean logic tree of graphene-based chemical system for molecular computation and intelligent molecular search query.

    PubMed

    Huang, Wei Tao; Luo, Hong Qun; Li, Nian Bing

    2014-05-06

    The most serious, and yet unsolved, problem in constructing molecular computing devices is that of connecting all of these molecular events into a usable device. This report demonstrates the use of a Boolean logic tree for analyzing the chemical event network based on graphene, an organic dye, a thrombin aptamer, and the Fenton reaction, organizing and connecting these basic chemical events. This chemical event network can be utilized to implement fluorescent combinatorial logic (including basic logic gates and complex integrated logic circuits) and fuzzy logic computing. On the basis of the Boolean logic tree analysis and logic computing, these basic chemical events can be considered programmable "words", and chemical interactions "syntax" logic rules, with which to construct a molecular search engine for performing intelligent molecular search queries. Our approach is helpful in developing advanced molecule-based logic programs for applications in biosensing, nanotechnology, and drug delivery.
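
    Independently of the chemistry, the organizing device here, a Boolean logic tree, is easy to make concrete. The sketch below evaluates a small tree whose leaves are basic events; the particular events and tree are hypothetical stand-ins, not the paper's network.

        # Evaluate a Boolean logic tree: leaves are basic events, internal
        # nodes are gates. The tree below is illustrative only.
        def evaluate(node, inputs):
            kind = node[0]
            if kind == "leaf":
                return inputs[node[1]]
            if kind == "NOT":
                return not evaluate(node[1], inputs)
            vals = [evaluate(child, inputs) for child in node[1:]]
            return all(vals) if kind == "AND" else any(vals)

        # (dye displaced AND aptamer bound) OR NOT(Fenton reaction active)
        tree = ("OR",
                ("AND", ("leaf", "dye"), ("leaf", "aptamer")),
                ("NOT", ("leaf", "fenton")))
        print(evaluate(tree, {"dye": True, "aptamer": False, "fenton": True}))  # False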

  2. Quantum Logic: Approach a Child's Environment from "Inside."

    ERIC Educational Resources Information Center

    Rhodes, William C.

    1987-01-01

    With the advent of quantum mechanics, physics has merged with psychology, and cognitive science has been revolutionized. Quantum logic supports the notion of influencing the environment by increasing the child's capacity for cognitive processing. This special educational approach is theoretically more effective than social and political…

  3. Rigorous Science: a How-To Guide

    PubMed Central

    Fang, Ferric C.

    2016-01-01

    ABSTRACT Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. PMID:27834205

  4. Rigorous Science: a How-To Guide.

    PubMed

    Casadevall, Arturo; Fang, Ferric C

    2016-11-08

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.

  5. A Spiking Neural Network Based Cortex-Like Mechanism and Application to Facial Expression Recognition

    PubMed Central

    Fu, Si-Yao; Yang, Guo-Sheng; Kuai, Xin-Kai

    2012-01-01

    In this paper, we present a quantitative, highly structured cortex-simulated model, which can be simply described as a feedforward, hierarchical simulation of the ventral stream of the visual cortex using a biologically plausible, computationally convenient spiking neural network system. The motivation comes directly from recent pioneering work on detailed functional decomposition analysis of the feedforward pathway of the ventral stream of the visual cortex and from developments in artificial spiking neural networks (SNNs). By combining the logical structure of the cortical hierarchy and the computing power of the spiking neuron model, a practical framework has been presented. As a proof of principle, we demonstrate our system on several facial expression recognition tasks. The proposed cortex-like feedforward hierarchy framework has the merit of being able to deal with complicated pattern recognition problems. This suggests that, by combining cognitive models with modern neurocomputational approaches, the neurosystematic approach to the study of cortex-like mechanisms has the potential to extend our knowledge of the brain mechanisms underlying cognitive analysis and to advance theoretical models of how we recognize faces or, more specifically, perceive other people's facial expressions in a rich, dynamic, and complex environment, providing a new starting point for improved models of visual cortex-like mechanisms. PMID:23193391

  6. A spiking neural network based cortex-like mechanism and application to facial expression recognition.

    PubMed

    Fu, Si-Yao; Yang, Guo-Sheng; Kuai, Xin-Kai

    2012-01-01

    In this paper, we present a quantitative, highly structured cortex-simulated model, which can be simply described as a feedforward, hierarchical simulation of the ventral stream of the visual cortex using a biologically plausible, computationally convenient spiking neural network system. The motivation comes directly from recent pioneering work on detailed functional decomposition analysis of the feedforward pathway of the ventral stream of the visual cortex and from developments in artificial spiking neural networks (SNNs). By combining the logical structure of the cortical hierarchy and the computing power of the spiking neuron model, a practical framework has been presented. As a proof of principle, we demonstrate our system on several facial expression recognition tasks. The proposed cortex-like feedforward hierarchy framework has the merit of being able to deal with complicated pattern recognition problems. This suggests that, by combining cognitive models with modern neurocomputational approaches, the neurosystematic approach to the study of cortex-like mechanisms has the potential to extend our knowledge of the brain mechanisms underlying cognitive analysis and to advance theoretical models of how we recognize faces or, more specifically, perceive other people's facial expressions in a rich, dynamic, and complex environment, providing a new starting point for improved models of visual cortex-like mechanisms.

  7. Extracting recurrent scenarios from narrative texts using a Bayesian network: application to serious occupational accidents with movement disturbance.

    PubMed

    Abdat, F; Leclercq, S; Cuny, X; Tissot, C

    2014-09-01

    A probabilistic approach has been developed to extract recurrent serious Occupational Accident with Movement Disturbance (OAMD) scenarios from narrative texts within a prevention framework. Relevant data extracted from 143 accounts was initially coded as logical combinations of generic accident factors. A Bayesian Network (BN)-based model was then built for OAMDs using these data and expert knowledge. A data clustering process was subsequently performed to group the OAMDs into similar classes from generic factor occurrence and pattern standpoints. Finally, the Most Probable Explanation (MPE) was evaluated and identified as the associated recurrent scenario for each class. Using this approach, 8 scenarios were extracted to describe 143 OAMDs in the construction and metallurgy sectors. Their recurrent nature is discussed. Probable generic factor combinations provide a fair representation of particularly serious OAMDs, as described in narrative texts. This work represents a real contribution to raising company awareness of the variety of circumstances in which these accidents occur, to progressing in the prevention of such accidents and to developing an analysis framework dedicated to this kind of accident. Copyright © 2014 Elsevier Ltd. All rights reserved.
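
    The Most Probable Explanation step can be made concrete with a toy network. The sketch below enumerates factor combinations and returns the one maximizing the joint probability given an observed severe outcome; the structure and numbers are illustrative only, not derived from the 143 accounts.

        # MPE by enumeration in a toy accident network: two generic factors
        # and an observed severe outcome. Probabilities are illustrative.
        from itertools import product

        p_slippery = {True: 0.3, False: 0.7}
        p_load     = {True: 0.4, False: 0.6}
        # P(severe | slippery surface, carrying load)
        p_severe = {(True, True): 0.8, (True, False): 0.5,
                    (False, True): 0.3, (False, False): 0.05}

        def mpe(observed_severe=True):
            best, best_p = None, -1.0
            for s, l in product([True, False], repeat=2):
                likelihood = p_severe[(s, l)] if observed_severe else 1 - p_severe[(s, l)]
                p = p_slippery[s] * p_load[l] * likelihood
                if p > best_p:
                    best, best_p = {"slippery": s, "load": l}, p
            return best, best_p

        print(mpe())  # most probable factor combination given a severe accident

    With these numbers the MPE is the "slippery and carrying a load" scenario (joint probability 0.096); a real BN toolkit would replace the enumeration with an exact inference algorithm.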

  8. THEORIZING HYBRIDITY: INSTITUTIONAL LOGICS, COMPLEX ORGANIZATIONS, AND ACTOR IDENTITIES: THE CASE OF NONPROFITS.

    PubMed

    Skelcher, Chris; Smith, Steven Rathgeb

    2015-06-01

    We propose a novel approach to theorizing hybridity in public and nonprofit organizations. The concept of hybridity is widely used to describe organizational responses to changes in governance, but the literature seldom explains how hybrids arise or what forms they take. Transaction cost and organizational design literatures offer some solutions, but lack a theory of agency. We use the institutional logics approach to theorize hybrids as entities that face a plurality of normative frames. Logics provide symbolic and material elements that structure organizational legitimacy and actor identities. Contradictions between institutional logics offer space for them to be elaborated and creatively reconstructed by situated agents. We propose five types of organizational hybridity - segmented, segregated, assimilated, blended, and blocked. Each type is theoretically derived from empirically observed variations in organizational responses to institutional plurality. We develop propositions to show how our approach to hybridity adds value to academic and policy-maker audiences.

  9. Answer Sets in a Fuzzy Equilibrium Logic

    NASA Astrophysics Data System (ADS)

    Schockaert, Steven; Janssen, Jeroen; Vermeir, Dirk; de Cock, Martine

    Since its introduction, answer set programming has been generalized in many directions, to cater to the needs of real-world applications. As one of the most general “classical” approaches, answer sets of arbitrary propositional theories can be defined as models in the equilibrium logic of Pearce. Fuzzy answer set programming, on the other hand, extends answer set programming with the capability of modeling continuous systems. In this paper, we combine the expressiveness of both approaches, and define answer sets of arbitrary fuzzy propositional theories as models in a fuzzification of equilibrium logic. We show that the resulting notion of answer set is compatible with existing definitions, when the syntactic restrictions of the corresponding approaches are met. We furthermore locate the complexity of the main reasoning tasks at the second level of the polynomial hierarchy. Finally, as an illustration of its modeling power, we show how fuzzy equilibrium logic can be used to find strong Nash equilibria.
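
    Although the paper works at the level of equilibrium models, the underlying shift from Boolean to fuzzy truth values can be illustrated with the Lukasiewicz connectives, one common choice in fuzzy answer set programming; the sketch below is generic, not the paper's formalism.

        # Lukasiewicz connectives: truth values live in [0, 1] instead of {0, 1}.
        def t_and(a, b):    # strong conjunction (t-norm)
            return max(0.0, a + b - 1.0)

        def t_or(a, b):     # strong disjunction (t-conorm)
            return min(1.0, a + b)

        def implies(a, b):  # residual implication
            return min(1.0, 1.0 - a + b)

        # A rule "conclusion <- premise" is satisfied to the degree
        # implies(premise, conclusion); degree 1.0 means fully satisfied.
        print(t_and(0.7, 0.6), implies(0.7, 0.4))  # approx 0.3 and 0.7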

  10. Justice, Sacrifice, and the Universal Audience: George Bush's "Address to the Nation Announcing Allied Military Action in the Persian Gulf."

    ERIC Educational Resources Information Center

    Pearce, Kimber Charles; Fadely, Dean

    1992-01-01

    Analyzes the quasi-logical argumentative framework of George Bush's address in which he endeavored to gain compliance and justify his actions at the beginning of the Persian Gulf War. Identifies arguments of comparison and sacrifice within that framework and examines the role of justice in the speech. (TB)

  11. The Interrelations of Features of Questions, Mark Schemes and Examinee Responses and Their Impact upon Marker Agreement

    ERIC Educational Resources Information Center

    Black, Beth; Suto, Irenka; Bramley, Tom

    2011-01-01

    In this paper we develop an evidence-based framework for considering many of the factors affecting marker agreement in GCSEs and A levels. A logical analysis of the demands of the marking task suggests a core grouping comprising: (i) question features; (ii) mark scheme features; and (iii) examinee response features. The framework synthesises…

  12. Embedding Term Similarity and Inverse Document Frequency into a Logical Model of Information Retrieval.

    ERIC Educational Resources Information Center

    Losada, David E.; Barreiro, Alvaro

    2003-01-01

    Proposes an approach to incorporate term similarity and inverse document frequency into a logical model of information retrieval. Highlights include document representation and matching; incorporating term similarity into the measure of distance; new algorithms for implementation; inverse document frequency; and logical versus classical models of…
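
    For reference, the inverse document frequency the record refers to has the standard form

        \mathrm{idf}(t) = \log \frac{N}{n_t}

    where N is the number of documents in the collection and n_t is the number of documents containing term t; rare terms therefore receive higher weight when folded into the logical matching measure.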

  13. Leveraging Structure: Logical Necessity in the Context of Integer Arithmetic

    ERIC Educational Resources Information Center

    Bishop, Jessica Pierson; Lamb, Lisa L.; Philipp, Randolph A.; Whitacre, Ian; Schappelle, Bonnie P.

    2016-01-01

    Looking for, recognizing, and using underlying mathematical structure is an important aspect of mathematical reasoning. We explore the use of mathematical structure in children's integer strategies by developing and exemplifying the construct of logical necessity. Students in our study used logical necessity to approach and use numbers in a…

  14. A novel reversible logic gate and its systematic approach to implement cost-efficient arithmetic logic circuits using QCA.

    PubMed

    Ahmad, Peer Zahoor; Quadri, S M K; Ahmad, Firdous; Bahar, Ali Newaz; Wani, Ghulam Mohammad; Tantary, Shafiq Maqbool

    2017-12-01

    Quantum-dot cellular automata (QCA) is an extremely small-size, ultra-low-power nanotechnology and a possible alternative to current CMOS technology. Reversible QCA logic is currently the most important issue for reducing power losses. This paper presents a novel reversible logic gate called the F-Gate. It is simple in design and offers a powerful technique for implementing reversible logic. A systematic approach has been used to implement a novel single-layer reversible Full-Adder, Full-Subtractor and Full Adder-Subtractor using the F-Gate. The proposed Full Adder-Subtractor achieves significant improvements in overall circuit parameters over the most cost-efficient previous designs that exploit the inevitable nano-level issues to perform arithmetic computing. The proposed designs have been validated and simulated using the QCADesigner tool, version 2.0.3.
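
    For context, QCA logic is conventionally synthesized from a three-input majority gate plus an inverter; fixing one majority input to a constant yields AND or OR. The sketch below shows these standard primitives only (the F-Gate's own truth table is not reproduced here).

        # Standard QCA logic primitives: a 3-input majority gate, from which
        # AND and OR are obtained by pinning one input to 0 or 1.
        def majority(a, b, c):
            return (a & b) | (b & c) | (a & c)

        def qca_and(a, b):
            return majority(a, b, 0)   # third input pinned to logic 0

        def qca_or(a, b):
            return majority(a, b, 1)   # third input pinned to logic 1

        for a in (0, 1):
            for b in (0, 1):
                assert qca_and(a, b) == a & b
                assert qca_or(a, b) == a | b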

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jianguo; Hull, Vanessa; Batistella, Mateus

    Interactions between distant places are increasingly widespread and influential, often leading to unexpected outcomes with profound implications for sustainability. Numerous sustainability studies have been conducted within a particular place with little attention to the impacts of distant interactions on sustainability in multiple places. Although distant forces have been studied, they are usually treated as exogenous variables and feedbacks have rarely been considered. To understand and integrate various distant interactions better, we propose an integrated framework based on telecoupling, an umbrella concept that refers to socioeconomic and environmental interactions over distances. The concept of telecoupling is a logical extension of research on coupled human and natural systems, in which human and natural systems interact within particular places. The telecoupling framework contains five major interrelated components (coupled human and natural systems, agents, flows, causes, and effects). We illustrate the framework using two examples of distant interactions, highlight the implications of the framework, and discuss research needs and approaches to move research on telecouplings forward. The framework can help better analyze system components and their interrelationships, identify research gaps, detect hidden costs and untapped benefits, provide a useful means to incorporate feedbacks as well as trade-offs and synergies across multiple places (sending, receiving, and spillover systems), and improve the understanding of distant interactions and the effectiveness of policies for socioeconomic and environmental sustainability from local to global levels.

  16. Development of a Logic Model for a Physical Activity–Based Employee Wellness Program for Mass Transit Workers

    PubMed Central

    Petruzzello, Steven J.; Ryan, Katherine E.

    2014-01-01

    Transportation workers, who constitute a large sector of the workforce, have worksite factors that harm their health. Worksite wellness programs must target this at-risk population. Although physical activity is often a component of worksite wellness logic models, we consider it the cornerstone for improving the health of mass transit employees. Program theory was based on in-person interviews and focus groups of employees. We identified 4 short-term outcome categories, which provided a chain of responses based on the program activities that should lead to the desired end results. This logic model may have significant public health impact, because it can serve as a framework for other US mass transit districts and worksite populations that face similar barriers to wellness, including truck drivers, railroad employees, and pilots. The objective of this article is to discuss the development of a logic model for a physical activity–based mass-transit employee wellness program by describing the target population, program theory, the components of the logic model, and the process of its development. PMID:25032838

  17. Development of a logic model for a physical activity-based employee wellness program for mass transit workers.

    PubMed

    Das, Bhibha M; Petruzzello, Steven J; Ryan, Katherine E

    2014-07-17

    Transportation workers, who constitute a large sector of the workforce, have worksite factors that harm their health. Worksite wellness programs must target this at-risk population. Although physical activity is often a component of worksite wellness logic models, we consider it the cornerstone for improving the health of mass transit employees. Program theory was based on in-person interviews and focus groups of employees. We identified 4 short-term outcome categories, which provided a chain of responses based on the program activities that should lead to the desired end results. This logic model may have significant public health impact, because it can serve as a framework for other US mass transit districts and worksite populations that face similar barriers to wellness, including truck drivers, railroad employees, and pilots. The objective of this article is to discuss the development of a logic model for a physical activity-based mass-transit employee wellness program by describing the target population, program theory, the components of the logic model, and the process of its development.

  18. Evaluation of Model-Based Training for Vertical Guidance Logic

    NASA Technical Reports Server (NTRS)

    Feary, Michael; Palmer, Everett; Sherry, Lance; Polson, Peter; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper will summarize the results of a study which introduces a structured, model based approach to learning how the automated vertical guidance system works on a modern commercial air transport. The study proposes a framework to provide accurate and complete information in an attempt to eliminate confusion about 'what the system is doing'. This study will examine a structured methodology for organizing the ideas on which the system was designed, communicating this information through the training material, and displaying it in the airplane. Previous research on model-based, computer aided instructional technology has shown reductions in the amount of time to a specified level of competence. The lessons learned from the development of these technologies are well suited for use with the design methodology which was used to develop the vertical guidance logic for a large commercial air transport. The design methodology presents the model from which to derive the training material, and the content of information to be displayed to the operator. The study consists of a 2 X 2 factorial experiment which will compare a new method of training vertical guidance logic and a new type of display. The format of the material used to derive both the training and the display will be provided by the Operational Procedure Methodology. The training condition will compare current training material to the new structured format. The display condition will involve a change of the content of the information displayed into pieces that agree with the concepts with which the system was designed.

  19. Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems

    NASA Astrophysics Data System (ADS)

    Abeynayake, Canicious; Tran, Minh D.

    2015-05-01

    Vehicle Mounted Metal Detector (VMMD) systems are widely used for detection of threat objects in humanitarian demining and military route clearance scenarios. Due to the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining the capability boundaries of any sensor-based demining equipment. Evaluation of sensor-based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders with different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or a series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD, developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by implementing the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.
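
    A minimal sketch of the kind of fuzzy evaluation described above: triangular membership functions grade two performance criteria, and simple rules combine them into one fitness score. The membership shapes, rules and criteria are illustrative assumptions, not the report's elicited expert model.

        # Fuzzy fitness score for a detector from two criteria.
        def tri(x, a, b, c):
            # triangular membership function peaking at b
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def evaluate(detection_rate, false_alarm_rate):
            high_det = tri(detection_rate, 0.6, 1.0, 1.4)     # ~1.0 is "high"
            low_fa   = tri(false_alarm_rate, -0.4, 0.0, 0.4)  # ~0.0 is "low"
            # Rule 1: IF detection high AND false alarms low THEN fitness good
            good = min(high_det, low_fa)
            # Rule 2: IF detection not high OR false alarms not low THEN poor
            poor = max(1 - high_det, 1 - low_fa)
            # defuzzify: weighted average of the labels good=1.0 and poor=0.0
            return good / (good + poor)

        print(evaluate(0.92, 0.10))  # a single fitness grade in [0, 1]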

  20. Reversible logic gates based on enzyme-biocatalyzed reactions and realized in flow cells: a modular approach.

    PubMed

    Fratto, Brian E; Katz, Evgeny

    2015-05-18

    Reversible logic gates, such as the double Feynman gate, Toffoli gate and Peres gate, with 3-input/3-output channels are realized using reactions biocatalyzed with enzymes and performed in flow systems. The flow devices are constructed using a modular approach, where each flow cell is modified with one enzyme that biocatalyzes one chemical reaction. The multi-step processes mimicking the reversible logic gates are organized by combining the biocatalytic cells in different networks. This work emphasizes logical but not physical reversibility of the constructed systems. Their advantages and disadvantages are discussed and potential use in biosensing systems, rather than in computing devices, is suggested. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
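
    The three gates named above have standard 3-input/3-output Boolean definitions, which makes their logical reversibility easy to check directly; the sketch below verifies that each gate is a bijection on the eight input states (it does not model the enzymatic flow cells themselves).

        # Standard definitions of the double Feynman, Toffoli and Peres gates,
        # with a reversibility (bijectivity) check over all 8 input states.
        from itertools import product

        def double_feynman(a, b, c):
            return a, a ^ b, a ^ c

        def toffoli(a, b, c):
            return a, b, c ^ (a & b)

        def peres(a, b, c):
            return a, a ^ b, (a & b) ^ c

        for gate in (double_feynman, toffoli, peres):
            outputs = {gate(*bits) for bits in product((0, 1), repeat=3)}
            assert len(outputs) == 8, f"{gate.__name__} is not reversible"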

  1. Generalized Symbolic Execution for Model Checking and Testing

    NASA Technical Reports Server (NTRS)

    Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)

    2003-01-01

    Modern software systems, which often are concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework, based on symbolic execution, for automated checking of such systems. We provide a two-fold generalization of traditional symbolic-execution-based approaches: first, we define a program instrumentation which enables standard model checkers to perform symbolic execution; second, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings) and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and to manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure, and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
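
    The essence of symbolic execution, executing over symbolic inputs while accumulating a path constraint at each branch, can be shown in a few lines. This toy conveys only the core idea; it is not the Java PathFinder instrumentation described above.

        # Toy symbolic execution: fork at each branch of the program under
        # test ("if x > 10: if x < 20: bug()") and record the path constraint.
        # A constraint solver would turn each path into a concrete test input.
        def symbolic_paths():
            paths = []
            for first in (True, False):          # fork on x > 10
                pc = ["x > 10"] if first else ["x <= 10"]
                if not first:
                    paths.append((pc, "ok"))
                    continue
                for second in (True, False):     # fork on x < 20
                    pc2 = pc + (["x < 20"] if second else ["x >= 20"])
                    paths.append((pc2, "bug" if second else "ok"))
            return paths

        for constraint, outcome in symbolic_paths():
            print(" and ".join(constraint), "->", outcome)
        # a solver (e.g. Z3) would turn "x > 10 and x < 20" into the test x = 15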

  2. Model Checking Degrees of Belief in a System of Agents

    NASA Technical Reports Server (NTRS)

    Raimondi, Franco; Primero, Giuseppe; Rungta, Neha

    2014-01-01

    Reasoning about degrees of belief has been investigated in the past by a number of authors and has a number of practical applications in real life. In this paper we present a unified framework to model and verify degrees of belief in a system of agents. In particular, we describe an extension of the temporal-epistemic logic CTLK and we introduce a semantics based on interpreted systems for this extension. In this way, degrees of beliefs do not need to be provided externally, but can be derived automatically from the possible executions of the system, thereby providing a computationally grounded formalism. We leverage the semantics to (a) construct a model checking algorithm, (b) investigate its complexity, (c) provide a Java implementation of the model checking algorithm, and (d) evaluate our approach using the standard benchmark of the dining cryptographers. Finally, we provide a detailed case study: using our framework and our implementation, we assess and verify the situational awareness of the pilot of Air France 447 flying in off-nominal conditions.

  3. Two logics of policy intervention in immigrant integration: an institutionalist framework based on capabilities and aspirations.

    PubMed

    Lutz, Philipp

    2017-01-01

    The effectiveness of immigrant integration policies has gained considerable attention across Western democracies dealing with ethnically and culturally diverse societies. However, the findings on what type of policy produces more favourable integration outcomes remain inconclusive. The conflation of normative and analytical assumptions on integration is a major challenge for causal analysis of integration policies. This article applies actor-centered institutionalism as a new framework for the analysis of immigrant integration outcomes, in order to separate two different mechanisms of policy intervention. Conceptualising integration outcomes as a function of capabilities and aspirations allows the assumptions about policy intervention in assimilation and multiculturalism, the two main types of policy approach, to be separated. The article illustrates that assimilation is an incentive-based policy, primarily designed to increase immigrants' aspirations, whereas multiculturalism is an opportunity-based policy, primarily designed to increase immigrants' capabilities. Conceptualising the causal mechanisms of policy intervention clarifies the link between normative concepts of immigrant integration and analytical concepts of policy effectiveness.

  4. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews

    PubMed Central

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

    Background Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to 'think' conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. Methods and Findings In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured in only a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, which were almost always used solely to depict pictorially the way in which the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Conclusions Logic models have the potential to be an integral aid throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles for the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions. PMID:26575182

  5. Risk analysis with a fuzzy-logic approach of a complex installation

    NASA Astrophysics Data System (ADS)

    Peikert, Tim; Garbe, Heyno; Potthast, Stefan

    2016-09-01

    This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN), and extends these techniques with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle uncertainty via probability functions and linguistic terms. The linguistic terms add expert knowledge of the investigated system or environment to the risk analysis.

  6. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.

  7. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  8. Synthetic Biology Platform for Sensing and Integrating Endogenous Transcriptional Inputs in Mammalian Cells.

    PubMed

    Angelici, Bartolomeo; Mailand, Erik; Haefliger, Benjamin; Benenson, Yaakov

    2016-08-30

    One of the goals of synthetic biology is to develop programmable artificial gene networks that can transduce multiple endogenous molecular cues to precisely control cell behavior. Realizing this vision requires interfacing natural molecular inputs with synthetic components that generate functional molecular outputs. Interfacing synthetic circuits with endogenous mammalian transcription factors has been particularly difficult. Here, we describe a systematic approach that enables integration and transduction of multiple mammalian transcription factor inputs by a synthetic network. The approach is facilitated by a proportional amplifier sensor based on synergistic positive autoregulation. The circuits efficiently transduce endogenous transcription factor levels into RNAi, transcriptional transactivation, and site-specific recombination. They also enable AND logic between pairs of arbitrary transcription factors. The results establish a framework for developing synthetic gene networks that interface with cellular processes through transcriptional regulators. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
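
    As a rough abstraction of the AND logic described above, a Hill-type response can stand in for the proportional amplifier sensor, with the gate output high only when both transcription factor inputs exceed their activation thresholds. The functional form, gains, and thresholds here are illustrative assumptions, not measured parameters from the paper.

    ```python
    # Toy abstraction of AND logic between two transcription factor inputs.
    def sensor(tf_level, threshold, gain=4.0):
        """Stand-in for the amplifier sensor, modeled as a Hill-type response."""
        return tf_level**gain / (threshold**gain + tf_level**gain)

    def and_gate(tf_a, tf_b, thr_a=1.0, thr_b=1.0):
        # Output is appreciable only when both sensors are activated.
        return sensor(tf_a, thr_a) * sensor(tf_b, thr_b)

    for a, b in [(0.2, 0.2), (3.0, 0.2), (0.2, 3.0), (3.0, 3.0)]:
        print(f"TF-A={a}, TF-B={b} -> output {and_gate(a, b):.2f}")
    ```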

  9. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.

    PubMed

    Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet

    2018-01-01

    Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR), which differ from a benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.
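
    The sketch below illustrates the flavor of such a quantitative tally: per-patient benefit minus risk, summarized by the median and compared against the comparator arm. The numbers and the months-based scoring are invented for illustration and are not the paper's actual analysis.

    ```python
    # Toy benefit-risk tally; all figures are made up.
    from statistics import median

    def net_benefit(benefit_months, risk_months):
        """Net clinical benefit per patient: months of benefit (e.g., a PFS
        gain) minus months lost to toxicity."""
        return [b - r for b, r in zip(benefit_months, risk_months)]

    drug    = net_benefit([10.2, 8.5, 12.0, 7.1], [1.0, 2.5, 0.8, 1.2])
    control = net_benefit([6.0, 5.5, 7.2, 4.9], [0.9, 1.1, 1.5, 0.7])
    net = median(drug) - median(control)
    print(f"median net benefit-risk vs comparator: {net:+.1f} months")
    # A positive value corresponds to an approvable benefit-risk profile in
    # this toy framing; the paper's actual endpoints are TTP, PFS, and ORR.
    ```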

  10. Serial DNA relay in DNA logic gates by electrical fusion and mechanical splitting of droplets

    PubMed Central

    Kawano, Ryuji; Takinoue, Masahiro; Osaki, Toshihisa; Kamiya, Koki; Miki, Norihisa

    2017-01-01

    DNA logic circuits utilizing DNA hybridization and/or enzymatic reactions have drawn increasing attention for their potential applications in the diagnosis and treatment of cellular diseases. The compartmentalization of such a system into a microdroplet considerably helps to precisely regulate local interactions and reactions between molecules. In this study, we introduced a relay approach for enabling the transfer of DNA from one droplet to another to implement multi-step sequential logic operations. We proposed electrical fusion and mechanical splitting of droplets to facilitate the DNA flow at the inputs, logic operation, output, and serial connection between two logic gates. We developed Negative-OR operations integrated by a serial connection of the OR gate and NOT gate incorporated in a series of droplets. The four types of input defined by the presence/absence of DNA in the input droplet pair were correctly reflected in the readout at the Negative-OR gate. The proposed approach potentially allows for serial and parallel logic operations that could be used for complex diagnostic applications. PMID:28700641
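
    In conventional Boolean terms, the serial relay described above composes an OR droplet with a NOT droplet to obtain Negative-OR (NOR). A toy model, with the presence of a DNA strand reduced to a Boolean:

    ```python
    # Toy model of the serial droplet relay: OR gate followed by NOT gate.
    def or_gate(dna_a, dna_b):
        """Fusion of the two input droplets: output strand present if either input is."""
        return dna_a or dna_b

    def not_gate(dna_in):
        """Inverting droplet: reporter released only when no input strand arrives."""
        return not dna_in

    def negative_or(dna_a, dna_b):
        # Mechanical splitting relays the OR droplet's output into the NOT droplet.
        return not_gate(or_gate(dna_a, dna_b))

    for a in (False, True):
        for b in (False, True):
            print(a, b, "->", negative_or(a, b))   # True only for (False, False)
    ```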

  11. Determination of Factors Related to Students' Understandings of Heat, Temperature and Internal Energy Concepts

    ERIC Educational Resources Information Center

    Gurcay, Deniz; Gulbas, Etna

    2018-01-01

    The purpose of this research is to investigate the relationships between high school students' learning approaches and logical thinking abilities and their understandings of heat, temperature and internal energy concepts. Learning Approach Questionnaire, Test of Logical Thinking and Three-Tier Heat, Temperature and Internal Energy Test were used…

  12. A Harmonized Data Quality Assessment Terminology and Framework for the Secondary Use of Electronic Health Record Data.

    PubMed

    Kahn, Michael G; Callahan, Tiffany J; Barnard, Juliana; Bauck, Alan E; Brown, Jeff; Davidson, Bruce N; Estiri, Hossein; Goerg, Carsten; Holve, Erin; Johnson, Steven G; Liaw, Siaw-Teng; Hamilton-Lopez, Marianne; Meeker, Daniella; Ong, Toan C; Ryan, Patrick; Shang, Ning; Weiskopf, Nicole G; Weng, Chunhua; Zozus, Meredith N; Schilling, Lisa

    2016-01-01

    Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is 'fit' for specific uses. DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework's inclusiveness was evaluated against ten published DQ terminologies. Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance (2) Completeness and (3) Plausibility and two DQ assessment contexts: (1) Verification and (2) Validation. Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data, or validated against an accepted gold standard, depending on proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning to multiple published DQ terminologies. Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods.
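
    A minimal sketch of how the harmonized categories might be carried in code, with a toy record check; the category and subcategory names follow the published framework, while the check logic and thresholds are illustrative assumptions:

    ```python
    # Harmonized DQ categories as a data structure, plus a toy record check.
    DQ_FRAMEWORK = {
        "Conformance":  ["Value", "Relational", "Computational"],
        "Completeness": [],
        "Plausibility": ["Uniqueness", "Atemporal", "Temporal"],
    }
    CONTEXTS = ("Verification", "Validation")   # each category is assessed in both

    def assess_record(record):
        """Return (category, finding) pairs for one toy EHR record."""
        findings = []
        if not isinstance(record.get("age"), int):
            findings.append(("Conformance", "age is not an integer"))
        if record.get("heart_rate") is None:
            findings.append(("Completeness", "heart_rate is missing"))
        elif not 20 <= record["heart_rate"] <= 250:
            findings.append(("Plausibility", "heart_rate outside a credible range"))
        return findings

    print(assess_record({"age": "47", "heart_rate": 400}))
    ```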

  13. Texas traffic thermostat software tool.

    DOT National Transportation Integrated Search

    2013-04-01

    The traffic thermostat decision tool is built to help guide the user through a logical, step-wise process of examining potential changes to their Managed Lane/toll facility. : **NOTE: Project Title: Application of the Traffic Thermostat Framework. Ap...

  14. Texas traffic thermostat marketing package.

    DOT National Transportation Integrated Search

    2013-04-01

    The traffic thermostat decision tool is built to help guide the user through a logical, step-wise process of examining potential changes to their Managed Lane/toll facility. : **NOTE: Project Title: Application of the Traffic Thermostat Framework. Ap...

  15. A process-based framework to guide nurse practitioners integration into primary healthcare teams: results from a logic analysis.

    PubMed

    Contandriopoulos, Damien; Brousselle, Astrid; Dubois, Carl-Ardy; Perroux, Mélanie; Beaulieu, Marie-Dominique; Brault, Isabelle; Kilpatrick, Kelley; D'Amour, Danielle; Sansgter-Gormley, Esther

    2015-02-27

    Integrating Nurse Practitioners into primary care teams is a process that involves significant challenges. To be successful, nurse practitioner integration into primary care teams requires, among other things, a redefinition of professional boundaries, in particular those of medicine and nursing, a coherent model of inter- and intra-professional collaboration, and team-based work processes that make the best use of the subsidiarity principle. There have been numerous studies on nurse practitioner integration, and the literature provides a comprehensive list of barriers to, and facilitators of, integration. However, this literature is much less prolific in discussing the operational-level implications of those barriers and facilitators and in offering practical recommendations. In the context of a large-scale research project on the introduction of nurse practitioners in Quebec (Canada), we relied on a logic-analysis approach based, on the one hand, on a realist review of the literature and, on the other, on qualitative case studies in six primary healthcare teams in rural and urban areas of Quebec. Five core themes that need to be taken into account when integrating nurse practitioners into primary care teams were identified. Those themes are: planning, role definition, practice model, collaboration, and team support. The present paper has two objectives: to present the methods used to develop the themes, and to discuss an integrative model of nurse practitioner integration support centered around these themes. It concludes with a discussion of how this framework contributes to existing knowledge and some ideas for future avenues of study.

  16. A Complex Systems Model Approach to Quantified Mineral Resource Appraisal

    USGS Publications Warehouse

    Gettings, M.E.; Bultman, M.W.; Fisher, F.S.

    2004-01-01

    For federal and state land management agencies, mineral resource appraisal has evolved from value-based to outcome-based procedures wherein the consequences of resource development are compared with those of other management options. Complex systems modeling is proposed as a general framework in which to build models that can evaluate outcomes. Three frequently used methods of mineral resource appraisal (subjective probabilistic estimates, weights of evidence modeling, and fuzzy logic modeling) are discussed to obtain insight into methods of incorporating complexity into mineral resource appraisal models. Fuzzy logic and weights of evidence are most easily utilized in complex systems models. A fundamental product of new appraisals is the production of reusable, accessible databases and methodologies so that appraisals can easily be repeated with new or refined data. The data are representations of complex systems and must be so regarded if all of their information content is to be utilized. The proposed generalized model framework is applicable to mineral assessment and other geoscience problems. We begin with a (fuzzy) cognitive map using (+1,0,-1) values for the links and evaluate the map for various scenarios to obtain a ranking of the importance of various links. Fieldwork and modeling studies identify important links and help identify unanticipated links. Next, the links are given membership functions in accordance with the data. Finally, processes are associated with the links; ideally, the controlling physical and chemical events and equations are found for each link. After calibration and testing, this complex systems model is used for predictions under various scenarios.
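
    The first modeling step described above, evaluating a signed cognitive map under different scenarios, can be sketched as follows; the concept names and link signs are invented for illustration, not taken from an actual assessment:

    ```python
    # Evaluating a (fuzzy) cognitive map with links drawn from {+1, 0, -1}.
    import math

    concepts = ["favorable_geology", "geochem_anomaly", "land_restriction", "deposit_outcome"]
    # links[i][j]: influence of concept i on concept j
    links = [
        [0, 1, 0, 1],    # favorable geology reinforces the anomaly and the outcome
        [0, 0, 0, 1],    # a geochemical anomaly reinforces the outcome
        [0, 0, 0, -1],   # a land restriction counts against the outcome
        [0, 0, 0, 0],
    ]

    def evaluate(inputs, target=3, steps=25):
        """Clamp the driver concepts and iterate the target activation to a fixed point."""
        state = list(inputs)
        for _ in range(steps):
            total = sum(state[i] * links[i][target] for i in range(len(state)))
            state[target] = math.tanh(total)
        return round(state[target], 3)

    print(evaluate([1.0, 1.0, 1.0, 0.0]))   # 0.762: the restriction tempers the outcome
    print(evaluate([1.0, 1.0, 0.0, 0.0]))   # 0.964: without the restriction
    ```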

  17. Generalized Philosophy of Alerting with Applications for Parallel Approach Collision Prevention

    NASA Technical Reports Server (NTRS)

    Winder, Lee F.; Kuchar, James K.

    2000-01-01

    The goal of the research was to develop formal guidelines for the design of hazard avoidance systems. An alerting system is automation designed to reduce the likelihood of undesirable outcomes that are due to rare failures in a human-controlled system. It accomplishes this by monitoring the system and issuing warning messages to the human operators when thought necessary to head off a problem. On examination of existing and recently proposed logics for alerting, it appears that few commonly accepted principles guide the design process. Different logics intended to address the same hazards may take disparate forms and emphasize different aspects of performance, because each reflects the intuitive priorities of a different designer. Because performance must be satisfactory to all users of an alerting system (implying a universal meaning of acceptable performance) and not just one designer, a proposed logic often undergoes significant piecemeal modification before gaining general acceptance. This report is an initial attempt to clarify the common performance goals by which an alerting system is ultimately judged. A better understanding of these goals will hopefully allow designers to reach the final logic in a quicker, more direct and repeatable manner. As a case study, this report compares three alerting logics for collision prevention during independent approaches to parallel runways, and outlines a fourth alternative incorporating elements of the first three but satisfying the stated requirements. Three existing logics for parallel approach alerting are described. Each follows from different intuitive principles. The logics are presented as examples of three "philosophies" of alerting system design.

  18. Assessment of groundwater vulnerability using supervised committee to combine fuzzy logic models.

    PubMed

    Nadiri, Ata Allah; Gharekhani, Maryam; Khatibi, Rahman; Moghaddam, Asghar Asghari

    2017-03-01

    Vulnerability indices of an aquifer assessed by different fuzzy logic (FL) models often give rise to differing values, with no theoretical or empirical basis to establish a validated baseline or to develop a basis for comparison between the modeling results and baselines, if any. Therefore, this research presents a supervised committee fuzzy logic (SCFL) method, which uses artificial neural networks to combine a selection of FL models. The indices are expressed by the widely used DRASTIC framework, which includes geological, hydrological, and hydrogeological parameters often subject to uncertainty. DRASTIC indices collectively represent intrinsic (or natural) vulnerability and give a sense of contaminants, such as nitrate-N, percolating to aquifers from the surface. The study area is an aquifer in the Ardabil plain, the province of Ardabil, northwest Iran. Improvements on vulnerability indices are achieved by FL techniques, which comprise Sugeno fuzzy logic (SFL), Mamdani fuzzy logic (MFL), and Larsen fuzzy logic (LFL). As the correlation between estimated DRASTIC vulnerability index values and nitrate-N values is as low as 0.4, it is improved significantly by the FL models (SFL, MFL, and LFL), which perform in similar ways but differ in detail. Their synergy is exploited by SCFL, which uses the FL modeling results 'conditioned' on nitrate-N values to raise the correlation above 0.9.
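
    To illustrate the committee idea, the sketch below combines three synthetic "fuzzy model" outputs with weights fit against observed nitrate-N; a linear least-squares combiner stands in for the paper's artificial neural network, and all data are synthetic:

    ```python
    # Supervised committee sketch: combine several model outputs under supervision.
    import numpy as np

    rng = np.random.default_rng(0)
    nitrate = rng.uniform(5, 50, size=30)            # synthetic observations
    # Each pseudo-model sees the truth through its own bias and noise (SFL, MFL, LFL).
    sfl = nitrate * 0.8 + rng.normal(0, 4, 30)
    mfl = nitrate * 1.1 + rng.normal(0, 6, 30) + 3
    lfl = nitrate * 0.9 + rng.normal(0, 5, 30) - 2

    X = np.column_stack([sfl, mfl, lfl, np.ones(30)])
    weights, *_ = np.linalg.lstsq(X, nitrate, rcond=None)   # stand-in for the ANN
    committee = X @ weights

    for name, est in [("SFL", sfl), ("MFL", mfl), ("LFL", lfl), ("SCFL", committee)]:
        print(name, round(np.corrcoef(est, nitrate)[0, 1], 3))
    # The committee's correlation is at least as high as any single model's.
    ```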

  19. All optical programmable logic array (PLA)

    NASA Astrophysics Data System (ADS)

    Hiluf, Dawit

    2018-03-01

    A programmable logic array (PLA) is an integrated circuit (IC) logic device that can be reconfigured to implement various kinds of combinational logic circuits. The device has a number of AND and OR gates which are linked together to give an output, or further combined with more gates or logic circuits. This work presents the realization of PLAs via the physics of a three-level system interacting with light. A programmable logic array is designed such that a number of different logical functions can be combined in sum-of-products or product-of-sums form. We present an all-optical PLA with the aid of laser light and observables of quantum systems, where the encoded information can be considered as a memory chip. The dynamics of the physical system is investigated using a Lie algebra approach.
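
    For reference, the conventional electronic formulation of the PLA the optical device realizes: an AND plane of product terms over the inputs, and an OR plane that sums selected terms per output. The gate programming below is an invented example:

    ```python
    # PLA in sum-of-products form: AND plane of product terms, OR plane per output.
    def product(inputs, literals):
        """literals: {input_index: required_bit}; AND of the selected literals."""
        return all(inputs[i] == bit for i, bit in literals.items())

    def pla(inputs, and_plane, or_plane):
        terms = [product(inputs, lits) for lits in and_plane]
        return [int(any(terms[t] for t in out_terms)) for out_terms in or_plane]

    # Program the array for XOR and AND of two inputs (a, b):
    and_plane = [
        {0: 1, 1: 0},   # term 0: a AND NOT b
        {0: 0, 1: 1},   # term 1: NOT a AND b
        {0: 1, 1: 1},   # term 2: a AND b
    ]
    or_plane = [[0, 1], [2]]    # output 0 = XOR, output 1 = AND

    for a in (0, 1):
        for b in (0, 1):
            print((a, b), pla((a, b), and_plane, or_plane))
    ```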

  20. Logical inference approach to relativistic quantum mechanics: Derivation of the Klein–Gordon equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donker, H.C., E-mail: h.donker@science.ru.nl; Katsnelson, M.I.; De Raedt, H.

    2016-09-15

    The logical inference approach to quantum theory, proposed earlier by De Raedt et al. (2014), is considered in a relativistic setting. It is shown that the Klein–Gordon equation for a massive, charged, and spinless particle derives from the combination of the requirements that the space–time data collected by probing the particle is obtained from the most robust experiment and that, on average, the classical relativistic equation of motion of a particle holds. Highlights: • Logical inference applied to relativistic, massive, charged, and spinless particle experiments leads to the Klein–Gordon equation. • The relativistic Hamilton–Jacobi equation is scrutinized by employing a field description for the four-velocity. • Logical inference allows analysis of experiments with uncertainty in detection events and experimental conditions.
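
    For reference, a standard statement of the equation being derived (conventions are ours; natural units with hbar = c = 1):

    ```latex
    % Free Klein--Gordon equation for a spinless particle of mass m:
    \left(\partial_\mu \partial^\mu + m^2\right)\psi(x) = 0
    % With minimal coupling to an electromagnetic potential A_mu for charge q:
    \left[\left(i\partial_\mu - qA_\mu\right)\left(i\partial^\mu - qA^\mu\right) - m^2\right]\psi(x) = 0
    ```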

  1. Pragmatics of policy: the compliance of dutch environmental policy instruments to European union standards.

    PubMed

    Kruitwagen, Sonja; Reudink, Melchert; Faber, Albert

    2009-04-01

    Despite a general decrease in Dutch environmental emission trends, it remains difficult to comply with European Union (EU) environmental policy targets. Furthermore, environmental issues have become increasingly complex and entangled with society. Therefore, Dutch environmental policy follows a pragmatic line by adopting a flexible approach to compliance, rather than aiming at further reduction at the source of emission. This may be politically useful in order to adequately reach EU targets, but restoration of environmental conditions may be delayed. However, due to the complexity of today's environmental issues, the restoration of environmental conditions might not be the only standard for a proper policy approach. Consequently, this raises the question of how the Dutch pragmatic approach to compliance qualifies in a broader policy assessment. In order to answer this question, we adapt a policy assessment framework developed by Hemerijck and Hazeu (Bestuurskunde 13(2), 2004), based on the dimensions of legitimacy and policy logic. We apply this framework to three environmental policy assessments: flexible instruments in climate policy, fine-tuning of national and local measures to meet air quality standards, and derogation for the Nitrate Directive. We conclude with general assessment notes on the application of flexible instruments in environmental policy, showing that a broad and comprehensive perspective can help to understand the arguments for putting such policy instruments into place and to identify trade-offs between assessment criteria.

  2. To manage inland fisheries is to manage at the social-ecological watershed scale.

    PubMed

    Nguyen, Vivian M; Lynch, Abigail J; Young, Nathan; Cowx, Ian G; Beard, T Douglas; Taylor, William W; Cooke, Steven J

    2016-10-01

    Approaches to managing inland fisheries vary between systems and regions but are often based on large-scale marine fisheries principles and are thus limited and outdated. Rarely do they adopt holistic approaches that consider the complex interplay among humans, fish, and the environment. We argue that there is an urgent need for a shift in inland fisheries management towards holistic and transdisciplinary approaches that embrace the principles of social-ecological systems at the watershed scale. The interconnectedness of inland fisheries with their associated watershed (biotic, abiotic, and human components) makes them extremely complex and challenging to manage and protect. For this reason, the watershed is a logical management unit. To assist management at this scale, we propose a framework that integrates disparate concepts and management paradigms to facilitate inland fisheries management and sustainability. We contend that inland fisheries need to be managed as social-ecological watershed systems (SEWS). The framework supports watershed-scale and transboundary governance to manage inland fisheries, and transdisciplinary projects and teams to ensure relevant and applicable monitoring and research. We discuss concepts of social-ecological feedback and interactions of multiple stressors and factors within and between the social-ecological systems. Moreover, we emphasize that management, monitoring, and research on inland fisheries at the watershed scale are needed to ensure long-term sustainable and resilient fisheries. Copyright © 2016. Published by Elsevier Ltd.

  3. The effect of output-input isolation on the scaling and energy consumption of all-spin logic devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Jiaxi; Haratipour, Nazila; Koester, Steven J., E-mail: skoester@umn.edu

    All-spin logic (ASL) is a novel approach for digital logic applications wherein spin is used as the state variable instead of charge. One of the challenges in realizing a practical ASL system is the need to ensure non-reciprocity, meaning that information flows from input to output, not vice versa. One approach, described previously, is to introduce an asymmetric ground contact, and while this approach was shown to be effective, it remains unclear what the optimal approach is for achieving non-reciprocity in ASL. In this study, we quantitatively analyze techniques to achieve non-reciprocity in ASL devices, and we specifically compare the effect of using an asymmetric ground position and dipole-coupled output/input isolation. For this analysis, we simulate the switching dynamics of multiple-stage logic devices with FePt and FePd perpendicular magnetic anisotropy materials using a combination of a matrix-based spin circuit model coupled to the Landau–Lifshitz–Gilbert equation. The dipole field is included in this model and can act as both a desirable means of coupling magnets and a source of noise. The dynamic energy consumption has been calculated for these schemes as a function of input/output magnet separation, and the results show that a scheme that electrically isolates logic stages produces superior non-reciprocity, thus allowing both improved scaling and reduced energy consumption.
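
    For reference, the Landau–Lifshitz–Gilbert equation in its standard Gilbert form (our notation; in the full spin-circuit model, spin-transfer torque enters as additional terms not shown here):

    ```latex
    % m: unit magnetization, gamma: gyromagnetic ratio, alpha: Gilbert damping,
    % H_eff: effective field (including, in this study, the dipole field).
    \frac{d\mathbf{m}}{dt} = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
      + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}
    ```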

  4. THEORIZING HYBRIDITY: INSTITUTIONAL LOGICS, COMPLEX ORGANIZATIONS, AND ACTOR IDENTITIES: THE CASE OF NONPROFITS

    PubMed Central

    SKELCHER, CHRIS; SMITH, STEVEN RATHGEB

    2015-01-01

    We propose a novel approach to theorizing hybridity in public and nonprofit organizations. The concept of hybridity is widely used to describe organizational responses to changes in governance, but the literature seldom explains how hybrids arise or what forms they take. Transaction cost and organizational design literatures offer some solutions, but lack a theory of agency. We use the institutional logics approach to theorize hybrids as entities that face a plurality of normative frames. Logics provide symbolic and material elements that structure organizational legitimacy and actor identities. Contradictions between institutional logics offer space for them to be elaborated and creatively reconstructed by situated agents. We propose five types of organizational hybridity – segmented, segregated, assimilated, blended, and blocked. Each type is theoretically derived from empirically observed variations in organizational responses to institutional plurality. We develop propositions to show how our approach to hybridity adds value to academic and policy-maker audiences. PMID:26640298

  5. Combining Domain-driven Design and Mashups for Service Development

    NASA Astrophysics Data System (ADS)

    Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni

    This chapter presents the Romulus project approach to service development using Java-based web technologies. Romulus aims at improving the productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain-Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, the Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates application generation from the domain model. This metaframework follows an object-centric approach, and complements Domain-Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.

  6. A logical approach to semantic interoperability in healthcare.

    PubMed

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.

  7. A Logical Approach to the Statement of Cash Flows

    ERIC Educational Resources Information Center

    Petro, Fred; Gean, Farrell

    2014-01-01

    Of the three financial statements in financial reporting, the Statement of Cash Flows (SCF) is perhaps the most challenging. The most difficult aspect of the SCF is in developing an understanding of how previous transactions are finalized in this document. The purpose of this paper is to logically explain the indirect approach of cash flow whereby…
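
    A worked toy example of the indirect approach the paper explains, with invented figures: start from net income, add back non-cash expenses, and adjust for working-capital changes.

    ```python
    # Indirect method for cash from operations; all figures are invented.
    net_income         = 120_000
    depreciation       =  30_000   # non-cash expense: add back
    ar_increase        =  15_000   # receivables up: cash not yet collected, subtract
    inventory_decrease =   5_000   # inventory down: cash freed, add
    ap_increase        =  10_000   # payables up: cash retained, add

    cash_from_operations = (net_income + depreciation
                            - ar_increase + inventory_decrease + ap_increase)
    print(cash_from_operations)    # 150000
    ```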

  8. Introduction to Papers from the 5th Workshop on Language Production: The Neural Bases of Language Production

    ERIC Educational Resources Information Center

    Rapp, Brenda; Miozzo, Michele

    2011-01-01

    The papers in this special issue of "Language and Cognitive Processing" on the neural bases of language production illustrate two general approaches in current cognitive neuroscience. One approach focuses on investigating cognitive issues, making use of the logic of associations/dissociations or the logic of neural markers as key investigative…

  9. Critical Analysis of Textbooks: Knowledge-Generating Logics and the Emerging Image of "Global Economic Contexts"

    ERIC Educational Resources Information Center

    Thoma, Michael

    2017-01-01

    This paper presents an approach to the critical analysis of textbook knowledge, which, working from a discourse theory perspective (based on the work of Foucault), refers to the performative nature of language. The critical potential of the approach derives from an analysis of knowledge-generating logics, which produce particular images of reality…

  10. Science, a Psychological versus a Logical Approach in Teaching

    ERIC Educational Resources Information Center

    Ediger, Marlow

    2015-01-01

    Under which approach do pupils attain more optimally, a logical versus a psychological procedure of instruction? Pupils do need to achieve well in a world of science. Science is all around us and pupils need to understand various principles and laws of science. Thus, teachers in the school curriculum must choose carefully objectives for pupil…

  11. Logical-Rule Models of Classification Response Times: A Synthesis of Mental-Architecture, Random-Walk, and Decision-Bound Approaches

    ERIC Educational Resources Information Center

    Fific, Mario; Little, Daniel R.; Nosofsky, Robert M.

    2010-01-01

    We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli…

  12. Metalevel programming in robotics: Some issues

    NASA Technical Reports Server (NTRS)

    Kumarn, A.; Parameswaran, N.

    1987-01-01

    Computing in robotics has two important requirements: efficiency and flexibility. Algorithms for robot actions are usually implemented in procedural languages such as VAL and AL. However, since their excessive bindings create inflexible structures of computation, it is proposed that Logic Programming is a more suitable language for robot programming due to its non-determinism, declarative nature, and provision for metalevel programming. Logic Programming, however, results in inefficient computations. As a solution to this problem, the authors discuss a framework in which controls can be described to improve efficiency. Controls are divided into (1) in-code and (2) metalevel, and are discussed with reference to the selection of rules and dataflow. The merit of Logic Programming is illustrated by modelling the motion of a robot from one point to another while avoiding obstacles.

  13. Chol understandings of suicide and human agency.

    PubMed

    Imberton, Gracia

    2012-06-01

    According to ethnographic material collected since 2003, the Chol Mayan indigenous people in southern Mexico have different causal explanations for suicide. It can be attributed to witchcraft that forces victims to take their lives against their own will, to excessive drinking, or to fate determined by God. However, it can also be conceived of as a conscious decision made by a person overwhelmed by daily problems. Drawing from the theoretical framework developed by Laura M. Ahearn, inspired by practice theory, the paper contends that these different explanations operate within two different logics or understandings of human agency. The first logic attributes responsibility to supernatural causes such as witchcraft or divine destiny, and reflects Chol notions of personhood. The second logic accepts personal responsibility for suicide, and is related to processes of social change such as the introduction of wage labor, education and a market economy. The contemporary Chol resort to both logics to make sense of the human drama of suicide.

  14. Towards Resilient Critical Infrastructures: Application of Type-2 Fuzzy Logic in Embedded Network Security Cyber Sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Todd Vollmer; Jim Alves-Foss

    2011-08-01

    Resiliency and cyber security of modern critical infrastructures is becoming increasingly important with the growing number of threats in the cyber-environment. This paper proposes an extension to a previously developed fuzzy-logic-based anomaly detection network security cyber sensor by incorporating Type-2 Fuzzy Logic (T2 FL). In general, fuzzy logic provides a framework for system modeling in linguistic form capable of coping with imprecise and vague meanings of words. T2 FL is an extension of Type-1 FL which has proved successful in modeling and minimizing the effects of various kinds of dynamic uncertainties. In this paper, T2 FL provides a basis for robust anomaly detection and cyber security state awareness. In addition, the proposed algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental cyber-security test-bed.
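
    The sketch below illustrates the interval Type-2 idea in isolation: each input receives a membership interval [lower, upper] rather than a single grade, and the gap between the bounds (the footprint of uncertainty) models vagueness in the linguistic term itself. The term and its parameters are illustrative, not taken from the cyber-sensor implementation:

    ```python
    # Interval Type-2 fuzzy membership sketch.
    def tri(x, a, b, c):
        """Type-1 triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def type2_anomalous(rate):
        """Interval membership of a packet rate in the term 'anomalous'."""
        upper = tri(rate, 100.0, 500.0, 900.0)
        lower = 0.7 * tri(rate, 200.0, 500.0, 800.0)   # narrower, scaled-down bound
        return lower, upper

    for r in (150.0, 400.0, 500.0):
        lo_g, up_g = type2_anomalous(r)
        print(f"rate={r:6.1f}  membership in [{lo_g:.2f}, {up_g:.2f}]")
    ```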

  15. Test aspects of the JPL Viterbi decoder

    NASA Technical Reports Server (NTRS)

    Breuer, M. A.

    1989-01-01

    The generation of test vectors and design-for-test aspects of the Jet Propulsion Laboratory (JPL) Very Large Scale Integration (VLSI) Viterbi decoder chip is discussed. Each processor integrated circuit (IC) contains over 20,000 gates. To achieve a high degree of testability, a scan architecture is employed. The logic has been partitioned so that very few test vectors are required to test the entire chip. In addition, since several blocks of logic are replicated numerous times on this chip, test vectors need only be generated for each block, rather than for the entire circuit. These unique blocks of logic have been identified and test sets generated for them. The approach employed for testing was to use pseudo-exhaustive test vectors whenever feasible; that is, each cone of logic is tested exhaustively. Using this approach, no detailed logic design or fault model is required. All faults which modify the function of a block of combinational logic are detected, such as all irredundant single and multiple stuck-at faults.
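
    A sketch of the pseudo-exhaustive idea on an invented two-output circuit: each output's cone of logic is driven with all combinations of the inputs it actually depends on, rather than exhausting the full input space:

    ```python
    # Pseudo-exhaustive testing: exhaust each output cone, not the whole circuit.
    from itertools import product

    def cone_inputs(netlist, output):
        """Transitively collect the primary inputs feeding one output (its cone)."""
        inputs = set()
        for src in netlist[output]["inputs"]:
            inputs |= cone_inputs(netlist, src) if src in netlist else {src}
        return inputs

    # Tiny circuit with primary inputs a, b, c: y0 depends on a, b; y1 on b, c.
    netlist = {
        "y0": {"inputs": ["a", "b"], "fn": lambda v: v["a"] and not v["b"]},
        "y1": {"inputs": ["b", "c"], "fn": lambda v: v["b"] ^ v["c"]},
    }

    for out, node in netlist.items():
        cone = sorted(cone_inputs(netlist, out))
        print(f"{out}: exhaustively testing cone {cone} "
              f"({2 ** len(cone)} vectors instead of {2 ** 3})")
        for bits in product([0, 1], repeat=len(cone)):
            vector = dict(zip(cone, bits))
            response = node["fn"](vector)      # compare against a golden model
            print("  ", vector, "->", int(response))
    ```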

  16. Nature and place of crime scene management within forensic sciences.

    PubMed

    Crispino, Frank

    2008-03-01

    This short paper presents the preliminary results of a recent study aimed at identifying, through an epistemological analysis, the parameters relevant to qualifying forensic science as a science. The reader is invited to reflect upon references, within a historical and logical framework, which assert that forensic science is based upon two fundamental principles (those of Locard and Kirk). The basis of the assertion that forensic science is indeed a science should be appreciated not only on a single epistemological criterion (as was Popper's falsification criterion raised in the Daubert hearing), but also on the logical frameworks used by the individuals involved (investigator, expert witness and trier of fact) from the crime scene examination to the final interpretation of the evidence. Hence, it can be argued that the management of the crime scene should be integrated into the scientific way of thinking rather than remain a technical discipline as recently suggested by Harrison.

  17. An acceleration framework for synthetic aperture radar algorithms

    NASA Astrophysics Data System (ADS)

    Kim, Youngsoo; Gloster, Clay S.; Alexander, Winser E.

    2017-04-01

    Algorithms for radar signal processing, such as Synthetic Aperture Radar (SAR), are computationally intensive and require considerable execution time on a general-purpose processor. Reconfigurable logic can be used to off-load the primary computational kernel onto a custom computing machine in order to reduce execution time by an order of magnitude compared to kernel execution on a general-purpose processor. Specifically, Field Programmable Gate Arrays (FPGAs) can be used to accelerate these kernels with hardware-based custom logic implementations. In this paper, we demonstrate a framework for algorithm acceleration. We used SAR as a case study to illustrate the potential for algorithm acceleration offered by FPGAs. Initially, we profiled the SAR algorithm and implemented a homomorphic filter using a hardware implementation of the natural logarithm. Experimental results show a linear speedup from adding reasonably small processing elements in the FPGA, as opposed to using a software implementation running on a typical general-purpose processor.
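
    The homomorphic-filtering kernel mentioned above can be sketched as follows: take the natural log of the image (turning multiplicative speckle into additive noise), smooth, and exponentiate back. A 3x3 box blur stands in for the actual filter, and the data are synthetic:

    ```python
    # Homomorphic filtering sketch: log -> smooth -> exp.
    import numpy as np

    rng = np.random.default_rng(1)
    scene = np.outer(np.linspace(1.0, 10.0, 64), np.linspace(1.0, 10.0, 64))
    speckle = rng.gamma(shape=4.0, scale=0.25, size=scene.shape)  # multiplicative, mean ~1
    image = scene * speckle

    log_img = np.log(image)                    # multiplicative noise becomes additive
    padded = np.pad(log_img, 1, mode="edge")   # 3x3 box blur as a stand-in filter
    smoothed = sum(padded[dy:dy + 64, dx:dx + 64]
                   for dy in range(3) for dx in range(3)) / 9.0
    restored = np.exp(smoothed)                # return to the intensity domain

    print(f"relative speckle std before: {(image / scene).std():.3f}, "
          f"after: {(restored / scene).std():.3f}")
    ```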

  18. Hierarchical semantic structures for medical NLP.

    PubMed

    Taira, Ricky K; Arnold, Corey W

    2013-01-01

    We present a framework for building a medical natural language processing (NLP) system capable of deep understanding of clinical text reports. The framework helps developers understand how various NLP-related efforts and knowledge sources can be integrated. The aspects considered include: 1) computational issues dealing with defining layers of intermediate semantic structures to reduce the dimensionality of the NLP problem; 2) algorithmic issues, in which we survey the NLP literature and discuss state-of-the-art procedures used to map between various levels of the hierarchy; and 3) implementation issues facing software developers, given the available resources. The objective of this poster is to educate readers about the various levels of semantic representation (e.g., word-level concepts, ontological concepts, logical relations, logical frames, discourse structures, etc.). The poster presents an architecture in which diverse efforts and resources in medical NLP can be integrated in a principled way.

  19. Graphical approach for multiple values logic minimization

    NASA Astrophysics Data System (ADS)

    Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.

    1999-03-01

    Multiple valued logic (MVL) is sought for designing high complexity, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system is dependent on optimization of cost, which directly affects the optical setup. We propose a minimization technique for MVL logic optimization based on graphical visualization, such as a Karnaugh map. The proposed method is utilized to solve signed-digit binary and trinary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.

  20. Toward a Postmodern Pragmatic Discourse Semioethics for Brain Injury Care: Empirically Driven Group Inquiry as a Dialogical Practice in Pursuit of the Peircean Aesthetic Ideal of 'Reasonableness'.

    PubMed

    Goldberg, Gary

    2017-05-01

    A postmodern framework is proposed for conceptualizing the impact of brain injury on the subjective being of the injured person. Semiosis, the 'action of signs,' is argued to be necessary for this recovery of subjectivity, which escapes the mechanistic materialism and mind-matter dualism of modern science. Ethical dilemmas in brain injury care are best approached through an empirical 'semioethics' implemented as a dialogical practice among a group of selected stakeholders seeking a logical solution that best addresses the criterion of maximizing reasonableness, as a tempering of rationality with relational concerns in the face of the constraints imposed by the injury. Published by Elsevier Inc.

  1. Braking the bandwagon: scrutinizing the science and politics of empirically supported therapies.

    PubMed

    Hagemoser, Steven D

    2009-12-01

    Proponents of empirically supported therapies (ESTs) argue that because manualized ESTs have demonstrated efficacy in treating a range of psychological disorders, they should be the treatments of choice. In this article, the author uses a hypothetical treatment for obesity to highlight numerous flaws in EST logic and argues for common factors as a more clinically relevant but empirically challenging approach. The author then explores how political variables may be contributing to the expansion of EST and the resulting restriction of practitioner autonomy. Last, the author argues that EST is best viewed as 1 component of a more comprehensive evidence-based practice framework. The author concludes with some cautionary statements about the perils of equating the EST paradigm with the scientist-practitioner ideal.

  2. Reply [to “Comment on ‘The Zen of Venn’” by Priestley Toulmin

    NASA Astrophysics Data System (ADS)

    Berkman, Paul Arthur

    While Venn diagrams, “strictly speaking,” may not have been designed for the “peritechnical literature” they certainly provide a symbolic framework for integrating concepts beyond the context of “mathematically defined objects.” It is interesting that Toulmin was offended and compelled to protest the application of Venn diagrams that are not bound by his “valid methodology.” Such disciplinary constraints on creativity appear contrary to the original writings of John Venn who esteemed interdisciplinary approaches and argued fiercely against those who objected to his introducing mathematical symbols into logic [Venn, 1894]. “Symbolic Logic” itself was crafted with a view toward a general utility “in the solution of complicated problems” [Venn, 1894].

  3. Approach for Autonomous Control of Unmanned Aerial Vehicle Using Intelligent Agents for Knowledge Creation

    NASA Technical Reports Server (NTRS)

    Dufrene, Warren R., Jr.

    2004-01-01

    This paper describes the development of a planned approach for autonomous operation of an Unmanned Aerial Vehicle (UAV). A hybrid approach will seek to provide knowledge generation through the application of Artificial Intelligence (AI) and Intelligent Agents (IA) for UAV control. The applications of several different types of AI techniques for flight are explored during this research effort. The research concentration is directed to the application of different AI methods within the UAV arena. By evaluating AI and biological system approaches, which include Expert Systems, Neural Networks, Intelligent Agents, Fuzzy Logic, and Complex Adaptive Systems, new insight may be gained into the benefits of AI and CAS techniques applied to achieving true autonomous operation of these systems. Although flight systems were explored, the benefits should apply to many unmanned vehicles such as rovers, ocean explorers, robots, and autonomous operation systems. A portion of the flight system is broken down into control agents that represent the intelligent agent approach used in AI. After the completion of a successful approach, a framework for applying an intelligent agent is presented. The initial results from simulation of a security agent for communication are presented.

  4. Formal logic rewrite system bachelor in teaching mathematical informatics

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). The system Bachelor enables a constructivist approach to teaching and may therefore enhance the learning process in essential disciplines of hard informatics. It provides not only a detailed description of the formal rewrite process but also a demonstration of the algorithmic principles behind manipulations of logic formulae.
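
    As a toy illustration of the kind of formal rewrite step such a system makes explicit, the sketch below applies one De Morgan rule to a formula tree; it is not Bachelor's actual engine or syntax:

    ```python
    # One rewrite rule over a propositional formula tree.
    def rewrite(node):
        """Push negation through a disjunction: ~(A | B) -> (~A & ~B)."""
        if isinstance(node, tuple) and node[0] == "not":
            inner = node[1]
            if isinstance(inner, tuple) and inner[0] == "or":
                return ("and", ("not", rewrite(inner[1])), ("not", rewrite(inner[2])))
        if isinstance(node, tuple):
            return tuple(rewrite(c) if isinstance(c, tuple) else c for c in node)
        return node

    formula = ("not", ("or", "p", ("not", "q")))
    print(rewrite(formula))   # ('and', ('not', 'p'), ('not', ('not', 'q')))
    ```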

  5. A Harmonized Data Quality Assessment Terminology and Framework for the Secondary Use of Electronic Health Record Data

    PubMed Central

    Kahn, Michael G.; Callahan, Tiffany J.; Barnard, Juliana; Bauck, Alan E.; Brown, Jeff; Davidson, Bruce N.; Estiri, Hossein; Goerg, Carsten; Holve, Erin; Johnson, Steven G.; Liaw, Siaw-Teng; Hamilton-Lopez, Marianne; Meeker, Daniella; Ong, Toan C.; Ryan, Patrick; Shang, Ning; Weiskopf, Nicole G.; Weng, Chunhua; Zozus, Meredith N.; Schilling, Lisa

    2016-01-01

    Objective: Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is ‘fit’ for specific uses. Materials and Methods: DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework’s inclusiveness was evaluated against ten published DQ terminologies. Results: Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance (2) Completeness and (3) Plausibility and two DQ assessment contexts: (1) Verification and (2) Validation. Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data, or validated against an accepted gold standard, depending on proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning to multiple published DQ terminologies. Discussion: Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. Conclusion: A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods. PMID:27713905

  6. Model Checking Temporal Logic Formulas Using Sticker Automata

    PubMed Central

    Feng, Changwei; Wu, Huanmei

    2017-01-01

    As an important complex problem, the temporal logic model checking problem is still far from being fully resolved under the circumstances of DNA computing, especially for Computation Tree Logic (CTL), Interval Temporal Logic (ITL), and Projection Temporal Logic (PTL), because there is still a lack of approaches to DNA model checking. To address this challenge, a model checking method is proposed for checking the basic formulas of the above three temporal logic types with DNA molecules. First, single-stranded DNA molecules of one type are employed to encode the Finite State Automaton (FSA) model of the given basic formula, so that a sticker automaton is obtained. Single-stranded DNA molecules of another type are employed to encode the given system model, so that the input strings of the sticker automaton are obtained. Next, a series of biochemical reactions are conducted between the above two types of single-stranded DNA molecules, after which it can be decided whether or not the system satisfies the formula. As a result, we have developed a DNA-based approach for checking all the basic formulas of CTL, ITL, and PTL. The simulated results demonstrate the effectiveness of the new method. PMID:29119114
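
    A conventional, in-silico analogue of the procedure described above: compile the formula to a finite automaton, treat the system's behaviours as input strings, and let acceptance decide satisfaction. The automaton and traces below are invented toy examples:

    ```python
    # Automata-based checking: run the system's traces through the formula's FSA.
    def accepts(transitions, start, accepting, word):
        state = start
        for symbol in word:
            state = transitions.get((state, symbol))
            if state is None:
                return False
        return state in accepting

    # Automaton for "every request is eventually followed by a grant" over
    # finite words on the alphabet {r, g, n} (r=request, g=grant, n=neither).
    transitions = {
        ("ok", "n"): "ok", ("ok", "g"): "ok", ("ok", "r"): "pending",
        ("pending", "n"): "pending", ("pending", "r"): "pending",
        ("pending", "g"): "ok",
    }
    system_traces = ["nrgn", "rng", "rn"]   # finite behaviours of the model
    for trace in system_traces:
        print(trace, accepts(transitions, "ok", {"ok"}, trace))
    # The last trace leaves a request pending, so the property fails there.
    ```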

  7. Gene Function Hypotheses for the Campylobacter jejuni Glycome Generated by a Logic-Based Approach

    PubMed Central

    Sternberg, Michael J.E.; Tamaddoni-Nezhad, Alireza; Lesk, Victor I.; Kay, Emily; Hitchen, Paul G.; Cootes, Adrian; van Alphen, Lieke B.; Lamoureux, Marc P.; Jarrell, Harold C.; Rawlings, Christopher J.; Soo, Evelyn C.; Szymanski, Christine M.; Dell, Anne; Wren, Brendan W.; Muggleton, Stephen H.

    2013-01-01

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning—the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. PMID:23103756

  8. Gene function hypotheses for the Campylobacter jejuni glycome generated by a logic-based approach.

    PubMed

    Sternberg, Michael J E; Tamaddoni-Nezhad, Alireza; Lesk, Victor I; Kay, Emily; Hitchen, Paul G; Cootes, Adrian; van Alphen, Lieke B; Lamoureux, Marc P; Jarrell, Harold C; Rawlings, Christopher J; Soo, Evelyn C; Szymanski, Christine M; Dell, Anne; Wren, Brendan W; Muggleton, Stephen H

    2013-01-09

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning-the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Time-space modal logic for verification of bit-slice circuits

    NASA Astrophysics Data System (ADS)

    Hiraishi, Hiromi

    1996-03-01

    The major goal of this paper is to propose a new modal logic aimed at the formal verification of bit-slice circuits. The new logic is called time-space modal logic, and its major feature is that it can handle two transition relations: one for time transitions and the other for space transitions. As for a verification algorithm, a symbolic model checking algorithm for the new logic is shown. This could be applicable to the verification of a bit-slice microprocessor of infinite bit width and a 1D systolic array of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.

  10. On the formalization and reuse of scientific research.

    PubMed

    King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N

    2011-10-07

    The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.

  11. Instantons in Self-Organizing Logic Gates

    NASA Astrophysics Data System (ADS)

    Bearden, Sean R. B.; Manukian, Haik; Traversa, Fabio L.; Di Ventra, Massimiliano

    2018-03-01

    Self-organizing logic is a recently suggested framework that allows the solution of Boolean truth tables "in reverse"; i.e., it is able to satisfy the logical proposition of gates regardless of which terminal(s) the truth value is assigned ("terminal-agnostic logic"). It can be realized if time nonlocality (memory) is present. A practical realization of self-organizing logic gates (SOLGs) can be done by combining circuit elements with and without memory. By employing one such realization, we show, numerically, that SOLGs exploit elementary instantons to reach equilibrium points. Instantons are classical trajectories of the nonlinear equations of motion describing SOLGs and connect topologically distinct critical points in the phase space. By linear analysis at those points, we show that these instantons connect the initial critical point of the dynamics, with at least one unstable direction, directly to the final fixed point. We also show that the memory content of these gates affects only the relaxation time to reach the logically consistent solution. Finally, we demonstrate, by solving the corresponding stochastic differential equations, that, since instantons connect critical points, noise and perturbations may change the instanton trajectory in the phase space but not the initial and final critical points. Therefore, even for extremely large noise levels, the gates self-organize to the correct solution. Our work provides a physical understanding of, and can serve as an inspiration for, models of bidirectional logic gates that are emerging as important tools in physics-inspired, unconventional computing.

  12. Development of a Clinical Framework for Mirror Therapy in Patients with Phantom Limb Pain: An Evidence-based Practice Approach.

    PubMed

    Rothgangel, Andreas; Braun, Susy; de Witte, Luc; Beurskens, Anna; Smeets, Rob

    2016-04-01

    To describe the development and content of a clinical framework for mirror therapy (MT) in patients with phantom limb pain (PLP) following amputation. Based on an a priori formulated theoretical model, 3 sources of data collection were used to develop the clinical framework. First, a review of the literature took place on important clinical aspects and the evidence on the effectiveness of MT in patients with phantom limb pain. In addition, questionnaires and semi-structured interviews were used to analyze clinical experiences and preferences of physical and occupational therapists and patients suffering from PLP regarding the application of MT. All data were finally clustered into main and subcategories and were used to complement and refine the theoretical model. For every main category of the a priori formulated theoretical model, several subcategories emerged from the literature search and the patient and therapist interviews. Based on these categories, we developed a clinical flowchart that incorporates the main and subcategories in a logical way according to the phases in methodical intervention defined by the Royal Dutch Society for Physical Therapy. In addition, we developed a comprehensive booklet that illustrates the individual steps of the clinical flowchart. In this study, a structured clinical framework for the application of MT in patients with PLP was developed. This framework is currently being tested for its effectiveness in a multicenter randomized controlled trial. © 2015 World Institute of Pain.

  13. A HYPOTHESIS-DRIVEN FRAMEWORK FOR ASSESSING ...

    EPA Pesticide Factsheets

    Understanding how climate change will alter the availability of coastal final ecosystem goods and services (FEGS; such as food provisioning from fisheries, property protection, and recreation) has significant implications for coastal planning and the development of adaptive management strategies to maximize sustainability of natural resources. The dynamic social and physical settings of these important resources mean that there is not a “one-size-fits-all” model to predict the specific changes in coastal FEGS that will occur as a result of climate change. Instead, we propose a hypothesis-driven approach that builds on available literature to understand the likely effects of climate change on FEGS across coastal regions of the United States. We present an analysis for three FEGS: food provisioning from fisheries, recreation, and property protection. Hypotheses were restricted to changes precipitated by four prominent climate stressors projected in coastal areas: 1) sea-level rise, 2) ocean acidification, 3) increased temperatures, and 4) intensification of coastal storms. Our approach identified links between these stressors and the ecological processes that produce the FEGS, with the capacity to incorporate regional differences in FEGS availability. Linkages were first presented in a logic model to conceptualize the framework. For each region, we developed hypotheses regarding the effects of climate stressors on FEGS by examining case studies.

  14. Conceptual and logical level of database modeling

    NASA Astrophysics Data System (ADS)

    Hunka, Frantisek; Matula, Jiri

    2016-06-01

    Conceptual and logical levels form the topmost levels of database modeling. Usually, ORM (Object Role Modeling) and ER diagrams are utilized to capture the corresponding schema. The final aim of business process modeling is to store its results in the form of a database solution. For this reason, value-oriented business process modeling, which utilizes ER diagrams to express the modeled entities and the relationships between them, is used. However, ER diagrams form the logical level of the database schema. To extend the possibilities of different business process modeling methodologies, the conceptual level of database modeling is needed. The paper deals with the REA value modeling approach to business process modeling using ER diagrams, and derives a conceptual model utilizing the ORM modeling approach. The conceptual model extends the possibilities for value modeling to other business modeling approaches.

  15. Ontology-Based Approach to Social Data Sentiment Analysis: Detection of Adolescent Depression Signals.

    PubMed

    Jung, Hyesil; Park, Hyeoun-Ae; Song, Tae-Min

    2017-07-24

    Social networking services (SNSs) contain abundant information about the feelings, thoughts, interests, and patterns of behavior of adolescents that can be obtained by analyzing SNS postings. An ontology that expresses the shared concepts and their relationships in a specific field could be used as a semantic framework for social media data analytics. The aim of this study was to refine an adolescent depression ontology and terminology as a framework for analyzing social media data and to evaluate description logics between classes and the applicability of this ontology to sentiment analysis. The domain and scope of the ontology were defined using competency questions. The concepts constituting the ontology and terminology were collected from clinical practice guidelines, the literature, and social media postings on adolescent depression. Class concepts, their hierarchy, and the relationships among class concepts were defined. An internal structure of the ontology was designed using the entity-attribute-value (EAV) triplet data model, and superclasses of the ontology were aligned with the upper ontology. Description logics between classes were evaluated by mapping concepts extracted from the answers to frequently asked questions (FAQs) onto the ontology concepts derived from description logic queries. The applicability of the ontology was validated by examining the representability of 1358 sentiment phrases using the ontology EAV model and conducting sentiment analyses of social media data using ontology class concepts. We developed an adolescent depression ontology that comprised 443 classes and 60 relationships among the classes; the terminology comprised 1682 synonyms of the 443 classes. In the description logics test, no error in relationships between classes was found, and about 89% (55/62) of the concepts cited in the answers to FAQs mapped onto the ontology class. Regarding applicability, the EAV triplet models of the ontology class represented about 91.4% of the sentiment phrases included in the sentiment dictionary. In the sentiment analyses, "academic stresses" and "suicide" contributed negatively to the sentiment of adolescent depression. The ontology and terminology developed in this study provide a semantic foundation for analyzing social media data on adolescent depression. To be useful in social media data analysis, the ontology, especially the terminology, needs to be updated constantly to reflect rapidly changing terms used by adolescents in social media postings. In addition, more attributes and value sets reflecting depression-related sentiments should be added to the ontology. ©Hyesil Jung, Hyeoun-Ae Park, Tae-Min Song. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 24.07.2017.
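
    A minimal sketch of the entity-attribute-value (EAV) triplet representation of a sentiment phrase (class and phrase names are invented, not drawn from the published ontology):

        from collections import namedtuple

        Triplet = namedtuple("Triplet", ["entity", "attribute", "value"])

        # "I feel so hopeless about exams" might decompose into two triplets:
        phrase_model = [
            Triplet("Adolescent", "hasEmotion", "Hopelessness"),
            Triplet("Adolescent", "hasStressor", "AcademicStress"),
        ]

        # A crude negative-sentiment count against a hypothetical value set:
        negative_values = {"Hopelessness", "AcademicStress", "SuicidalIdeation"}
        score = -sum(t.value in negative_values for t in phrase_model)
        print(score)   # -> -2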

  16. Ontology-Based Approach to Social Data Sentiment Analysis: Detection of Adolescent Depression Signals

    PubMed Central

    Jung, Hyesil; Song, Tae-Min

    2017-01-01

    Background Social networking services (SNSs) contain abundant information about the feelings, thoughts, interests, and patterns of behavior of adolescents that can be obtained by analyzing SNS postings. An ontology that expresses the shared concepts and their relationships in a specific field could be used as a semantic framework for social media data analytics. Objective The aim of this study was to refine an adolescent depression ontology and terminology as a framework for analyzing social media data and to evaluate description logics between classes and the applicability of this ontology to sentiment analysis. Methods The domain and scope of the ontology were defined using competency questions. The concepts constituting the ontology and terminology were collected from clinical practice guidelines, the literature, and social media postings on adolescent depression. Class concepts, their hierarchy, and the relationships among class concepts were defined. An internal structure of the ontology was designed using the entity-attribute-value (EAV) triplet data model, and superclasses of the ontology were aligned with the upper ontology. Description logics between classes were evaluated by mapping concepts extracted from the answers to frequently asked questions (FAQs) onto the ontology concepts derived from description logic queries. The applicability of the ontology was validated by examining the representability of 1358 sentiment phrases using the ontology EAV model and conducting sentiment analyses of social media data using ontology class concepts. Results We developed an adolescent depression ontology that comprised 443 classes and 60 relationships among the classes; the terminology comprised 1682 synonyms of the 443 classes. In the description logics test, no error in relationships between classes was found, and about 89% (55/62) of the concepts cited in the answers to FAQs mapped onto the ontology class. Regarding applicability, the EAV triplet models of the ontology class represented about 91.4% of the sentiment phrases included in the sentiment dictionary. In the sentiment analyses, “academic stresses” and “suicide” contributed negatively to the sentiment of adolescent depression. Conclusions The ontology and terminology developed in this study provide a semantic foundation for analyzing social media data on adolescent depression. To be useful in social media data analysis, the ontology, especially the terminology, needs to be updated constantly to reflect rapidly changing terms used by adolescents in social media postings. In addition, more attributes and value sets reflecting depression-related sentiments should be added to the ontology. PMID:28739560

  17. Compatible quantum theory

    NASA Astrophysics Data System (ADS)

    Friedberg, R.; Hohenberg, P. C.

    2014-09-01

    Formulations of quantum mechanics (QM) can be characterized as realistic, operationalist, or a combination of the two. In this paper a realistic theory is defined as describing a closed system entirely by means of entities and concepts pertaining to the system. An operationalist theory, on the other hand, requires in addition entities external to the system. A realistic formulation comprises an ontology, the set of (mathematical) entities that describe the system, and assertions, the set of correct statements (predictions) the theory makes about the objects in the ontology. Classical mechanics is the prime example of a realistic physical theory. A straightforward generalization of classical mechanics to QM is hampered by the inconsistency of quantum properties with classical logic, a circumstance that was noted many years ago by Birkhoff and von Neumann. The present realistic formulation of the histories approach originally introduced by Griffiths, which we call ‘compatible quantum theory (CQT)’, consists of a ‘microscopic’ part (MIQM), which applies to a closed quantum system of any size, and a ‘macroscopic’ part (MAQM), which requires the participation of a large (ideally, an infinite) system. The first (MIQM) can be fully formulated based solely on the assumption of a Hilbert space ontology and the noncontextuality of probability values, relying in an essential way on Gleason's theorem and on an application to dynamics due in large part to Nistico. Thus, the present formulation, in contrast to earlier ones, derives the Born probability formulas and the consistency (decoherence) conditions for frameworks. The microscopic theory does not, however, possess a unique corpus of assertions, but rather a multiplicity of contextual truths (‘c-truths’), each one associated with a different framework. This circumstance leads us to consider the microscopic theory to be physically indeterminate and therefore incomplete, though logically coherent. The completion of the theory requires a macroscopic mechanism for selecting a physical framework, which is part of the macroscopic theory (MAQM). The selection of a physical framework involves the breaking of the microscopic ‘framework symmetry’, which can proceed either phenomenologically as in the standard quantum measurement theory, or more fundamentally by considering the quantum system under study to be a subsystem of a macroscopic quantum system. The decoherent histories formulations of Gell-Mann and Hartle, and of Omnès, are theories of this fundamental type, where the physical framework is selected by a coarse-graining procedure in which the physical phenomenon of decoherence plays an essential role. Various well-known interpretations of QM are described from the perspective of CQT. Detailed definitions and proofs are presented in the appendices.

  18. F-OWL: An Inference Engine for Semantic Web

    NASA Technical Reports Server (NTRS)

    Zou, Youyong; Finin, Tim; Chen, Harry

    2004-01-01

    Understanding and using the data and knowledge encoded in semantic web documents requires an inference engine. F-OWL is an inference engine for the semantic web language OWL, based on F-logic, an approach to defining frame-based systems in logic. F-OWL is implemented using XSB and Flora-2 and takes full advantage of their features. We describe how F-OWL computes ontology entailment and compare it with other description logic based approaches. We also describe TAGA, a trading agent environment that we have used as a test bed for F-OWL and to explore how multiagent systems can use semantic web concepts and technology.

  19. Federal Highway Administration research and technology evaluation final report : Eco-Logical

    DOT National Transportation Integrated Search

    2018-03-01

    This report documents an evaluation of the Federal Highway Administration's (FHWA) Research and Technology Program's activities on the implementation of the Eco-Logical approach by State transportation departments and metropolitan planning organizati...

  20. Quantifying heterogeneity attributable to polythetic diagnostic criteria: theoretical framework and empirical application.

    PubMed

    Olbert, Charles M; Gala, Gary J; Tupler, Larry A

    2014-05-01

    Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
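
    The combinatorial point can be made concrete for a hypothetical "at least 5 of 9 symptoms" criterion (the shape of the DSM major depressive episode criterion). A pure 5-of-9 rule forces at least one shared symptom by pigeonhole; the zero-overlap pairs reported above arise because compound symptom criteria can be satisfied in non-overlapping ways:

        from itertools import combinations
        from math import comb

        n, k = 9, 5

        # Number of distinct qualifying symptom profiles (ignoring severity/duration):
        profiles = sum(comb(n, m) for m in range(k, n + 1))
        print(profiles)   # -> 256

        # Minimum possible overlap between two diagnosed individuals: two
        # 5-element subsets of a 9-element set must share at least 5 + 5 - 9 = 1 symptom.
        min_overlap = min(len(set(a) & set(b))
                          for a in combinations(range(n), k)
                          for b in combinations(range(n), k))
        print(min_overlap)   # -> 1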

  1. Research and Evaluations of the Health Aspects of Disasters, Part VI: Interventional Research and the Disaster Logic Model.

    PubMed

    Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Kushner, Jennifer

    2016-04-01

    Disaster-related interventions are actions or responses undertaken during any phase of a disaster to change the current status of an affected community or a Societal System. Interventional disaster research aims to evaluate the results of such interventions in order to develop standards and best practices in Disaster Health that can be applied to disaster risk reduction. Considering interventions as production functions (transformation processes) structures the analyses and cataloguing of interventions/responses that are implemented prior to, during, or following a disaster or other emergency. Since currently it is not possible to do randomized, controlled studies of disasters, in order to validate the derived standards and best practices, the results of the studies must be compared and synthesized with results from other studies (ie, systematic reviews). Such reviews will be facilitated by the selected studies being structured using accepted frameworks. A logic model is a graphic representation of the transformation processes of a program [project] that shows the intended relationships between investments and results. Logic models are used to describe a program and its theory of change, and they provide a method for analyzing and evaluating interventions. The Disaster Logic Model (DLM) is an adaptation of a logic model used for the evaluation of educational programs and provides the structure required for the analysis of disaster-related interventions. It incorporates a definition of the current functional status of a community or Societal System, identification of needs, definition of goals, selection of objectives, implementation of the intervention(s), and evaluation of the effects, outcomes, costs, and impacts of the interventions. It is useful for determining the value of an intervention, and it also provides the structure for analyzing the processes used in providing the intervention according to the Relief/Recovery and Risk-Reduction Frameworks.

  2. Binary logic based purely on Fresnel diffraction

    NASA Astrophysics Data System (ADS)

    Hamam, H.; de Bougrenet de La Tocnaye, J. L.

    1995-09-01

    Binary logic operations on two-dimensional data arrays are achieved by use of the self-imaging properties of Fresnel diffraction. The fields diffracted by periodic objects can be considered as the superimposition of weighted and shifted replicas of original objects. We show that a particular spatial organization of the input data can result in logical operations being performed on these data in the considered diffraction planes. Among various advantages, this approach is shown to allow the implementation of dual-track, nondissipative logical operators. Image algebra is presented as an experimental illustration of this principle.

  3. A qualitative study of participants' views on re-consent in a longitudinal biobank.

    PubMed

    Dixon-Woods, Mary; Kocman, David; Brewster, Liz; Willars, Janet; Laurie, Graeme; Tarrant, Carolyn

    2017-03-23

    Biomedical research increasingly relies on long-term studies involving use and re-use of biological samples and data stored in large repositories or "biobanks" over lengthy periods, often raising questions about whether and when a re-consenting process should be activated. We sought to investigate the views on re-consent of participants in a longitudinal biobank. We conducted a qualitative study involving interviews with 24 people who were participating in a longitudinal biobank. Their views were elicited using a semi-structured interview schedule and scenarios based on a hypothetical biobank. Data analysis was based on the constant comparative method. What participants identified as requiring new consent was not a straightforward matter predictable by algorithms about the scope of the consent, but instead was contingent. They assessed whether proposed new research implied a fundamental alteration in the underlying character of the biobank and whether specific projects were within the scope of the original consent. What mattered most to them was that the cooperative bargain into which they had entered was maintained in good faith. They saw re-consent as one important safeguard in this bargain. In determining what required re-consent, they deployed two logics. First, they used a logic of boundaries, where they sought to detect any possible rupture with their existing framework of cooperation. Second, they used a logic of risk, where they assessed proposed research for any potential threats for them personally or the research endeavour. When they judged that a need for re-consent had been activated, participants saw the process as a way of re-actualising and renewing the cooperative bargain. Participants' perceptions of research as a process of mutual co-operation between volunteer and researcher were fundamental to their views on consent. Consenting arrangements for biobanks should respect the cooperative values that are important to participants, recognise the two logics used by research volunteers, and avoid rigidity. Agility may be favoured by tiered consent combined with strong oversight mechanisms; this approach requires evaluation.

  4. Supervised Learning in CINets

    DTIC Science & Technology

    2011-07-01

    supervised learning process is compared to that of Artificial Neural Network (ANN), fuzzy logic rule set, and Bayesian network approaches... of both fuzzy logic systems and Artificial Neural Networks (ANNs). Like fuzzy logic systems, the CINet technique allows the use of human-intuitive... fuzzy rule systems [3]. CINets also maintain features common to both fuzzy systems and ANNs. The technique can be shown to possess the property

  5. Heat exchanger expert system logic

    NASA Technical Reports Server (NTRS)

    Cormier, R.

    1988-01-01

    The reduction of the operation and fault diagnostics of a Deep Space Network heat exchanger to a rule base, by the application of propositional calculus to a set of logic statements, is described. The value of this approach lies in the ease of converting the logic and subsequently implementing it on a computer as an expert system. The rule base was written in Process Intelligent Control software.
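
    A minimal forward-chaining sketch of a propositional rule base of this general kind (the facts and rules are invented stand-ins, not the DSN rule base):

        # Each rule: if all premises hold, conclude the consequent.
        rules = [
            ({"pump_on", "low_flow"}, "blocked_filter"),
            ({"blocked_filter"}, "schedule_maintenance"),
        ]
        facts = {"pump_on", "low_flow"}

        changed = True
        while changed:                     # fire rules until a fixed point is reached
            changed = False
            for premises, conclusion in rules:
                if premises <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True

        print(facts)   # includes 'blocked_filter' and 'schedule_maintenance'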

  6. Designing a Software Tool for Fuzzy Logic Programming

    NASA Astrophysics Data System (ADS)

    Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés

    2007-12-01

    Fuzzy Logic Programming is an interesting and still growing research area that agglutinates the efforts for introducing fuzzy logic into logic programming (LP), in order to incorporate more expressive resources on such languages for dealing with uncertainty and approximated reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work, we describe a prototype system which is able to directly translate fuzzy logic programs into Prolog code in order to safely execute these residual programs inside any standard Prolog interpreter, in a completely transparent way for the end user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.
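
    A minimal sketch of evaluating a weighted fuzzy rule with truth degrees in [0, 1], here with the product t-norm (predicate names, degrees and the rule weight are illustrative; multi-adjoint programs allow other adjoint pairs):

        facts = {"young(mary)": 0.9, "smart(mary)": 0.7}

        def t_norm(a, b):          # product t-norm; one of several possible choices
            return a * b

        # Rule: good_candidate(X) <- young(X) AND smart(X), with rule weight 0.8
        rule_weight = 0.8
        body = t_norm(facts["young(mary)"], facts["smart(mary)"])
        print(round(t_norm(rule_weight, body), 3))   # -> 0.504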

  7. Institutional logic in self-management support: coexistence and diversity.

    PubMed

    Bossy, Dagmara; Knutsen, Ingrid Ruud; Rogers, Anne; Foss, Christina

    2016-11-01

    The prevalence of chronic conditions in Europe has been the subject of health-political reforms that have increasingly targeted collaboration between public, private and voluntary organisations for the purpose of supporting self-management of long-term diseases. The international literature describes collaboration across sectors as challenging, which implies that their respective logics are conflicting or incompatible. In line with the European context, recent Norwegian health policy advocates inter-sectorial partnerships. The aim of this policy is to create networks supporting better self-management for people with chronic conditions. The purpose of our qualitative study was to map different understandings of self-management support in private for-profit, volunteer and public organisations. These organisations are seen as potential self-management support networks for individuals with chronic conditions in Norway. From December 2012 to April 2013, we conducted 50 semi-structured interviews with representatives from relevant health and well-being organisations in different parts of Norway. According to the theoretical framework of institutional logic, representatives' statements are embedded with organisational understandings. In the analysis, we systematically assessed the representatives' different understandings of self-management support. The institutional logic we identified revealed traits of organisational historical backgrounds and transitions in understanding. We found that the merging of individualism and fellowship in contemporary health policy generates different types of logic in different organisational contexts. The private for-profit organisations were concerned with the logic of a healthy appearance and mindset, whereas the private non-profit organisations emphasised fellowship and moral responsibility. Finally, the public, illness-oriented organisations tended to highlight individual conditions for illness management. Different types of logic may attract different users, and simultaneously, a diversity of logic types may challenge collaboration at the user's expense. Institutional logic embeds moral implications, implying a change towards individual responsibility for disease. Policy makers ought to consider the complexities of logic in order to tailor the different needs of users. © 2015 John Wiley & Sons Ltd.

  8. A psychometric evaluation of the digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.

    2014-10-01

    Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
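
    For reference, the two-parameter logistic model commonly used in Item Response Theory (a standard formula, not one quoted from the paper) gives the probability that an examinee of ability θ answers item i correctly as

        P_i(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}

    where a_i is the item discrimination and b_i the item difficulty; fitting these parameters per item is what allows an instrument like the DLCI to be judged on how much information it provides at each ability level.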

  9. Deterministic seismic hazard macrozonation of India

    NASA Astrophysics Data System (ADS)

    Kolathayar, Sreevalsa; Sitharam, T. G.; Vipin, K. S.

    2012-10-01

    Earthquakes are known to have occurred in the Indian subcontinent from ancient times. This paper presents the results of seismic hazard analysis of India (6°-38°N and 68°-98°E) based on the deterministic approach using the latest seismicity data (up to 2010). The hazard analysis was done using two different source models (linear sources and point sources) and 12 well recognized attenuation relations considering the varied tectonic provinces in the region. The earthquake data obtained from different sources were homogenized and declustered, and a total of 27,146 earthquakes of moment magnitude 4 and above were listed in the study area. The seismotectonic map of the study area was prepared by considering the faults, lineaments and the shear zones which are associated with earthquakes of magnitude 4 and above. A new program was developed in MATLAB for smoothing of the point sources. For assessing the seismic hazard, the study area was divided into small grids of size 0.1° × 0.1° (approximately 10 × 10 km), and the hazard parameters were calculated at the center of each of these grid cells by considering all the seismic sources within a radius of 300 to 400 km. Rock level peak horizontal acceleration (PHA) and spectral accelerations for periods 0.1 and 1 s have been calculated for all the grid points with a deterministic approach using a code written in MATLAB. Epistemic uncertainty in hazard definition has been tackled within a logic-tree framework considering two types of sources and three attenuation models for each grid point. The hazard evaluation has also been done without the logic tree approach for comparison of the results. The contour maps showing the spatial variation of hazard values are presented in the paper.
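
    A minimal sketch of the logic-tree combination of epistemic branches (the branch weights and PHA values below are hypothetical, not the study's):

        # Each branch: (source model, attenuation relation, weight, PHA estimate in g)
        branches = [
            ("linear", "atten_1", 1/6, 0.12),
            ("linear", "atten_2", 1/6, 0.15),
            ("linear", "atten_3", 1/6, 0.10),
            ("point",  "atten_1", 1/6, 0.14),
            ("point",  "atten_2", 1/6, 0.16),
            ("point",  "atten_3", 1/6, 0.11),
        ]
        assert abs(sum(b[2] for b in branches) - 1.0) < 1e-9   # weights must sum to 1

        weighted_pha = sum(b[2] * b[3] for b in branches)
        print(round(weighted_pha, 4))   # weighted-mean PHA at one grid point -> 0.13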

  10. Toward a phenomenology of trance logic in posttraumatic stress disorder.

    PubMed

    Beshai, J A

    2004-04-01

    Some induction procedures result in trance logic as an essential feature of hypnosis. Trance logic is a voluntary state of acceptance of suggestions without the critical evaluation that would destroy the validity of the meaningfulness of the suggestion. Induction procedures in real and simulated conditions induce a conflict between two contradictory messages in experimental hypnosis. In military induction the conflict is much more subtle, involving society's need for security and its need for ethics. Such conflicts are often construed by the subject as trance logic. Trance logic provides an opportunity for therapists using the phenomenology of "presence" to deal with the objectified concepts of "avoidance" and "numbing" implicit in this kind of dysfunctional thinking in Posttraumatic Stress Disorder. An individual phenomenology of induction procedures and suggestions, which trigger trance logic, may lead to a resolution of logical fallacies and recurring painful memories. It invites a reconciliation of conflicting messages implicit in phobias and avoidance traumas. Such a phenomenological analysis of trance logic may well be a novel approach to restructuring the meaning of trauma.

  11. Think Pair Share Using Realistic Mathematics Education Approach in Geometry Learning

    NASA Astrophysics Data System (ADS)

    Afthina, H.; Mardiyana; Pramudya, I.

    2017-09-01

    This research aims to determine the impact of mathematics learning applying Think Pair Share (TPS) with the Realistic Mathematics Education (RME) approach, viewed from mathematical-logical intelligence, in geometry learning. The method used in this research is quasi-experimental research. The results of this research show that (1) mathematics achievement applying TPS with the RME approach gives a better result than applying the direct learning model; (2) students with high mathematical-logical intelligence reach better mathematics achievement than those with average or low intelligence, whereas students with average mathematical-logical intelligence reach better achievement than those with low intelligence; (3) there is no interaction between the learning model and the level of students’ mathematical-logical intelligence in mathematics achievement. The implication of this research is that the TPS model with the RME approach can be applied in mathematics learning so that students learn more actively, understand the material better, and mathematics learning becomes more meaningful. On the other hand, internal factors of students must be considered for the success of students’ mathematical achievement, particularly with geometry material.

  12. The motor theory of speech perception revisited.

    PubMed

    Massaro, Dominic W; Chen, Trevor H

    2008-04-01

    Galantucci, Fowler, and Turvey (2006) have claimed that perceiving speech is perceiving gestures and that the motor system is recruited for perceiving speech. We make the counterargument that perceiving speech is not perceiving gestures, that the motor system is not recruited for perceiving speech, and that speech perception can be adequately described by a prototypical pattern recognition model, the fuzzy logical model of perception (FLMP). Empirical evidence taken as support for gesture and motor theory is reconsidered in more detail and in the framework of the FLMP. Additional theoretical and logical arguments are made to challenge gesture and motor theory.

  13. A framework for qualitative reasoning about solid objects

    NASA Technical Reports Server (NTRS)

    Davis, E.

    1987-01-01

    Predicting the behavior of a qualitatively described system of solid objects requires a combination of geometrical, temporal, and physical reasoning. Methods based upon formulating and solving differential equations are not adequate for robust prediction, since the behavior of a system over extended time may be much simpler than its behavior over local time. A first-order logic, in which one can state simple physical problems and derive their solution deductively, without recourse to solving the differential equations, is discussed. This logic is substantially more expressive and powerful than any previous AI representational system in this domain.

  14. Evidence that logical reasoning depends on conscious processing.

    PubMed

    DeWall, C Nathan; Baumeister, Roy F; Masicampo, E J

    2008-09-01

    Humans, unlike other animals, are equipped with a powerful brain that permits conscious awareness and reflection. A growing trend in psychological science has questioned the benefits of consciousness, however. Testing a hypothesis advanced by [Lieberman, M. D., Gaunt, R., Gilbert, D. T., & Trope, Y. (2002). Reflection and reflexion: A social cognitive neuroscience approach to attributional inference. Advances in Experimental Social Psychology, 34, 199-249], four studies suggested that the conscious, reflective processing system is vital for logical reasoning. Substantial decrements in logical reasoning were found when a cognitive load manipulation preoccupied conscious processing, while hampering the nonconscious system with consciously suppressed thoughts failed to impair reasoning (Experiment 1). Nonconscious activation (priming) of the idea of logical reasoning increased the activation of logic-relevant concepts, but failed to improve logical reasoning performance (Experiments 2a-2c) unless the logical conclusions were largely intuitive and thus not reliant on logical reasoning (Experiment 3). Meanwhile, stimulating the conscious goal of reasoning well led to improvements in reasoning performance (Experiment 4). These findings offer evidence that logical reasoning is aided by the conscious, reflective processing system.

  15. Fuzzy logic in autonomous orbital operations

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Jani, Yashvant

    1991-01-01

    Fuzzy logic can be used advantageously in autonomous orbital operations that require the capability of handling imprecise measurements from sensors. Several applications are underway to investigate fuzzy logic approaches and develop guidance and control algorithms for autonomous orbital operations. Translational as well as rotational control of a spacecraft have been demonstrated using space shuttle simulations. An approach to a camera tracking system has been developed to support proximity operations and traffic management around the Space Station Freedom. Pattern recognition and object identification algorithms currently under development will become part of this camera system at an appropriate level in the future. A concept to control environment and life support systems for large Lunar based crew quarters is also under development. Investigations in the area of reinforcement learning, utilizing neural networks, combined with a fuzzy logic controller, are planned as a joint project with the Ames Research Center.

  16. Computational logic: its origins and applications.

    PubMed

    Paulson, Lawrence C

    2018-02-01

    Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.

  17. UML activity diagrams in requirements specification of logic controllers

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał

    2015-12-01

    Logic controller specification can be prepared using various techniques. One of them is the widely understandable and user-friendly UML language and its activity diagrams. Using formal methods during the design phase increases the assurance that the implemented system meets the project requirements. In our approach we use the model checking technique to formally verify a specification against user-defined behavioral requirements. The properties are usually defined as temporal logic formulas. In the paper we propose to use UML activity diagrams in requirements definition and then to formalize them as temporal logic formulas. As a result, UML activity diagrams can be used both for logic controller specification and for requirements definition, which simplifies the specification and verification process.
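
    As an illustration of the kind of temporal logic formula such requirements become (an invented property, not one from the paper), the requirement "every start request is eventually followed by the motor running" can be written in LTL as

        \mathbf{G}\,(start\_request \rightarrow \mathbf{F}\; motor\_on)

    where G means "globally" (in every state) and F means "eventually"; a model checker then verifies that every execution path of the controller specification satisfies the formula.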

  18. Paraconsistent Annotated Logic in Viability Analysis: an Approach to Product Launching

    NASA Astrophysics Data System (ADS)

    Romeu de Carvalho, Fábio; Brunstein, Israel; Abe, Jair Minoro

    2004-08-01

    In this paper we present an application of the Para-analyzer, a logical analyzer based on the Paraconsistent Annotated Logic Pτ introduced by Da Silva Filho and Abe, to decision-making systems. An example is analyzed in detail showing how uncertainty, inconsistency and paracompleteness can be elegantly handled with this logical system. As an application of the Para-analyzer in decision-making, we developed the Baricenter Analysis Method (BAM). In order to make the presentation easier, we present the BAM applied to the viability analysis of product launching. Some of the techniques of Paraconsistent Annotated Logic have been applied in Artificial Intelligence, Robotics, Information Technology (Computer Sciences), etc.
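
    A minimal sketch of the para-analyzer idea, in which an annotation (μ, λ) carries a degree of favorable evidence μ and of unfavorable evidence λ (the decision thresholds and example values are illustrative assumptions):

        def para_analyze(mu, lam):
            certainty     = mu - lam          # degree of certainty
            contradiction = mu + lam - 1      # degree of contradiction
            if certainty >= 0.5:
                return "viable"
            if certainty <= -0.5:
                return "not viable"
            if contradiction >= 0.5:
                return "inconsistent evidence"
            if contradiction <= -0.5:
                return "insufficient evidence"
            return "undecided"

        # Strong favorable, weak unfavorable evidence for launching a product:
        print(para_analyze(0.9, 0.2))   # -> 'viable'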

  19. Methods for identifying SNP interactions: a review on variations of Logic Regression, Random Forest and Bayesian logistic regression.

    PubMed

    Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula

    2011-01-01

    Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenges of new methods of statistical analysis and modeling. Considering a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and a Bayesian logistic regression with stochastic search variable selection.
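
    A minimal sketch of the core logic regression idea, in which a Boolean tree over binary SNP indicators enters a logistic model as a single predictor (the SNP combination and coefficients are invented):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(5, 3))      # rows: subjects; cols: SNP1..SNP3

        def logic_tree(x):
            # L = (SNP1 AND NOT SNP3) OR SNP2
            return (x[:, 0] & (1 - x[:, 2])) | x[:, 1]

        beta0, beta1 = -1.0, 2.0                  # hypothetical coefficients
        p = 1 / (1 + np.exp(-(beta0 + beta1 * logic_tree(X))))
        print(np.round(p, 3))                     # per-subject disease probability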

  20. Applications of fuzzy logic to control and decision making

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Jani, Yashvant

    1991-01-01

    Long range space missions will require high operational efficiency as well as autonomy to enhance the effectiveness of performance. Fuzzy logic technology has been shown to be powerful and robust in interpreting imprecise measurements and generating appropriate control decisions for many space operations. Several applications are underway, studying the fuzzy logic approach to solving control and decision making problems. Fuzzy logic algorithms for relative motion and attitude control have been developed and demonstrated for proximity operations. Based on this experience, motion control algorithms that include obstacle avoidance were developed for a Mars Rover prototype for maneuvering during the sample collection process. A concept of an intelligent sensor system that can identify objects and track them continuously and learn from its environment is under development to support traffic management and proximity operations around the Space Station Freedom. For safe and reliable operation of Lunar/Mars based crew quarters, high speed controllers with the ability to combine imprecise measurements from several sensors are required. A fuzzy logic approach that uses high speed fuzzy hardware chips is being studied.
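
    A minimal Mamdani-style sketch of the kind of fuzzy rule evaluation such controllers use (the membership functions, rules and braking scenario are invented):

        def near(d):  return max(0.0, 1.0 - d / 10.0)            # distance in metres
        def far(d):   return max(0.0, min(1.0, (d - 5.0) / 10.0))

        def braking_command(d):
            # Rule 1: IF near THEN brake hard (1.0); Rule 2: IF far THEN brake softly (0.1)
            w_near, w_far = near(d), far(d)
            # weighted-average defuzzification of the two singleton outputs
            return (w_near * 1.0 + w_far * 0.1) / (w_near + w_far)

        print(round(braking_command(7.0), 3))   # a mix of 'near' and 'far' -> 0.64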

  1. From complexity to reality: providing useful frameworks for defining systems of care.

    PubMed

    Levison-Johnson, Jody; Wenz-Gross, Melodie

    2010-02-01

    Because systems of care are not uniform across communities, there is a need to better document the process of system development, define the complexity, and describe the development of the structures, processes, and relationships within communities engaged in system transformation. By doing so, we begin to identify the necessary and sufficient components that, at minimum, move us from usual care within a naturally occurring system to a true system of care. Further, by documenting and measuring the degree to which key components are operating, we may be able to identify the most successful strategies in creating system reform. The theory of change and logic model offer a useful framework for communities to begin the adaptive work necessary to effect true transformation. Using the experience of two system of care communities, this new definition and the utility of a theory of change and logic model framework for defining local system transformation efforts will be discussed. Implications for the field, including the need to further examine the natural progression of systems change and to create quantifiable measures of transformation, will be raised as new challenges for the evolving system of care movement.

  2. Program logic: a framework for health program design and evaluation - the Pap nurse in general practice program.

    PubMed

    Hallinan, Christine M

    2010-01-01

    In this paper, program logic will be used to 'map out' the planning, development and evaluation of the general practice Pap nurse program in the Australian general practice arena. The incorporation of program logic into the evaluative process supports a greater appreciation of the theoretical assumptions and external influences that underpin general practice Pap nurse activity. The creation of a program logic model is a conscious strategy that results in an explicit understanding of the challenges ahead, the resources available and the time frames for outcomes. Program logic also enables recognition that all players in the general practice arena need to be acknowledged by policy makers, bureaucrats and program designers when addressing, through policy, issues relating to equity and accessibility of health initiatives. Logic modelling allows decision makers to consider the complexities of causal associations when developing health care proposals and programs. It enables the Pap nurse in general practice program to be represented diagrammatically by linking outcomes (short, medium and long term) with both the program activities and program assumptions. The research methodology used in the evaluation of the Pap nurse in general practice program includes a descriptive study design and the incorporation of program logic, with a retrospective analysis of Australian data from 2001 to 2009. For the purposes of gaining both empirical and contextual data for this paper, a data set analysis and literature review were performed. The application of program logic as an evaluative tool for analysis of the Pap PN incentive program facilitates a greater understanding of complex general practice activity triggers, and allows this greater understanding to be incorporated into policy to facilitate Pap PN activity, increase general practice cervical smears and ultimately decrease the burden of disease.

  3. Constraint Logic Programming approach to protein structure prediction.

    PubMed

    Dal Palù, Alessandro; Dovier, Agostino; Fogolari, Federico

    2004-11-30

    The protein structure prediction problem is one of the most challenging problems in biological sciences. Many approaches have been proposed using database information and/or simplified protein models. The protein structure prediction problem can be cast in the form of an optimization problem. Notwithstanding its importance, the problem has very seldom been tackled by Constraint Logic Programming, a declarative programming paradigm suitable for solving combinatorial optimization problems. Constraint Logic Programming techniques have been applied to the protein structure prediction problem on the face-centered cube lattice model. Molecular dynamics techniques, endowed with the notion of constraint, have also been exploited. Even using a very simplified model, Constraint Logic Programming on the face-centered cube lattice model allowed us to obtain acceptable results for a few small proteins. As a test implementation, their (known) secondary structure and the presence of disulfide bridges are used as constraints. Simplified structures obtained in this way have been converted to all-atom models with plausible structure. Results have been compared with a similar approach using a well-established technique, molecular dynamics. The results obtained on small proteins show that Constraint Logic Programming techniques can be employed for studying simplified protein models, which can be converted into realistic all-atom models. The advantage of Constraint Logic Programming over other, much more explored, methodologies resides in the rapid software prototyping, in the easy way of encoding heuristics, and in exploiting all the advances made in this research area, e.g. in constraint propagation and its use for pruning the huge search space.
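
    A toy constrain-and-search sketch on a cubic lattice (an HP-style simplification, far coarser than the paper's face-centered cube model and constraint set): residues are placed on a self-avoiding walk and the number of hydrophobic (H-H) contacts is maximized.

        seq = "HPHH"
        moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

        def search(path):
            if len(path) == len(seq):
                yield path
                return
            x, y, z = path[-1]
            for dx, dy, dz in moves:
                nxt = (x + dx, y + dy, z + dz)
                if nxt not in path:          # self-avoidance constraint prunes the search
                    yield from search(path + [nxt])

        def hh_contacts(path):
            # non-consecutive H residues occupying adjacent lattice sites
            return sum(1 for i in range(len(path)) for j in range(i + 2, len(path))
                       if seq[i] == "H" == seq[j]
                       and sum(abs(c - d) for c, d in zip(path[i], path[j])) == 1)

        best = max(search([(0, 0, 0)]), key=hh_contacts)
        print(best, hh_contacts(best))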

  4. An Expert System Framework for Adaptive Evidential Reasoning: Application to In-Flight Route Re-Planning

    DTIC Science & Technology

    1986-03-21

    ...itative frameworks (e.g., Doyle, Toulmin, P. Cohen), and efforts to synthesize logic and probability (Nilsson... logic allows for provisional acceptance of uncertain premises, which may later be retracted when they lead to contradictory conclusions. Toulmin (1958... AI researchers] have accepted without hesitation as impeccable." The basic framework of an argument, according to Toulmin, is as follows (Toulmin

  5. Personal Epistemology of Urban Elementary School Teachers

    ERIC Educational Resources Information Center

    Pearrow, Melissa; Sanchez, William

    2008-01-01

    Personal epistemology, originating from social construction theory, provides a framework for researchers to understand how individuals view their world. The Attitudes About Reality (AAR) scale is one survey method that qualitatively assesses personal epistemology along the logical positivist and social constructionist continuum; however, the…

  6. Studies in optical parallel processing. [All optical and electro-optic approaches

    NASA Technical Reports Server (NTRS)

    Lee, S. H.

    1978-01-01

    Threshold and A/D devices for converting a gray scale image into a binary one were investigated for all-optical and opto-electronic approaches to parallel processing. Integrated optical logic circuits (IOC) and optical parallel logic devices (OPAL) were studied as an approach to processing optical binary signals. In the IOC logic scheme, a single row of an optical image is coupled into the IOC substrate at a time through an array of optical fibers. Parallel processing is carried out, on each image element of these rows, in the IOC substrate and the resulting output exits via a second array of optical fibers. The OPAL system for parallel processing, which uses a Fabry-Perot interferometer for image thresholding and analog-to-digital conversion, achieves a higher degree of parallel processing than is possible with IOC.

  7. Two autowire versions for CDC-3200 and IBM-360

    NASA Technical Reports Server (NTRS)

    Billingsley, J. B.

    1972-01-01

    A microelectronics program was initiated to evaluate the circuitry, packaging methods, and fabrication approaches necessary to produce a completely procured logic system. Two autowire programs were developed for CDC-3200 and IBM-360 computers for use in designing logic systems.

  8. A Fuzzy Logic Optimal Control Law Solution to the CMMCA Tracking Problem

    DTIC Science & Technology

    1993-03-01

    or from a transfer function. Many times, however, the resulting algorithms are so complex as to be completely or essentially useless. Applications... implemented in a nearly real time computer simulation. Located within the LQ framework are all the performance data for both the CMMCA and the CX... required nor desired. A more general and less exacting framework was used. In order to concentrate on the theory and problem solution, it was

  9. Understanding the dynamic effects of returning patients toward emergency department density

    NASA Astrophysics Data System (ADS)

    Ahmad, Norazura; Zulkepli, Jafri; Ramli, Razamin; Ghani, Noraida Abdul; Teo, Aik Howe

    2017-11-01

    This paper presents the development of a dynamic hypothesis for the effect of returning patients to the emergency department (ED). A logical tree from the Theory of Constraints known as the Current Reality Tree was used to identify the key variables. Then, a hypothetical framework portraying the interrelated variables and their influencing relationships was developed using causal loop diagrams (CLD). The conceptual framework was designed as the basis for the development of a system dynamics model.

  10. An intelligent tutoring system for space shuttle diagnosis

    NASA Technical Reports Server (NTRS)

    Johnson, William B.; Norton, Jeffrey E.; Duncan, Phillip C.

    1988-01-01

    An Intelligent Tutoring System (ITS) transcends conventional computer-based instruction. An ITS is capable of monitoring and understanding student performance, thereby providing feedback, explanation, and remediation. This is accomplished by including models of the student, the instructor, and the expert technician or operator in the domain of interest. The space shuttle fuel cell is the technical domain for the project described below. One system, Microcomputer Intelligence for Technical Training (MITT), demonstrates that ITSs can be developed and delivered, with a reasonable amount of effort and in a short period of time, on a microcomputer. The MITT system capitalizes on the diagnostic training approach called Framework for Aiding the Understanding of Logical Troubleshooting (FAULT) (Johnson, 1987). The system's embedded procedural expert was developed with NASA's C Language Integrated Production System (CLIPS) expert system shell (Culbert, 1987).

  11. Variable steroid receptor responses: Intrinsically disordered AF1 is the key

    PubMed Central

    Simons, S. Stoney; Kumar, Raj

    2013-01-01

    Steroid hormones, acting through their cognate receptor proteins, see widespread clinical applications due to their ability to alter the induction or repression of numerous genes. However, steroid usage is limited by the current inability to control off-target, or non-specific, side-effects. Recent results from three separate areas of research with glucocorticoid and other steroid receptors (cofactor-induced changes in receptor structure, the ability of ligands to alter remote regions of receptor structure, and how cofactor concentration affects both ligand potency and efficacy) indicate that a key element of receptor activity is the intrinsically disordered amino-terminal domain. These results are combined to construct a novel framework within which to logically pursue various approaches that could afford increased selectivity in steroid-based therapies. PMID:23792173

  12. A Pilot Study on Modeling of Diagnostic Criteria Using OWL and SWRL.

    PubMed

    Hong, Na; Jiang, Guoqian; Pathak, Jyotishiman; Chute, Christopher G

    2015-01-01

    The objective of this study is to describe our efforts in a pilot study on modeling diagnostic criteria using a Semantic Web-based approach. We reused the basic framework of the ICD-11 content model and refined it into an operational model in the Web Ontology Language (OWL). The refinement is based on a bottom-up analysis method, in which we analyzed data elements (including value sets) in a collection (n=20) of randomly selected diagnostic criteria. We also performed a case study to formalize rule logic in the diagnostic criteria of metabolic syndrome using the Semantic Web Rule Language (SWRL). The results demonstrated that it is feasible to use OWL and SWRL to formalize the diagnostic criteria knowledge, and to execute the rules through reasoning.
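
    A minimal executable rendering of a rule with the metabolic syndrome "any 3 of 5" shape (the thresholds below are illustrative ATP III-style values, not quoted from the paper, which encodes its rules in OWL/SWRL instead):

        def metabolic_syndrome(waist_cm, triglycerides, hdl, systolic, diastolic, glucose):
            criteria = [
                waist_cm > 102,                      # abdominal obesity (men)
                triglycerides >= 150,                # mg/dL
                hdl < 40,                            # mg/dL (men)
                systolic >= 130 or diastolic >= 85,  # mmHg
                glucose >= 100,                      # fasting, mg/dL
            ]
            return sum(criteria) >= 3                # polythetic "3 of 5" rule

        print(metabolic_syndrome(110, 180, 38, 128, 80, 95))   # -> True (3 criteria met)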

  13. Adaptive Fuzzy Output Constrained Control Design for Multi-Input Multioutput Stochastic Nonstrict-Feedback Nonlinear Systems.

    PubMed

    Li, Yongming; Tong, Shaocheng

    2017-12-01

    In this paper, an adaptive fuzzy output constrained control design approach is addressed for multi-input multioutput uncertain stochastic nonlinear systems in nonstrict-feedback form. The nonlinear systems addressed in this paper possess unstructured uncertainties, unknown gain functions and unknown stochastic disturbances. Fuzzy logic systems are utilized to tackle the problem of unknown nonlinear uncertainties. The barrier Lyapunov function technique is employed to solve the output constrained problem. In the framework of backstepping design, an adaptive fuzzy control design scheme is constructed. All the signals in the closed-loop system are proved to be bounded in probability and the system outputs are constrained in a given compact set. Finally, the applicability of the proposed controller is well carried out by a simulation example.
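
    A standard barrier Lyapunov function used for such output constraints (a common construction, not necessarily the paper's exact choice) is, for a tracking error z required to satisfy |z| < k_b,

        V(z) = \frac{1}{2}\ln\frac{k_b^2}{k_b^2 - z^2}

    which is positive definite on |z| < k_b and grows without bound as |z| \to k_b, so keeping V bounded along the closed-loop trajectories keeps the error inside the constraint.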

  14. A review of causal inference for biomedical informatics

    PubMed Central

    Kleinberg, Samantha; Hripcsak, George

    2011-01-01

    Causality is an important concept throughout the health sciences and is particularly vital for informatics work such as finding adverse drug events or risk factors for disease using electronic health records. While philosophers and scientists have worked for centuries to formalize what makes something a cause without reaching a consensus, new methods for inference show that we can make progress in this area in many practical cases. This article reviews core concepts in understanding and identifying causality and then reviews current computational methods for inference and explanation, focusing on inference from large-scale observational data. While the problem is not fully solved, we show that graphical models and Granger causality provide useful frameworks for inference and that a more recent approach based on temporal logic addresses some of the limitations of these methods. PMID:21782035
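
    Of the methods surveyed, Granger causality is the most readily reproduced; a minimal sketch using the statsmodels package is shown below, with synthetic data and illustrative lag choices standing in for real clinical time series.

        # Sketch: testing whether series x Granger-causes series y using
        # statsmodels (synthetic data; illustrative only).
        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        x = rng.normal(size=500)
        y = np.zeros(500)
        for t in range(2, 500):                 # y depends on lagged x
            y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + rng.normal(scale=0.5)

        # Column order: [effect, candidate cause]; tests H0 "x does not
        # Granger-cause y" for lags 1..2.
        data = np.column_stack([y, x])
        results = grangercausalitytests(data, maxlag=2)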

  15. Extending XNAT Platform with an Incremental Semantic Framework

    PubMed Central

    Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael

    2017-01-01

    Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, which are pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities; however, distributed data integration is still difficult because of the need for explicit agreements for disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but their application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia related diseases. PMID:28912709
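
    As a rough illustration of the layered idea (not the APGeM schema; the library choice, IRIs, and terms below are assumptions), low-level records can be exposed as ontology-aligned RDF triples that higher abstraction layers query without touching the storage model:

        # Sketch using rdflib: an imaging record exposed through an
        # ontology-aligned RDF layer. IRIs and terms are hypothetical.
        from rdflib import Graph, Namespace, Literal, RDF

        EX = Namespace("http://example.org/xnat/")
        ONTO = Namespace("http://example.org/neuro-onto#")

        g = Graph()
        subj = EX["subject_001"]
        sess = EX["session_001_mr1"]
        g.add((subj, RDF.type, ONTO.Patient))
        g.add((sess, RDF.type, ONTO.MRISession))
        g.add((sess, ONTO.belongsTo, subj))
        g.add((sess, ONTO.scannerFieldStrengthTesla, Literal(3.0)))

        # Higher layers query over ontology terms, not low-level tables.
        q = "SELECT ?s WHERE { ?s a <http://example.org/neuro-onto#MRISession> }"
        for row in g.query(q):
            print(row.s)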

  16. Extending XNAT Platform with an Incremental Semantic Framework.

    PubMed

    Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael

    2017-01-01

    Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, which are pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities; however, distributed data integration is still difficult because of the need for explicit agreements for disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but their application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia related diseases.

  17. Questions for Assessing Higher-Order Cognitive Skills: It's Not Just Bloom’s

    PubMed Central

    Lemons, Paula P.; Lemons, J. Derrick

    2013-01-01

    We present an exploratory study of biologists’ ideas about higher-order cognition questions. We documented the conversations of biologists who were writing and reviewing a set of higher-order cognition questions. Using a qualitative approach, we identified the themes of these conversations. Biologists in our study used Bloom's Taxonomy to logically analyze questions. However, biologists were also concerned with question difficulty, the length of time required for students to address questions, and students’ experience with questions. Finally, some biologists demonstrated an assumption that questions should have one correct answer, not multiple reasonable solutions; this assumption undermined their comfort with some higher-order cognition questions. We generated a framework for further research that provides an interpretation of participants’ ideas about higher-order questions and a model of the relationships among these ideas. Two hypotheses emerge from this framework. First, we propose that biologists look for ways to measure difficulty when writing higher-order questions. Second, we propose that biologists’ assumptions about the role of questions in student learning strongly influence the types of higher-order questions they write. PMID:23463228

  18. Automation of 3D reconstruction of neural tissue from large volume of conventional serial section transmission electron micrographs.

    PubMed

    Mishchenko, Yuriy

    2009-01-30

    We describe an approach for automating the reconstruction of neural tissue from serial section transmission electron micrographs. Such reconstructions require 3D segmentation of individual neuronal processes (axons and dendrites) performed in densely packed neuropil. We first detect neuronal cell profiles in each image in a stack of serial micrographs with a multi-scale ridge detector. Short breaks in detected boundaries are interpolated using anisotropic contour completion formulated in a fuzzy-logic framework. Detected profiles from adjacent sections are linked together based on cues such as shape similarity and image texture. The 3D segmentation thus obtained is validated by human operators in a computer-guided proofreading process. Our approach makes possible reconstructions of neural tissue at a final rate of about 5 microm3/man-hour, as determined primarily by the speed of proofreading. To date we have applied this approach to reconstruct a few blocks of neural tissue from different regions of rat brain totaling over 1000 microm3, and used these to evaluate reconstruction speed, quality, error rates, and the presence of ambiguous locations in neuropil ssTEM imaging data.
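
    The first pipeline stage (boundary detection with a multi-scale ridge detector) can be approximated today with off-the-shelf filters; the sketch below uses scikit-image, which the 2009 paper itself did not use, and a random image and ad hoc threshold as placeholders.

        # Sketch of multi-scale ridge (membrane) detection with scikit-image;
        # synthetic image stands in for an ssTEM section, threshold illustrative.
        import numpy as np
        from skimage.filters import sato

        rng = np.random.default_rng(1)
        image = rng.random((128, 128))          # placeholder for one EM section

        # Ridge response accumulated over several scales (sigmas).
        ridges = sato(image, sigmas=range(1, 4), black_ridges=True)
        boundaries = ridges > ridges.mean() + 2 * ridges.std()
        print(boundaries.sum(), "candidate boundary pixels")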

  19. Creating an Environmental Justice Framework for Policy Change in Childhood Asthma: A Grassroots to Treetops Approach

    PubMed Central

    Sargent, Katherine; Arons, Abigail; Standish, Marion; Brindis, Claire D.

    2011-01-01

    Objectives. The Community Action to Fight Asthma Initiative, a network of coalitions and technical assistance providers in California, employed an environmental justice approach to reduce risk factors for asthma in school-aged children. Policy advocacy focused on housing, schools, and outdoor air quality. Technical assistance partners from environmental science, policy advocacy, asthma prevention, and media assisted in advocacy. An evaluation team assessed progress and outcomes. Methods. A theory of change and corresponding logic model were used to document coalition development and successes. Site visits, surveys, policymaker interviews, and participation in meetings documented the processes and outcomes. Quantitative and qualitative data were analyzed to assess strategies, successes, and challenges. Results. Coalitions, working with community residents and technical assistance experts, successfully advocated for policies to reduce children's exposures to environmental triggers, particularly in low-income communities and communities of color. Policies were implemented at various levels. Conclusions. Environmental justice approaches to policy advocacy could be an effective strategy to address inequities across communities. Strong technical assistance, close community involvement, and multilevel strategies were all essential to effective policies to reduce environmental inequities. PMID:21836108

  20. Quantum Weak Values and Logic: An Uneasy Couple

    NASA Astrophysics Data System (ADS)

    Svensson, Bengt E. Y.

    2017-03-01

    Quantum mechanical weak values of projection operators have been used to answer which-way questions, e.g. to trace which arms in a multiple Mach-Zehnder setup a particle may have traversed from a given initial to a prescribed final state. I show that this procedure might lead to logical inconsistencies, in the sense that different methods used to answer composite questions, like "Has the particle traversed the way X or the way Y?", may result in different answers. I illustrate the problem by considering some examples: the "quantum pigeonhole" framework of Aharonov et al., the three-box problem, and Hardy's paradox. To prepare the ground for my main conclusion on the incompatibility in certain cases of weak values and logic, I study the corresponding situation for strong/projective measurements. In this case, no logical inconsistencies occur provided one is always careful in specifying exactly to which ensemble or sample space one refers. My results cast doubts on the utility of quantum weak values in treating cases like the examples mentioned.
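
    For reference, the weak value of a projection operator for preselected state |psi> and postselected state |phi> is the standard expression below (the general definition, not something specific to this paper):

        % Weak value of a projector \Pi with preselection |\psi\rangle and
        % postselection |\phi\rangle (standard definition)
        \Pi_{w} = \frac{\langle\phi|\,\Pi\,|\psi\rangle}{\langle\phi|\psi\rangle}

    Because Pi_w can lie outside [0, 1] and even be negative, conjoining such "which-way" answers need not obey classical logic, which is precisely the tension the paper examines.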

  1. MOE vs. M&E: considering the difference between measuring strategic effectiveness and monitoring tactical evaluation.

    PubMed

    Diehl, Glen; Major, Solomon

    2015-01-01

    Measuring the effectiveness of military Global Health Engagements (GHEs) has become an area of increasing interest to the military medical field. As a result, there have been efforts to more logically and rigorously evaluate GHE projects and programs; many of these have been based on the Logic and Results Frameworks. However, while these Frameworks are apt and appropriate planning tools, they are not ideally suited to measuring programs' effectiveness. This article introduces military medicine professionals to the Measures of Effectiveness for Defense Engagement and Learning (MODEL) program, which implements a new method of assessment, one that seeks to rigorously use Measures of Effectiveness (vs. Measures of Performance) to gauge programs' and projects' success and fidelity to Theater Campaign goals. While the MODEL method draws on the Logic and Results Frameworks where appropriate, it goes beyond their planning focus by using the latest social scientific and econometric evaluation methodologies to link on-the-ground GHE "lines of effort" to the realization of national and strategic goals and end-states. It is hoped these methods will find use beyond the MODEL project itself, and will catalyze a new body of rigorous, empirically based work, which measures the effectiveness of a broad spectrum of GHE and security cooperation activities. "We based our strategies on the principle that it is much more cost-effective to prevent conflicts than it is to stop one once it's started. I cannot overstate the importance of our theater security cooperation programs as the centerpiece to securing our Homeland from the irregular and catastrophic threats of the 21st Century." -GEN James L. Jones, USMC (Ret.). Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  2. Psycho-logic: some thoughts and after-thoughts.

    PubMed

    Smedslund, J

    2012-08-01

    The main features of the system of psycho-logic and its historical origins, especially in the writings of Heider and Piaget, are briefly reviewed. An updated version of the axioms of psycho-logic, and a list of the semantic primitives of Wierzbicka are presented. Some foundational questions are discussed, including the genetically determined limitations of human knowledge, the constructive, moral, and political nature of the approach, the role of fortuitous events, the ultimate limitations of psychological knowledge (the "balloon" to be inflated from the inside), the role of the subjective unconscious, and the implications of the approach for practice. © 2012 The Author. Scandinavian Journal of Psychology © 2012 The Scandinavian Psychological Associations.

  3. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA based agent communication language (ACL) with application specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
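
    To make the hybrid-framework idea concrete, a toy FIPA-ACL-style message carrying an IEC 61850-flavored payload might be modeled as below. This is a sketch, not the authors' implementation; the field layout is loosely based on the FIPA ACL specification, and MMXU/TotW appear only as typical IEC 61850 measurement identifiers.

        # Toy FIPA-ACL-style message whose content refers to an IEC 61850-style
        # logical node. Payload schema and agent names are hypothetical.
        from dataclasses import dataclass, field

        @dataclass
        class AclMessage:
            performative: str          # e.g. "inform", "request", "propose"
            sender: str
            receiver: str
            ontology: str = "iec61850"
            content: dict = field(default_factory=dict)

        msg = AclMessage(
            performative="inform",
            sender="feeder_agent_1",
            receiver="microgrid_controller",
            content={"logical_node": "MMXU1",     # measurement unit node
                     "attribute": "TotW.mag.f",    # total active power
                     "value_kw": 42.5})
        print(msg)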

  4. Cumulative biological impacts framework for solar energy projects in the California Desert

    USGS Publications Warehouse

    Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John

    2013-01-01

    This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270 meter resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly-available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.

  5. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE PAGES

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    2016-08-10

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the foundation for intelligent physical agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA based agent communication language (ACL) with application specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.

  6. Education in the workplace for the physician: clinical management states as an organizing framework.

    PubMed

    Greenes, R A

    2000-01-01

    Medical educators are interested in approaches to making selected relevant knowledge available in the context of problem-based care. This is of value both during the process of care and as a means of organizing information for offline self-study. Four trends in health information technology are relevant to achieving the goal and can be expected to play a growing role in the future. First, health care enterprises are developing approaches for access to information resources related to the care of a patient, including clinical data and images but also communication tools, referral and other logistic tools, decision support, and educational materials. Second, information for patients and methods for patient-doctor interaction and decision making are becoming available. Third, computer-based methods for representation of practice guidelines are being developed to support applications that can incorporate their logic. Finally, considering patients as being in particular "clinical management states" (or CMSs) for specific problems, approaches are being developed to use guidelines as a kind of "predictive" framework to enable development of interfaces for problem-based clinical encounters. The guidelines for a CMS can be used to identify the kinds of resources specifically needed for clinical encounters of that type. As the above trends converge to produce problem-specific environments, professional specialty organizations and continuing medical education course designers will need to focus energies on organizing and updating medical knowledge to make it available in CMS-specific contexts.

  7. A multimodal physical therapy approach utilizing the Maitland concept in the management of a patient with cervical and lumbar radiculitis and Ehlers-Danlos syndrome-hypermobility type: A case report.

    PubMed

    Pennetti, Adelina

    2018-07-01

    The purpose of this case report is to present a multimodal approach to patient management using the Maitland concept framework for cervical and lumbar radiculitis with an underlying diagnosis of Ehlers-Danlos Syndrome-Hypermobility Type (EDS-HT). The case presents care guided by evidence and patient values, with the rationale for the selected course of physical therapy treatment provided by therapist experience. A 35-year-old female with a 2-year history of worsening lumbar and cervical pain was referred to physical therapy to address these musculoskeletal issues concurrent with diagnostic testing for EDS. A multimodal approach including manual therapy, therapeutic exercise, postural and body mechanics education, and a home exercise program was used. The Patient-Specific Functional Scale (PSFS) was used to gauge the patient's perceived improvements, which were demonstrated by increased scores at reevaluation and at discharge. Following the Maitland concept framework, the physical therapist was able to make sound clinical decisions by tracking the logical flow of constant patient assessment. A 10-month course of treatment designed to maximize recovery of function was successful despite a chronic history of pain and the EDS-HT diagnosis. The role of education and empowering the patient is shown to be of utmost importance. Optimizing long-term therapeutic outcomes for this patient population requires maintaining a home exercise program and adapting and modifying work and lifestyle activities.

  8. The mathematics of a quantum Hamiltonian computing half adder Boolean logic gate.

    PubMed

    Dridi, G; Julien, R; Hliwa, M; Joachim, C

    2015-08-28

    The mathematics behind the quantum Hamiltonian computing (QHC) approach of designing Boolean logic gates with a quantum system is given. Using the quantum eigenvalue repulsion effect, the QHC AND, NAND, OR, NOR, XOR, and NXOR Hamiltonian Boolean matrices are constructed. This is applied to the construction of a QHC half adder Hamiltonian matrix requiring only six quantum states to fulfil the half-adder Boolean truth table. The QHC design rules open a nano-architectronic way of constructing Boolean logic gates inside a single molecule or atom by atom at the surface of a passivated semiconductor.
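
    The eigenvalue repulsion that QHC exploits is already visible in a generic two-level Hamiltonian; the textbook identity below is shown only to illustrate the mechanism, not the paper's six-state construction.

        % Two-level Hamiltonian and its eigenvalues (textbook identity)
        H = \begin{pmatrix} \epsilon_{1} & v \\ v^{*} & \epsilon_{2} \end{pmatrix},
        \qquad
        E_{\pm} = \frac{\epsilon_{1}+\epsilon_{2}}{2}
        \pm\sqrt{\left(\frac{\epsilon_{1}-\epsilon_{2}}{2}\right)^{2}+|v|^{2}}

    The splitting E_+ - E_- never closes while v is nonzero; QHC encodes logical inputs in such matrix elements and reads the output from the resulting spectral shifts, generalized in the paper to a six-state half-adder matrix.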

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Raedt, Hans; Katsnelson, Mikhail I.; Donker, Hylke C.

    It is shown that the Pauli equation and the concept of spin naturally emerge from logical inference applied to experiments on a charged particle under the conditions that (i) space is homogeneous, (ii) the observed events are logically independent, and (iii) the observed frequency distributions are robust with respect to small changes in the conditions under which the experiment is carried out. The derivation does not take recourse to concepts of quantum theory and is based on the same principles which have already been shown to lead to e.g. the Schrödinger equation and the probability distributions of pairs of particles in the singlet or triplet state. Application to Stern–Gerlach experiments with chargeless, magnetic particles provides additional support for the thesis that quantum theory follows from logical inference applied to a well-defined class of experiments. - Highlights: • The Pauli equation is obtained through logical inference applied to robust experiments on a charged particle. • The concept of spin appears as an inference resulting from the treatment of two-valued data. • The same reasoning yields the quantum theoretical description of neutral magnetic particles. • Logical inference provides a framework to establish a bridge between objective knowledge gathered through experiments and their description in terms of concepts.

  10. Quantum effects in the understanding of consciousness.

    PubMed

    Hameroff, Stuart R; Craddock, Travis J A; Tuszynski, Jack A

    2014-06-01

    This paper presents a historical perspective on the development and application of quantum physics methodology beyond physics, especially in biology and in the area of consciousness studies. Quantum physics provides a conceptual framework for the structural aspects of biological systems and processes via quantum chemistry. In recent years individual biological phenomena such as photosynthesis and bird navigation have been experimentally and theoretically analyzed using quantum methods, building conceptual foundations for quantum biology. Since consciousness is attributed to the human (and possibly animal) mind, quantum underpinnings of cognitive processes are a logical extension. Several proposals, especially the Orch OR hypothesis, have been put forth in an effort to introduce a scientific basis to the theory of consciousness. At the center of these approaches are microtubules as the substrate on which conscious processes in terms of quantum coherence and entanglement can be built. Additionally, Quantum Metabolism, quantum processes in ion channels and quantum effects in sensory stimulation are discussed in this connection. We discuss the challenges and merits related to quantum consciousness approaches as well as their potential extensions.

  11. Experimental Studies of Quasi-Adiabatic Quantum-dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Orlov, Alexei; Amlani, Islamshah; Kummamuru, Ravi; Toth, Geza; Bernstein, Gary; Lent, Craig; Snider, Gregory

    2000-03-01

    The computational approach known as Quantum-dot Cellular Automata (QCA) uses interacting quantum dots to encode and process binary information. The first realization of a functioning QCA cell has already been reported. Recently, quasi-adiabatic switching of QCA in a metal dot system near the instantaneous ground state was proposed [1]. The advantage of this approach is that it allows both logic and addressable memory to be implemented within the QCA framework. We report on the fabrication and measurement of such a device in the Al-AlOx tunnel junction system. This basic building block consists of three metal islands connected in series by tunnel junctions, where an electron can be moved between islands by means of electrostatic perturbation on either control electrodes or adjacent cells. The cell can have three operational modes, i.e. active, locked and null, which provide a solution for ground state computing that is not susceptible to metastable states. [1] G. Toth and C. S. Lent, J. Appl. Phys. 85, 2977-2984 (1999).

  12. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
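
    As background, a common combination function in adaptive multistage designs is the weighted inverse-normal rule for stagewise p-values, shown below as a standard form from this literature (the paper's exact choice may differ):

        % Weighted inverse-normal combination of stagewise p-values
        % (a standard choice; not necessarily the paper's exact function)
        C(p_{1},p_{2}) = 1-\Phi\!\left(w_{1}\,\Phi^{-1}(1-p_{1})+w_{2}\,\Phi^{-1}(1-p_{2})\right),
        \qquad w_{1}^{2}+w_{2}^{2}=1

    where Phi is the standard normal distribution function; a hypothesis is rejected at a decision point when the combined value falls below the prespecified boundary, and the fixed weights preserve the type I error rate under data-driven adaptations.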

  13. A logic-based method for integer programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hooker, J.; Natraj, N.R.

    1994-12-31

    We propose a logic-based approach to integer programming that replaces traditional branch-and-cut techniques with logical analogs. Integer variables are regarded as atomic propositions. The constraints give rise to logical formulas that are analogous to separating cuts. No continuous relaxation is used. Rather, the cuts are selected so that they can be easily solved as a discrete relaxation. (In fact, defining a relaxation and generating cuts are best seen as the same problem.) We experiment with relaxations that have a k-tree structure and can be solved by nonserial dynamic programming. We also present logic-based analogs of facet-defining cuts, Chvátal rank, etc. We conclude with some preliminary computational results.
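
    A toy rendering of the idea that logical formulas play the role of cuts (my illustration, not the authors' code): 0-1 inequalities map to clauses, and unit propagation acts as a cheap relaxation solver.

        # Toy "logical cut" illustration: the inequality x1 + x2 + x3 >= 1
        # maps to the clause (x1 or x2 or x3); x1 + x2 <= 1 maps to
        # (not x1 or not x2). A literal is (var, sign); sign False = negated.
        clauses = [
            [("x1", True), ("x2", True), ("x3", True)],   # x1 + x2 + x3 >= 1
            [("x1", False), ("x2", False)],               # x1 + x2 <= 1
        ]
        assignment = {"x2": False, "x3": False}           # partial fixing

        def unit_propagate(clauses, assignment):
            """Repeatedly fix variables forced by clauses with one free literal."""
            changed = True
            while changed:
                changed = False
                for clause in clauses:
                    free = [(v, s) for v, s in clause if v not in assignment]
                    satisfied = any(assignment.get(v) == s for v, s in clause
                                    if v in assignment)
                    if not satisfied and len(free) == 1:
                        v, s = free[0]
                        assignment[v] = s
                        changed = True
            return assignment

        print(unit_propagate(clauses, assignment))   # forces x1 = True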

  14. Topological Properties of Some Integrated Circuits for Very Large Scale Integration Chip Designs

    NASA Astrophysics Data System (ADS)

    Swanson, S.; Lanzerotti, M.; Vernizzi, G.; Kujawski, J.; Weatherwax, A.

    2015-03-01

    This talk presents topological properties of integrated circuits for Very Large Scale Integration chip designs. These circuits can be implemented in very large scale integrated circuits, such as those in high performance microprocessors. Prior work considered basic combinational logic functions and produced a mathematical framework based on algebraic topology for integrated circuits composed of logic gates. Prior work also produced an historically-equivalent interpretation of Mr. E. F. Rent's work for today's complex circuitry in modern high performance microprocessors, where a heuristic linear relationship was observed between the number of connections and number of logic gates. This talk will examine topological properties and connectivity of more complex functionally-equivalent integrated circuits. The views expressed in this article are those of the author and do not reflect the official policy or position of the United States Air Force, Department of Defense or the U.S. Government.
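
    The heuristic relationship referred to here is conventionally written as Rent's power law, which is linear on log-log axes (the standard VLSI formulation, given for reference):

        % Rent's rule: external terminals T versus gate count G (standard form)
        T = t\,G^{p}, \qquad \log T = \log t + p\,\log G

    where t is the average number of terminals per gate and 0 <= p <= 1 is the Rent exponent characterizing the interconnect complexity of the design.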

  15. Language, procedures, and the non-perceptual origin of number word meanings.

    PubMed

    Barner, David

    2017-05-01

    Perceptual representations of objects and approximate magnitudes are often invoked as building blocks that children combine to acquire the positive integers. Systems of numerical perception are either assumed to contain the logical foundations of arithmetic innately, or to supply the basis for their induction. I propose an alternative to this framework, and argue that the integers are not learned from perceptual systems, but arise to explain perception. Using cross-linguistic and developmental data, I show that small (~1-4) and large (~5+) numbers arise both historically and in individual children via distinct mechanisms, constituting independent learning problems, neither of which begins with perceptual building blocks. Children first learn small numbers using the same logic that supports other linguistic number marking (e.g. singular/plural). Years later, they infer the logic of counting from the relations between large number words and their roles in blind counting procedures, only incidentally associating number words with approximate magnitudes.

  16. Assessment of Seismic Damage on The Exist Buildings Using Fuzzy Logic

    NASA Astrophysics Data System (ADS)

    Usta, Pınar; Morova, Nihat; Evci, Ahmet; Ergün, Serap

    2018-01-01

    Earthquakes, as natural disasters, endanger the lives of many people and damage buildings all over the world. The seismic vulnerability of buildings therefore needs to be evaluated. Accurate evaluation of damage sustained by buildings during natural disaster events is critical to determining building safety and suitability for future occupancy. The earthquake is one of the disasters that structures face most often, so there is a need to evaluate the seismic damage and vulnerability of buildings in order to protect them. Fuzzy systems are now widely used in different fields of science because of their simplicity and efficiency, and fuzzy logic provides a suitable framework for reasoning, deduction, and decision making under fuzzy conditions. In this paper, studies in the literature on earthquake hazard evaluation of buildings using fuzzy logic modeling concepts are investigated and evaluated as a whole.

  17. Active matter logic for autonomous microfluidics

    NASA Astrophysics Data System (ADS)

    Woodhouse, Francis G.; Dunkel, Jörn

    2017-04-01

    Chemically or optically powered active matter plays an increasingly important role in materials design, but its computational potential has yet to be explored systematically. The competition between energy consumption and dissipation imposes stringent physical constraints on the information transport in active flow networks, facilitating global optimization strategies that are not well understood. Here, we combine insights from recent microbial experiments with concepts from lattice-field theory and non-equilibrium statistical mechanics to introduce a generic theoretical framework for active matter logic. Highlighting conceptual differences with classical and quantum computation, we demonstrate how the inherent non-locality of incompressible active flow networks can be utilized to construct universal logical operations, Fredkin gates and memory storage in set-reset latches through the synchronized self-organization of many individual network components. Our work lays the conceptual foundation for developing autonomous microfluidic transport devices driven by bacterial fluids, active liquid crystals or chemically engineered motile colloids.

  18. Competing Logics and Healthcare

    PubMed Central

    Saks, Mike

    2018-01-01

    This paper offers a short commentary on the editorial by Mannion and Exworthy. The paper highlights the positive insights offered by their analysis into the tensions between the competing institutional logics of standardization and customization in healthcare, in part manifested in the conflict between managers and professionals, and endorses the plea of the authors for further research in this field. However, the editorial is criticized for its lack of a strong societal reference point, the comparative absence of focus on hybridization, and its failure to highlight structural factors impinging on the opposing logics in a broader neo-institutional framework. With reference to the Procrustean metaphor, it is argued that greater stress should be placed on the healthcare user in future health policy. Finally, the case of complementary and alternative medicine is set out which – while not explicitly mentioned in the editorial – most effectively concretizes the tensions at the heart of this analysis of healthcare. PMID:29626406

  19. Community science, philosophy of science, and the practice of research.

    PubMed

    Tebes, Jacob Kraemer

    2005-06-01

    Embedded in community science are implicit theories on the nature of reality (ontology), the justification of knowledge claims (epistemology), and how knowledge is constructed (methodology). These implicit theories influence the conceptualization and practice of research, and open up or constrain its possibilities. The purpose of this paper is to make some of these theories explicit, trace their intellectual history, and propose a shift in the way research in the social and behavioral sciences, and community science in particular, is conceptualized and practiced. After describing the influence and decline of logical empiricism, the underlying philosophical framework for science for the past century, I summarize contemporary views in the philosophy of science that are alternatives to logical empiricism. These include contextualism, normative naturalism, and scientific realism, and propose that a modified version of contextualism, known as perspectivism, affords the philosophical framework for an emerging community science. I then discuss the implications of perspectivism for community science in the form of four propositions to guide the practice of research.

  20. New mode switching algorithm for the JPL 70-meter antenna servo controller

    NASA Technical Reports Server (NTRS)

    Nickerson, J. A.

    1988-01-01

    The design of control mode switching algorithms and logic for JPL's 70 m antenna servo controller is described. The old control mode switching logic was reviewed and perturbation problems were identified. Design approaches for mode switching are presented and the final design is described. Simulations used to compare the old and new mode switching algorithms and logic show that the new mode switching techniques will significantly reduce perturbation problems.

  1. An Adaptive Fuzzy-Logic Traffic Control System in Conditions of Saturated Transport Stream

    PubMed Central

    Marakhimov, A. R.; Igamberdiev, H. Z.; Umarov, Sh. X.

    2016-01-01

    This paper considers the problem of building adaptive fuzzy-logic traffic control systems (AFLTCS) to deal with information fuzziness and uncertainty in the case of heavy traffic streams. Methods for the formal description of traffic control at crossroads based on fuzzy sets and fuzzy logic are proposed. This paper also provides efficient algorithms for implementing AFLTCS and develops appropriate simulation models to test the efficiency of the suggested approach. PMID:27517081
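
    A minimal Mamdani-style sketch of the kind of fuzzy controller described is given below; the membership functions, rule base, and defuzzification are all illustrative choices, not the AFLTCS design from the paper.

        # Minimal Mamdani-style fuzzy controller: green-phase extension from
        # queue length. All membership functions and rules are illustrative.

        def tri(x, a, b, c):
            """Triangular membership function."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def controller(queue_len):
            # Fuzzify the input.
            short = tri(queue_len, -1, 0, 10)
            medium = tri(queue_len, 5, 15, 25)
            long_ = tri(queue_len, 20, 40, 60)
            # Rules: short -> 0 s, medium -> 10 s, long -> 20 s extension;
            # weighted average of rule consequents as defuzzification.
            num = short * 0.0 + medium * 10.0 + long_ * 20.0
            den = short + medium + long_
            return num / den if den else 0.0

        for q in (3, 15, 35):
            print(q, "cars ->", round(controller(q), 1), "s extension")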

  2. Application of Fuzzy Logic to Matrix FMECA

    NASA Astrophysics Data System (ADS)

    Shankar, N. Ravi; Prabhu, B. S.

    2001-04-01

    A methodology combining the benefits of Fuzzy Logic and Matrix FMEA is presented in this paper. The presented methodology extends risk prioritization beyond the conventional Risk Priority Number (RPN) method. Fuzzy logic is used to calculate the criticality rank. The matrix approach is also improved further to develop a pictorial representation retaining all relevant qualitative and quantitative information about the relationships among several FMEA elements. The methodology presented is demonstrated by application to an illustrative example.

  3. LOGIC OF CONTROLLED THRESHOLD DEVICES.

    DTIC Science & Technology

    The synthesis of threshold logic circuits from several points of view is presented. The first approach is applicable to resistor-transistor networks...in which the outputs are tied to a common collector resistor. In general, fewer threshold logic gates than NOR gates connected to a common collector...network to realize a specified function such that the failure of any but the output gate can be compensated for by a change in the threshold level (and
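
    A threshold gate computes a weighted-sum comparison, which is why a single gate can replace several NOR gates for functions such as majority; a minimal sketch with illustrative weights and thresholds:

        # Threshold logic gate: output is 1 when the weighted input sum
        # reaches the threshold. Weights/thresholds below are illustrative.

        def threshold_gate(inputs, weights, theta):
            return int(sum(w * x for w, x in zip(weights, inputs)) >= theta)

        AND  = lambda a, b: threshold_gate((a, b), (1, 1), 2)
        OR   = lambda a, b: threshold_gate((a, b), (1, 1), 1)
        MAJ3 = lambda a, b, c: threshold_gate((a, b, c), (1, 1, 1), 2)

        for bits in [(0, 0, 1), (0, 1, 1), (1, 1, 1)]:
            print(bits, "-> majority", MAJ3(*bits))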

  4. The Quantum Logical Challenge: Peter Mittelstaedt's Contributions to Logic and Philosophy of Science

    NASA Astrophysics Data System (ADS)

    Beltrametti, E.; Dalla Chiara, M. L.; Giuntini, R.

    2017-12-01

    Peter Mittelstaedt's contributions to quantum logic and to the foundational problems of quantum theory have significantly realized the most authentic spirit of the International Quantum Structures Association: an original research about hard technical problems, which are often "entangled" with the emergence of important changes in our general world-conceptions. During a time where both the logical and the physical community often showed a skeptical attitude towards Birkhoff and von Neumann's quantum logic, Mittelstaedt brought into light the deeply innovating features of a quantum logical thinking that allows us to overcome some strong and unrealistic assumptions of classical logical arguments. Later on his intense research on the unsharp approach to quantum theory and to the measurement problem stimulated the increasing interest for unsharp forms of quantum logic, creating a fruitful interaction between the work of quantum logicians and of many-valued logicians. Mittelstaedt's general views about quantum logic and quantum theory seem to be inspired by a conjecture that is today more and more confirmed: there is something universal in the quantum theoretic formalism that goes beyond the limits of microphysics, giving rise to interesting applications to a number of different fields.

  5. Use of Knowledge Base Systems (EMDS) in Strategic and Tactical Forest Planning

    NASA Astrophysics Data System (ADS)

    Jensen, M. E.; Reynolds, K.; Stockmann, K.

    2008-12-01

    The USDA Forest Service 2008 Planning Rule requires Forest plans to provide a strategic vision for maintaining the sustainability of ecological, economic, and social systems across USFS lands through the identification of desired conditions and objectives. In this paper we show how knowledge-based systems can be efficiently used to evaluate disparate natural resource information to assess desired conditions and related objectives in Forest planning. We use the Ecosystem Management Decision Support (EMDS) system (http://www.institute.redlands.edu/emds/), which facilitates development of both logic-based models for evaluating ecosystem sustainability (desired conditions) and decision models to identify priority areas for integrated landscape restoration (objectives). The study area for our analysis spans 1,057 subwatersheds within western Montana and northern Idaho. Results of our study suggest that knowledge-based systems such as EMDS are well suited to both strategic and tactical planning and that the following points merit consideration in future National Forest (and other land management) planning efforts: 1) Logic models provide a consistent, transparent, and reproducible method for evaluating broad propositions about ecosystem sustainability such as: are watershed integrity, ecosystem and species diversity, social opportunities, and economic integrity in good shape across a planning area? The ability to evaluate such propositions in a formal logic framework also allows users the opportunity to evaluate statistical changes in outcomes over time, which could be very useful for regional and national reporting purposes and for addressing litigation; 2) The use of logic and decision models in strategic and tactical Forest planning provides a repository for expert knowledge (corporate memory) that is critical to the evaluation and management of ecosystem sustainability over time. This is especially true for the USFS and other federal resource agencies, which are likely to experience rapid turnover in tenured resource specialist positions within the next five years due to retirements; 3) Use of logic model output in decision models is an efficient method for synthesizing the typically large amounts of information needed to support integrated landscape restoration. Moreover, use of logic and decision models to design customized scenarios for integrated landscape restoration, as we have demonstrated with EMDS, offers substantial improvements to traditional GIS-based procedures such as suitability analysis. To our knowledge, this study represents the first attempt to link evaluations of desired conditions for ecosystem sustainability in strategic planning to tactical planning regarding the location of subwatersheds that best meet the objectives of integrated landscape restoration. The basic knowledge-based approach implemented in EMDS, with its logic (NetWeaver) and decision (Criterion Decision Plus) engines, is well suited both to multi-scale strategic planning and to multi-resource tactical planning.

  6. Dual Logic and Cerebral Coordinates for Reciprocal Interaction in Eye Contact

    PubMed Central

    Lee, Ray F.

    2015-01-01

    In order to scientifically study the human brain’s response to face-to-face social interaction, the scientific method itself needs to be reconsidered so that both quantitative observation and symbolic reasoning can be adapted to the situation where the observer is also observed. In light of the recent development of dyadic fMRI which can directly observe dyadic brain interacting in one MRI scanner, this paper aims to establish a new form of logic, dual logic, which provides a theoretical platform for deductive reasoning in a complementary dual system with emergence mechanism. Applying the dual logic in the dfMRI experimental design and data analysis, the exogenous and endogenous dual systems in the BOLD responses can be identified; the non-reciprocal responses in the dual system can be suppressed; a cerebral coordinate for reciprocal interaction can be generated. Elucidated by dual logic deductions, the cerebral coordinate for reciprocal interaction suggests: the exogenous and endogenous systems consist of the empathy network and the mentalization network respectively; the default-mode network emerges from the resting state to activation in the endogenous system during reciprocal interaction; the cingulate plays an essential role in the emergence from the exogenous system to the endogenous system. Overall, the dual logic deductions are supported by the dfMRI experimental results and are consistent with current literature. Both the theoretical framework and experimental method set the stage to formally apply the scientific method in studying complex social interaction. PMID:25885446

  7. Winnicott and Derrida: development of logic-of-play.

    PubMed

    Bitan, Shachaf

    2012-02-01

    In this essay I develop the logic of play from the writings of the British psychoanalyst Donald W. Winnicott and the French philosopher Jacques Derrida. The logic of play serves as both a conceptual framework for theoretical clinical thinking and a space of experiencing in which the therapeutic situation is located and to which it aspires. I argue that both Winnicott and Derrida proposed a playful turn in Western thinking by their attitude towards oppositions, viewing them not as complementary or contradictory, but as 'peacefully-coexisting'. Derrida criticizes the dichotomous structure of Western thought, proposing playful movement as an alternative that does not constitute itself as a mastering construction. I will show that Winnicott, too, proposes playful logic through which he thinks and acts in the therapeutic situation. The therapeutic encounter is understood as a playful space in which analyst and analysand continuously coexist, instead of facing each other as exclusionary oppositions. I therefore propose the logic of play as the basis for the therapeutic encounter. The playful turn, then, is crucial for the thought and praxis expressed by the concept of two-person psychology. I suggest the term playful psychoanalysis to characterize the present perspective of psychoanalysis in the light of the playful turn. I will first present Derrida's playful thought, go on to Winnicott's playful revolutionism, and conclude with an analysis of Winnicott's clinical material in the light of the logic of play. Copyright © 2012 Institute of Psychoanalysis.

  8. FELERION: a new approach for leakage power reduction

    NASA Astrophysics Data System (ADS)

    R, Anjana; Somkuwar, Ajay

    2014-12-01

    The circuit proposed in this paper simultaneously reduces the subthreshold leakage power and preserves the logic state of the circuit. Sleep transistors and PMOS-only logic are used to further reduce the leakage power. Sleep transistors are used as keepers to reduce the subthreshold leakage current by providing a low-resistance path to the output. PMOS-only logic is used between the pull-up and pull-down devices to mitigate the leakage power further. Our proposed fast efficient leakage reduction circuit not only reduces the leakage current but also reduces the power dissipation. Power and delay are analyzed at the 32 nm node (BSIM4 model) for a chain of four inverters, NAND, NOR and ISCAS-85 c17 benchmark circuits using DSCH3 and the Microwind tool. The simulation results reveal that our proposed approach mitigates leakage power by 90%-94% as compared to the conventional approach.

  9. Autonomous vehicle motion control, approximate maps, and fuzzy logic

    NASA Technical Reports Server (NTRS)

    Ruspini, Enrique H.

    1993-01-01

    Progress on research on the control of actions of autonomous mobile agents using fuzzy logic is presented. The innovations described encompass theoretical and applied developments. At the theoretical level, results are presented of research leading to the combined utilization of conventional artificial planning techniques with fuzzy logic approaches for the control of local motion and perception actions. Formulations of dynamic programming approaches to optimal control in the context of the analysis of approximate models of the real world are also examined, as is a new approach to goal conflict resolution that does not require specification of numerical values representing relative goal importance. Applied developments include the introduction of the notion of an approximate map. A fuzzy relational database structure for the representation of vague and imprecise information about the robot's environment is proposed. The central notions of control point and control structure are also discussed.

  10. Binary information propagation in circular magnetic nanodot arrays using strain induced magnetic anisotropy

    NASA Astrophysics Data System (ADS)

    Salehi-Fashami, M.; Al-Rashid, M.; Sun, Wei-Yang; Nordeen, P.; Bandyopadhyay, S.; Chavez, A. C.; Carman, G. P.; Atulasimha, J.

    2016-10-01

    Nanomagnetic logic has emerged as a potential replacement for traditional Complementary Metal Oxide Semiconductor (CMOS) based logic because of superior energy-efficiency (Salahuddin and Datta 2007 Appl. Phys. Lett. 90 093503, Cowburn and Welland 2000 Science 287 1466-68). One implementation of nanomagnetic logic employs shape-anisotropic (e.g. elliptical) ferromagnets (with two stable magnetization orientations) as binary switches that rely on dipole-dipole interaction to communicate binary information (Cowburn and Welland 2000 Science 287 1466-8, Csaba et al 2002 IEEE Trans. Nanotechnol. 1 209-13, Carlton et al 2008 Nano Lett. 8 4173-8, Atulasimha and Bandyopadhyay 2010 Appl. Phys. Lett. 97 173105, Roy et al 2011 Appl. Phys. Lett. 99 063108, Fashami et al 2011 Nanotechnology 22 155201, Tiercelin et al 2011 Appl. Phys. Lett. 99 , Alam et al 2010 IEEE Trans. Nanotechnol. 9 348-51 and Bhowmik et al 2013 Nat. Nanotechnol. 9 59-63). Normally, circular nanomagnets are incompatible with this approach since they lack distinct stable in-plane magnetization orientations to encode bits. However, circular magnetoelastic nanomagnets can be made bi-stable with a voltage induced anisotropic strain, which provides two significant advantages for nanomagnetic logic applications. First, the shape-anisotropy energy barrier is eliminated which reduces the amount of energy required to reorient the magnetization. Second, the in-plane size can be reduced (˜20 nm) which was previously not possible due to thermal stability issues. In circular magnetoelastic nanomagnets, a voltage induced strain stabilizes the magnetization even at this size overcoming the thermal stability issue. In this paper, we analytically demonstrate the feasibility of a binary ‘logic wire’ implemented with an array of circular nanomagnets that are clocked with voltage-induced strain applied by an underlying piezoelectric substrate. This leads to an energy-efficient logic paradigm orders of magnitude superior to existing CMOS-based logic that is scalable to dimensions substantially smaller than those for existing nanomagnetic logic approaches. The analytical approach is validated with experimental measurements conducted on dipole coupled Nickel (Ni) nanodots fabricated on a PMN-PT (Lead Magnesium Niobate-Lead Titanate) sample.
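
    The strain-induced anisotropy invoked here is usually quantified by the magnetoelastic energy density; the standard expression for an isotropic magnetostrictive film is shown below for orientation (the paper's full model also includes dipole coupling and thermal effects, and sign conventions vary across the literature):

        % Magnetoelastic anisotropy energy density (standard form)
        E_{me} = \frac{3}{2}\,\lambda_{s}\,\sigma\,\sin^{2}\theta

    where lambda_s is the saturation magnetostriction, sigma the voltage-induced stress transferred from the piezoelectric substrate, and theta the angle between the magnetization and the stress axis; a stress of the appropriate sign creates two stable orientations in an otherwise isotropic circular dot.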

  11. Using logic models in a community-based agricultural injury prevention project.

    PubMed

    Helitzer, Deborah; Willging, Cathleen; Hathorn, Gary; Benally, Jeannie

    2009-01-01

    The National Institute for Occupational Safety and Health has long promoted the logic model as a useful tool in an evaluator's portfolio. Because a logic model supports a systematic approach to designing interventions, it is equally useful for program planners. Undertaken with community stakeholders, a logic model process articulates the underlying foundations of a particular programmatic effort and enhances program design and evaluation. Most often presented as sequenced diagrams or flow charts, logic models demonstrate relationships among the following components: statement of a problem, various causal and mitigating factors related to that problem, available resources to address the problem, theoretical foundations of the selected intervention, intervention goals and planned activities, and anticipated short- and long-term outcomes. This article describes a case example of how a logic model process was used to help community stakeholders on the Navajo Nation conceive, design, implement, and evaluate agricultural injury prevention projects.
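
    Reduced to its skeleton, a logic model is just the sequenced components the article lists; a generic sketch (all contents hypothetical) makes that structure explicit:

        # A logic model reduced to its skeleton: sequenced components linking
        # a problem statement to outcomes. Contents are generic/hypothetical,
        # shown only to illustrate the structure the article describes.
        logic_model = {
            "problem": "high rate of agricultural injuries",
            "resources": ["extension educators", "community coalition"],
            "theory": "health belief model",
            "activities": ["safety workshops", "equipment retrofits"],
            "short_term_outcomes": ["increased hazard awareness"],
            "long_term_outcomes": ["reduced injury incidence"],
        }

        for stage in ("problem", "resources", "theory", "activities",
                      "short_term_outcomes", "long_term_outcomes"):
            print(f"{stage:>20}: {logic_model[stage]}")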

  12. Formal Compiler Implementation in a Logical Framework

    DTIC Science & Technology

    2003-04-29

    variable set [], we omit the brackets and use the simpler notation v. MetaPRL is a tactic-based prover that uses OCaml [20] as its meta-language. When a...rewrite is defined in MetaPRL, the framework creates an OCaml expression that can be used to apply the rewrite. Code to guide the application of...rewrites is written in OCaml, using a rich set of primitives provided by MetaPRL. MetaPRL automates the construction of most guidance code; we describe

  13. Markov Logic Networks in the Analysis of Genetic Data

    PubMed Central

    Sakhanenko, Nikita A.

    2010-01-01

    Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249
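
    The core MLN computation is small enough to sketch: a world's probability is proportional to the exponential of the weighted count of satisfied ground formulas. Below is a toy example with a single rule, "marker implies phenotype"; the weight and atoms are hypothetical, not the yeast model.

        # Toy Markov Logic Network: worlds over two ground atoms and one
        # weighted formula "marker -> phenotype" with weight w. A world's
        # probability is proportional to exp(w * n_satisfied). Illustrative.
        import itertools, math

        w = 1.5
        worlds = list(itertools.product([False, True], repeat=2))  # (marker, pheno)

        def n_satisfied(marker, pheno):
            # "marker -> pheno" is violated only when marker and not pheno.
            return 0 if (marker and not pheno) else 1

        weights = [math.exp(w * n_satisfied(m, p)) for m, p in worlds]
        Z = sum(weights)                       # partition function
        for (m, p), wt in zip(worlds, weights):
            print(f"marker={m!s:5} pheno={p!s:5} P={wt / Z:.3f}")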

  14. Developing a good practice model to evaluate the effectiveness of comprehensive primary health care in local communities

    PubMed Central

    2014-01-01

    Background This paper describes the development of a model of Comprehensive Primary Health Care (CPHC) applicable to the Australian context. CPHC holds promise as an effective model of health system organization able to improve population health and increase health equity. However, there is little literature that describes and evaluates CPHC as a whole, with most evaluation focusing on specific programs. The lack of a consensus on what constitutes CPHC, and the complex and context-sensitive nature of CPHC are all barriers to evaluation. Methods The research was undertaken in partnership with six Australian primary health care services: four state government funded and managed services, one sexual health non-government organization, and one Aboriginal community controlled health service. A draft model was crafted combining program logic and theory-based approaches, drawing on relevant literature, 68 interviews with primary health care service staff, and researcher experience. The model was then refined through an iterative process involving two to three workshops at each of the six participating primary health care services, engaging health service staff, regional health executives and central health department staff. Results The resultant Southgate Model of CPHC in Australia articulates the theory of change of how and why CPHC service components and activities, based on the theory, evidence and values which underpin a CPHC approach, are likely to lead to individual and population health outcomes and increased health equity. The model captures the importance of context, the mechanisms of CPHC, and the space for action services have to work within. The process of development engendered and supported collaborative relationships between researchers and stakeholders and the product provided a description of CPHC as a whole and a framework for evaluation. The model was endorsed at a research symposium involving investigators, service staff, and key stakeholders. Conclusions The development of a theory-based program logic model provided a framework for evaluation that allows the tracking of progress towards desired outcomes and exploration of the particular aspects of context and mechanisms that produce outcomes. This is important because there are no existing models which enable the evaluation of CPHC services in their entirety. PMID:24885812

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that dispatches several functionalities: (1) derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (2) perform both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitate input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  16. An Analysis of Categorical and Quantitative Methods for Planning Under Uncertainty

    PubMed Central

    Langlotz, Curtis P.; Shortliffe, Edward H.

    1988-01-01

    Decision theory and logical reasoning are both methods for representing and solving medical decision problems. We analyze the usefulness of these two approaches to medical therapy planning by establishing a simple correspondence between decision theory and non-monotonic logic, a formalization of categorical logical reasoning. The analysis indicates that categorical approaches to planning can be viewed as comprising two decision-theoretic concepts: probabilities (degrees of belief in planning hypotheses) and utilities (degrees of desirability of planning outcomes). We present and discuss examples of the following lessons from this decision-theoretic view of categorical (nonmonotonic) reasoning: (1) Decision theory and artificial intelligence techniques are intended to solve different components of the planning problem. (2) When considered in the context of planning under uncertainty, nonmonotonic logics do not retain the domain-independent characteristics of classical logical reasoning for planning under certainty. (3) Because certain nonmonotonic programming paradigms (e.g., frame-based inheritance, rule-based planning, protocol-based reminders) are inherently problem-specific, they may be inappropriate to employ in the solution of certain types of planning problems. We discuss how these conclusions affect several current medical informatics research issues, including the construction of “very large” medical knowledge bases.
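
    The claimed correspondence, that categorical defaults encode implicit probabilities and utilities, can be illustrated with a toy therapy choice. A hedged sketch (rule, numbers, and names are hypothetical, not from the paper):

    ```python
    def default_rule(contraindicated: bool) -> str:
        # Nonmonotonic default: "treat, unless a contraindication is known".
        # The conclusion is retracted when new (contraindicating) facts arrive.
        return "withhold" if contraindicated else "treat"

    def expected_utility_rule(p_benefit: float, u_benefit: float, u_harm: float) -> str:
        # Decision theory: treat iff the expected utility of treating is positive.
        eu_treat = p_benefit * u_benefit + (1.0 - p_benefit) * u_harm
        return "treat" if eu_treat > 0.0 else "withhold"

    # The categorical default behaves like the decision-theoretic rule with an
    # implicit belief/desirability threshold baked into the rule itself.
    print(default_rule(False), expected_utility_rule(0.8, 10.0, -5.0))
    ```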

  17. Some practical approaches to a course on paraconsistent logic for engineers

    NASA Astrophysics Data System (ADS)

    Lambert-Torres, Germano; de Moraes, Carlos Henrique Valerio; Coutinho, Maurilio Pereira; Martins, Helga Gonzaga; Borges da Silva, Luiz Eduardo

    2017-11-01

    This paper describes a non-classical logic course primarily intended for graduate students in electrical engineering and energy engineering. The content of this course is based on the vision that it is not enough for a student to indefinitely accumulate knowledge; it is necessary to explore every occasion to update, deepen, and enrich that knowledge, adapting it to a complex world. Therefore, this course is not tied to theoretical formalities and tries at each moment to provide a practical view of non-classical logic. In the real world, inconsistencies are important and cannot be ignored, because contradictory information brings relevant facts, sometimes modifying the entire result of the analysis. As a consequence, non-classical logics such as annotated paraconsistent logic (APL) are well suited to approaching complex real-world situations. In APL, the concepts of unknown, partial, ambiguous, and inconsistent knowledge are handled so as not to trivialise the system under analysis. This course presents theoretical and applicable aspects of APL, which are successfully used in decision-making structures. The course is divided into modules: Basic, 2vAPL, 3vAPL, 4vAPL, and Final Project.
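
    In two-valued APL, each proposition carries a favourable-evidence degree μ and an unfavourable-evidence degree λ, from which a degree of certainty (μ − λ) and a degree of contradiction (μ + λ − 1) are derived. A minimal sketch assuming these standard definitions (the 0.5 thresholds are hypothetical tuning values):

    ```python
    def apl_state(mu: float, lam: float) -> str:
        """Classify a two-valued APL annotation (mu = favourable evidence,
        lam = unfavourable evidence) into an extreme logical state."""
        dc = mu - lam          # degree of certainty
        dct = mu + lam - 1.0   # degree of contradiction
        if dc >= 0.5:
            return "true"
        if dc <= -0.5:
            return "false"
        if dct >= 0.5:
            return "inconsistent"   # strong evidence both for and against
        if dct <= -0.5:
            return "paracomplete"   # too little evidence either way
        return "undefined"

    print(apl_state(0.9, 0.1))  # -> "true"
    print(apl_state(0.9, 0.9))  # contradictory inputs -> "inconsistent"
    print(apl_state(0.1, 0.1))  # scarce evidence -> "paracomplete"
    ```

    The paraconsistent treatment is visible in the second call: contradictory evidence is classified as such rather than trivialising the analysis.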

  18. Surface-confined assemblies and polymers for molecular logic.

    PubMed

    de Ruiter, Graham; van der Boom, Milko E

    2011-08-16

    Stimuli responsive materials are capable of mimicking the operation characteristics of logic gates such as AND, OR, NOR, and even flip-flops. Since the development of molecular sensors and the introduction of the first AND gate in solution by de Silva in 1993, Molecular (Boolean) Logic and Computing (MBLC) has become increasingly popular. In this Account, we present recent research activities that focus on MBLC with electrochromic polymers and metal polypyridyl complexes on a solid support. Metal polypyridyl complexes act as useful sensors to a variety of analytes in solution (i.e., H(2)O, Fe(2+/3+), Cr(6+), NO(+)) and in the gas phase (NO(x) in air). This information transfer, whether the analyte is present, is based on the reversible redox chemistry of the metal complexes, which are stable up to 200 °C in air. The concurrent changes in the optical properties are nondestructive and fast. In such a setup, the input is directly related to the output and, therefore, can be represented by one-input logic gates. These input-output relationships are extendable for mimicking the diverse functions of essential molecular logic gates and circuits within a set of Boolean algebraic operations. Such a molecular approach towards Boolean logic has yielded a series of proof-of-concept devices: logic gates, multiplexers, half-adders, and flip-flop logic circuits. MBLC is a versatile and, potentially, a parallel approach to silicon circuits: assemblies of these molecular gates can perform a wide variety of logic tasks through reconfiguration of their inputs. Although these developments do not require a semiconductor blueprint, similar guidelines such as signal propagation, gate-to-gate communication, propagation delay, and combinatorial and sequential logic will play a critical role in allowing this field to mature. For instance, gate-to-gate communication by chemical wiring of the gates with metal ions as electron carriers results in the integration of stand-alone systems: the output of one gate is used as the input for another gate. Using the same setup, we were able to display both combinatorial and sequential logic. We have demonstrated MBLC by coupling electrochemical inputs with optical readout, which resulted in various logic architectures built on a redox-active, functionalized surface. Electrochemically operated sequential logic systems such as flip-flops, multivalued logic, and multistate memory could enhance computational power without increasing spatial requirements. Applying multivalued digits in data storage could exponentially increase memory capacity. Furthermore, we evaluate the pros and cons of MBLC and identify targets for future research in this Account. © 2011 American Chemical Society

  19. Second Language Acquisition and Universal Grammar.

    ERIC Educational Resources Information Center

    White, Lydia

    1990-01-01

    Discusses the motivation for Universal Grammar (UG), as assumed in the principles and parameters framework of generative grammar (Chomsky, 1981), focusing on the logical problem of first-language acquisition and the potential role of UG in second-language acquisition. Recent experimental research regarding the second-language status of the…

  20. An integrated environmental risk assessment and management framework for enhancing the sustainability of marine protected areas: the Cape d'Aguilar Marine Reserve case study in Hong Kong.

    PubMed

    Xu, Elvis G B; Leung, Kenneth M Y; Morton, Brian; Lee, Joseph H W

    2015-02-01

    Marine protected areas (MPAs), such as marine parks and reserves, contain natural resources of immense value to the environment and mankind. Since MPAs may be situated in close proximity to urbanized areas and influenced by anthropogenic activities (e.g. continuous discharges of contaminated waters), the marine organisms contained in such waters are probably at risk. This study aimed at developing an integrated environmental risk assessment and management (IERAM) framework for enhancing the sustainability of such MPAs. The IERAM framework integrates conventional environmental risk assessment methods with a multi-layer-DPSIR (Driver-Pressure-State-Impact-Response) conceptual approach, which can simplify the complex issues embraced by environmental management strategies and provide logical and concise management information. The IERAM process can generate a useful database, offer timely updates on the status of MPAs, and assist in the prioritization of management options. We use the Cape d'Aguilar Marine Reserve in Hong Kong as an example to illustrate the IERAM framework. A comprehensive set of indicators was selected, aggregated and analyzed using this framework. Effects of management practices and programs were also assessed by comparing the temporal distributions of these indicators over a certain timeframe. Based on the obtained results, we have identified the most significant components for safeguarding the integrity of the marine reserve, and indicated the existing information gaps concerned with the management of the reserve. Apart from assessing the MPA's present condition, a successful implementation of the IERAM framework as advocated here would also facilitate better-informed decision-making and, hence, indirectly enhance the protection and conservation of the MPA's marine biodiversity. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. A fuzzy logic approach to modeling the underground economy in Taiwan

    NASA Astrophysics Data System (ADS)

    Yu, Tiffany Hui-Kuang; Wang, David Han-Min; Chen, Su-Jane

    2006-04-01

    The size of the ‘underground economy’ (UE) is valuable information in the formulation of macroeconomic and fiscal policy. This study applies fuzzy set theory and fuzzy logic to model Taiwan's UE over the period from 1960 to 2003. Two major factors affecting the size of the UE, the effective tax rate and the degree of government regulation, are used. The size of Taiwan's UE is scaled and compared with the estimates of other models. Although our approach yields different estimates, similar patterns and trends are exhibited throughout the period. The advantage of applying fuzzy logic is twofold. First, it avoids the complex calculations of conventional econometric models. Second, fuzzy rules with linguistic terms are easy for humans to understand.
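
    A minimal sketch of the kind of Mamdani-style fuzzy inference described, with the two inputs named in the abstract, hypothetical triangular membership functions, and two illustrative rules (not the authors' calibrated model):

    ```python
    import numpy as np

    def tri(x, a, b, c):
        # Triangular membership function with feet a, c and peak b.
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def ue_size(tax_rate, regulation):
        """Two rules, min for AND, max for aggregation, centroid
        defuzzification on a 0-30 (% of GDP) output universe."""
        y = np.linspace(0.0, 30.0, 301)
        small_ue, large_ue = tri(y, 0, 5, 15), tri(y, 10, 20, 30)
        r1 = min(tri(tax_rate, 0, 10, 20), tri(regulation, 0, 2, 4))   # both low
        r2 = min(tri(tax_rate, 15, 30, 45), tri(regulation, 3, 6, 9))  # both high
        agg = np.maximum(np.minimum(r1, small_ue), np.minimum(r2, large_ue))
        return float((y * agg).sum() / agg.sum())   # centroid

    print(f"estimated UE size: {ue_size(tax_rate=25, regulation=5):.1f}% of GDP")
    ```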

  2. Characterizing the EPODE logic model: unravelling the past and informing the future.

    PubMed

    Van Koperen, T M; Jebb, S A; Summerbell, C D; Visscher, T L S; Romon, M; Borys, J M; Seidell, J C

    2013-02-01

    EPODE ('Ensemble Prévenons l'Obésité De Enfants' or 'Together let's Prevent Childhood Obesity') is a large-scale, centrally coordinated, capacity-building approach for communities to implement effective and sustainable strategies to prevent childhood obesity. Since 2004, EPODE has been implemented in over 500 communities in six countries. Although based on emergent practice and scientific knowledge, EPODE, as many community programs, lacks a logic model depicting key elements of the approach. The objective of this study is to gain insight in the dynamics and key elements of EPODE and to represent these in a schematic logic model. EPODE's process manuals and documents were collected and interviews were held with professionals involved in the planning and delivery of EPODE. Retrieved data were coded, themed and placed in a four-level logic model. With input from international experts, this model was scaled down to a concise logic model covering four critical components: political commitment, public and private partnerships, social marketing and evaluation. The EPODE logic model presented here can be used as a reference for future and follow-up research; to support future implementation of EPODE in communities; as a tool in the engagement of stakeholders; and to guide the construction of a locally tailored evaluation plan. © 2012 The Authors. obesity reviews © 2012 International Association for the Study of Obesity.

  3. ASICs Approach for the Implementation of a Symmetric Triangular Fuzzy Coprocessor and Its Application to Adaptive Filtering

    NASA Technical Reports Server (NTRS)

    Starks, Scott; Abdel-Hafeez, Saleh; Usevitch, Bryan

    1997-01-01

    This paper discusses the implementation of a fuzzy logic system using an ASICs design approach. The approach is based upon combining the inherent advantages of symmetric triangular membership functions and fuzzy singleton sets to obtain a novel structure for fuzzy logic system application development. The resulting structure utilizes a fuzzy static RAM to store the rule-base and the end-points of the triangular membership functions. This provides advantages over other approaches, in which all sampled values of membership functions for all universes must be stored. The fuzzy coprocessor structure implements the fuzzification and defuzzification processes through a two-stage parallel pipeline architecture which is capable of executing complex fuzzy computations in less than 0.55 µs with an accuracy of more than 95%, thus making it suitable for a wide range of applications. Using the approach presented in this paper, a fuzzy logic rule-base can be directly downloaded via a host processor to an on-chip rule-base memory with a size of 64 words. The fuzzy coprocessor's design supports up to 49 rules for seven fuzzy membership functions associated with each of the chip's two input variables. This feature allows designers to create fuzzy logic systems without the need for additional on-board memory. Finally, the paper reports on simulation studies that were conducted for several adaptive filter applications using the least-mean-squares adaptive algorithm for adjusting the knowledge rule-base.
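
    The storage saving highlighted above (keeping only the end-points of each symmetric triangular function rather than a full table of sampled grades) can be sketched as follows (names and values hypothetical):

    ```python
    def membership(x: float, left: float, right: float) -> float:
        """Membership grade of a symmetric triangular function stored only
        by its two end-points, rather than as sampled values."""
        half_width = (right - left) / 2.0
        if half_width <= 0.0:
            return 0.0
        center = (left + right) / 2.0
        return max(0.0, 1.0 - abs(x - center) / half_width)

    # Each of the seven functions per input variable costs two stored numbers.
    print(membership(4.0, left=2.0, right=6.0))  # -> 1.0 at the apex
    print(membership(5.0, left=2.0, right=6.0))  # -> 0.5 halfway down
    ```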

  4. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In this paper a new approach to the formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases assurance that the implemented system meets the user-defined requirements.
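
    What a symbolic model checker such as nuXmv automates can be illustrated by its explicit-state core: exhaustively exploring the states reachable under transition rules and testing a safety property in every one. A hedged Python sketch over a toy state machine (not the paper's rule-based model or the nuXmv input language):

    ```python
    from collections import deque

    def check_invariant(initial, rules, invariant):
        """Breadth-first exploration of all reachable states; returns a
        counterexample state if the safety property can be violated."""
        seen, frontier = {initial}, deque([initial])
        while frontier:
            state = frontier.popleft()
            if not invariant(state):
                return False, state
            for rule in rules:              # rule: state -> iterable of states
                for nxt in rule(state):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
        return True, None

    # Toy UML-like machine with states (mode, count); requirement: count <= 3.
    rules = [lambda s: [("run", s[1] + 1)] if s[0] == "idle" else [],
             lambda s: [("idle", s[1])] if s[0] == "run" and s[1] < 3 else []]
    print(check_invariant(("idle", 0), rules, lambda s: s[1] <= 3))
    ```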

  5. Boolean Approaches in Digital Diagnosis

    DTIC Science & Technology

    1989-12-04

    Automation Conference, pages 64-70, 1983. 16. Barry W. Johnson. Design and Analysis of Fault-Tolerant Digital Systems. Addison-Wesley Publishing... Mitchell. On a new algebra of logic. In C.S. Peirce, editor, Studies in Logic. Little, Brown, Boston, 1883. 23. Roger S. Pressman. Software Engineering

  6. Logical-rule models of classification response times: a synthesis of mental-architecture, random-walk, and decision-bound approaches.

    PubMed

    Fific, Mario; Little, Daniel R; Nosofsky, Robert M

    2010-04-01

    We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli along a set of component dimensions. Those independent decisions are then combined via logical rules to determine the overall categorization response. The time course of the independent decisions is modeled via random-walk processes operating along individual dimensions. Alternative mental architectures are used as mechanisms for combining the independent decisions to implement the logical rules. We derive fundamental qualitative contrasts for distinguishing among the predictions of the rule models and major alternative models of classification RT. We also use the models to predict detailed RT-distribution data associated with individual stimuli in tasks of speeded perceptual classification. PsycINFO Database Record (c) 2010 APA, all rights reserved.
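
    A small simulation conveys the synthesis: each dimension's decision is a random walk to a response threshold, and the mental architecture determines how the per-dimension decision times combine under a conjunctive rule. A sketch under simplified assumptions (drift and threshold values hypothetical):

    ```python
    import random

    def walk_time(drift, threshold=10):
        """Decision time of one dimension's random walk: step +1 with
        probability 0.5 + drift, else -1, until a boundary is hit."""
        x, t = 0, 0
        while abs(x) < threshold:
            x += 1 if random.random() < 0.5 + drift else -1
            t += 1
        return t

    def and_rule_rt(drifts, architecture="parallel"):
        """RT for an AND rule over independent dimension decisions:
        parallel exhaustive -> slowest dimension; serial -> sum of all."""
        times = [walk_time(d) for d in drifts]
        return max(times) if architecture == "parallel" else sum(times)

    random.seed(1)
    print(and_rule_rt([0.2, 0.1], "parallel"), and_rule_rt([0.2, 0.1], "serial"))
    ```

    Contrasts of this kind (max versus sum of component times) are what generate the qualitative RT predictions that distinguish the architectures.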

  7. Fuzzy logic applications to control engineering

    NASA Astrophysics Data System (ADS)

    Langari, Reza

    1993-12-01

    This paper presents the results of a project presently under way at Texas A&M which focuses on the use of fuzzy logic in integrated control of manufacturing systems. The specific problems investigated here include diagnosis of critical tool wear in machining of metals via a neuro-fuzzy algorithm, as well as compensation of friction in mechanical positioning systems via an adaptive fuzzy logic algorithm. The results indicate that fuzzy logic in conjunction with conventional algorithmic based approaches or neural nets can prove useful in dealing with the intricacies of control/monitoring of manufacturing systems and can potentially play an active role in multi-modal integrated control systems of the future.

  8. Multi-objective decision-making under uncertainty: Fuzzy logic methods

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1995-01-01

    Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.

  9. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  10. Semantics-enabled service discovery framework in the SIMDAT pharma grid.

    PubMed

    Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert

    2008-03-01

    We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.

  11. TOWARD INTEGRATION IN THE CONTEXT OF HEALTH TECHNOLOGY ASSESSMENT: THE NEED FOR EVALUATIVE FRAMEWORKS.

    PubMed

    van der Wilt, Gert Jan; Gerhardus, Ansgar; Oortwijn, Wija

    2017-01-01

    A comprehensive health technology assessment (HTA) enables a patient-centered assessment of the effectiveness, economic, ethical, socio-cultural, and legal issues of health technologies that takes context and implementation into account. A question is whether these various pieces of evidence need to be integrated, and if so, how that might be achieved. The objective of our study is to discuss the meaning of integration in the context of HTA and suggest how it may be achieved in a more structured way. An analysis of the concept of integration in the context of HTA and a review of approaches that were adopted in the INTEGRATE-HTA project that may support integration. Current approaches to integration in HTA are mainly methods of commensuration, which are not optimally geared to support public deliberation. In contrast, articulating evaluative frameworks could be an important means of integration which allows for exploring how facts and values can be brought to bear on each other. Integration is not something that only needs to be addressed at the end, but rather throughout an HTA, right from the start. Integration can be conceived as a matter of accounting for the relevance of empirical evidence in view of a commitment to a set of potentially conflicting values. Various elements of the INTEGRATE-HTA project, such as scoping and the development of logic models, can help to achieve integration in HTA.

  12. S3DB core: a framework for RDF generation and management in bioinformatics infrastructures

    PubMed Central

    2010-01-01

    Background Biomedical research is set to greatly benefit from the use of semantic web technologies in the design of computational infrastructure. However, beyond well defined research initiatives, substantial issues of data heterogeneity, source distribution, and privacy currently stand in the way towards the personalization of Medicine. Results A computational framework for bioinformatic infrastructure was designed to deal with the heterogeneous data sources and the sensitive mixture of public and private data that characterizes the biomedical domain. This framework consists of a logical model built with semantic web tools, coupled with a Markov process that propagates user operator states. An accompanying open source prototype was developed to meet a series of applications that range from collaborative multi-institution data acquisition efforts to data analysis applications that need to quickly traverse complex data structures. This report describes the two abstractions underlying the S3DB-based infrastructure, logical and numerical, and discusses its generality beyond the immediate confines of existing implementations. Conclusions The emergence of the "web as a computer" requires a formal model for the different functionalities involved in reading and writing to it. The S3DB core model proposed was found to address the design criteria of biomedical computational infrastructure, such as those supporting large scale multi-investigator research, clinical trials, and molecular epidemiology. PMID:20646315

  13. Comprehensive Fault Tolerance and Science-Optimal Attitude Planning for Spacecraft Applications

    NASA Astrophysics Data System (ADS)

    Nasir, Ali

    Spacecraft operate in a harsh environment, are costly to launch, and experience unavoidable communication delay and bandwidth constraints. These factors motivate the need for effective onboard mission and fault management. This dissertation presents an integrated framework to optimize science goal achievement while identifying and managing encountered faults. Goal-related tasks are defined by pointing the spacecraft instrumentation toward distant targets of scientific interest. The relative value of science data collection is traded with risk of failures to determine an optimal policy for mission execution. Our major innovation in fault detection and reconfiguration is to incorporate fault information obtained from two types of spacecraft models: one based on the dynamics of the spacecraft and the second based on the internal composition of the spacecraft. For fault reconfiguration, we consider possible changes in both dynamics-based control law configuration and the composition-based switching configuration. We formulate our problem as a stochastic sequential decision problem or Markov Decision Process (MDP). To avoid the computational complexity involved in a fully-integrated MDP, we decompose our problem into multiple MDPs. These MDPs include planning MDPs for different fault scenarios, a fault detection MDP based on a logic-based model of spacecraft component and system functionality, an MDP for resolving conflicts between fault information from the logic-based model and the dynamics-based spacecraft models, and the reconfiguration MDP that generates a policy optimized over the relative importance of the mission objectives versus spacecraft safety. Approximate Dynamic Programming (ADP) methods for the decomposition of the planning and fault detection MDPs are applied. To show the performance of the MDP-based frameworks and ADP methods, a suite of spacecraft attitude planning case studies is described. These case studies are used to analyze the content and behavior of computed policies in response to changes in design parameters. A primary case study is built from the Far Ultraviolet Spectroscopic Explorer (FUSE) mission, for which component models and their probabilities of failure are based on realistic mission data. A comparison of our approach with an alternative framework for spacecraft task planning and fault management is presented in the context of the FUSE mission.
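
    Each decomposed MDP is ultimately solved for a policy over states, actions, transition probabilities, and rewards. A generic value-iteration sketch with a hypothetical two-state fault scenario (numbers illustrative, not the dissertation's models):

    ```python
    def value_iteration(states, actions, P, R, gamma=0.95, tol=1e-6):
        """P[s][a]: list of (probability, next_state); R[s][a]: reward that
        trades science value against failure risk."""
        V = {s: 0.0 for s in states}
        while True:
            delta = 0.0
            for s in states:
                best = max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                           for a in actions)
                delta = max(delta, abs(best - V[s]))
                V[s] = best
            if delta < tol:
                return V

    S, A = ["nominal", "safe_mode"], ["observe", "reconfigure"]
    P = {"nominal":   {"observe":     [(0.95, "nominal"), (0.05, "safe_mode")],
                       "reconfigure": [(1.0, "nominal")]},
         "safe_mode": {"observe":     [(1.0, "safe_mode")],
                       "reconfigure": [(0.8, "nominal"), (0.2, "safe_mode")]}}
    R = {"nominal":   {"observe": 1.0, "reconfigure": 0.2},
         "safe_mode": {"observe": 0.0, "reconfigure": -0.1}}
    print(value_iteration(S, A, P, R))
    ```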

  14. Ontology-Based Learner Categorization through Case Based Reasoning and Fuzzy Logic

    ERIC Educational Resources Information Center

    Sarwar, Sohail; García-Castro, Raul; Qayyum, Zia Ul; Safyan, Muhammad; Munir, Rana Faisal

    2017-01-01

    Learner categorization has a pivotal role in making e-learning systems a success. However, learner characteristics exploited at abstract level of granularity by contemporary techniques cannot categorize the learners effectively. In this paper, an architecture of e-learning framework has been presented that exploits the machine learning based…

  15. Cyber Power Potential of the Army’s Reserve Component

    DTIC Science & Technology

    2017-01-01

    and could extend logically to include electric power, water, food, railway, gas pipelines, and so forth. One consideration to note is that in cases... [Table-of-contents fragments from the report: Chapter Four, Army Reserve Component Cyber Inventory Analysis; Background and Analytical Framework; Army Reserve Component Cyber Inventory Analysis, 2015.]

  16. Consensus Knowledge Acquisition

    DTIC Science & Technology

    1989-12-01

    ...explicit the logical structure of their positions. Structured frameworks for analyzing arguments (Toulmin, 1958; Fogelin, 1982)... 358-87, 1987. Stefik M, et al., Beyond the chalkboard, CACM, 30:1, Jan 1987, pp. 32-47. Toulmin, S. The Uses of Argument. Cambridge, England: Cambridge University Press, 1958.

  17. Some Thoughts on John Dewey's Ethics and Education

    ERIC Educational Resources Information Center

    Karafillis, Gregorios

    2012-01-01

    The philosopher and educator, John Dewey, explores the emergence of the terms "ethics" and "education" from a pragmatist's perspective, i.e., within the linguistic and social components' framework, and society's existing cognitive and cultural level. In the current article, we examine the development, logical control and the relation between…

  18. Rejoinder to Guterman, Martin, and Kopp

    ERIC Educational Resources Information Center

    Hansen, James T.

    2012-01-01

    In their reply to the author's keystone article (Hansen, 2012), Guterman, Martin, and Kopp (2012) charge that the author's integrative framework was not sufficiently integrative. They also argue that his proposal results in logical contradictions and the mind-body problem. The author responds by noting that his proposal fully integrates the…

  19. Testing for Factorial Invariance in the Context of Construct Validation

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    2010-01-01

    This article describes the logic and procedures behind testing for factorial invariance across groups in the context of construct validation. The procedures include testing for configural, measurement, and structural invariance in the framework of multiple-group confirmatory factor analysis (CFA). The "forward" (sequential constraint imposition)…

  20. Logic Modeling in Quantitative Systems Pharmacology

    PubMed Central

    Traynard, Pauline; Tobalina, Luis; Eduati, Federica; Calzone, Laurence

    2017-01-01

    Here we present logic modeling as an approach to understand deregulation of signal transduction in disease and to characterize a drug's mode of action. We discuss how to build a logic model from the literature and experimental data and how to analyze the resulting model to obtain insights of relevance for systems pharmacology. Our workflow uses the free tools OmniPath (network reconstruction from the literature), CellNOpt (model fit to experimental data), MaBoSS (model analysis), and Cytoscape (visualization). PMID:28681552
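
    The kind of model described can be sketched as a Boolean network: nodes are signalling species, rules are logic functions of their regulators, and a drug is an input clamped to a value. A toy example (nodes and rules hypothetical, not a model from the paper's tools):

    ```python
    # Synchronous update of a toy Boolean signalling cascade until a fixed point.
    rules = {
        "RAF": lambda s: s["EGF"],
        "MEK": lambda s: s["RAF"] and not s["drug"],  # drug inhibits MEK
        "ERK": lambda s: s["MEK"],
    }

    def simulate(state, max_steps=10):
        for _ in range(max_steps):
            new = dict(state)
            for node, rule in rules.items():
                new[node] = rule(state)
            if new == state:        # fixed point reached
                return state
            state = new
        return state

    off = {"EGF": True, "drug": True, "RAF": False, "MEK": False, "ERK": False}
    on = dict(off, drug=False)
    print(simulate(off)["ERK"], simulate(on)["ERK"])  # drug ON -> ERK stays off
    ```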

  1. Framework for a clinical information system.

    PubMed

    Van de Velde, R

    2000-01-01

    The current status of our work towards the design and implementation of a reference architecture for a Clinical Information System is presented. This architecture has been developed and implemented based on components following a strong underlying conceptual and technological model. Common Object Request Broker Architecture (CORBA) and n-tier technology are used, featuring centralised and departmental clinical information systems as the back-end store for all clinical data. Servers located in the 'middle' tier apply the clinical (business) model and application rules to communicate with so-called 'thin client' workstations. The main characteristics are a focus on modelling and reuse of both data and business logic, reflecting a shift away from data and functional modelling towards object modelling. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for this approach.

  2. A hierarchy of unhealthy food promotion effects: identifying methodological approaches and knowledge gaps.

    PubMed

    Kelly, Bridget; King, Lesley; Chapman, Kathy; Boyland, Emma; Bauman, Adrian E; Baur, Louise A

    2015-04-01

    We assessed the evidence for a conceptual "hierarchy of effects" of marketing, to guide understanding of the relationship between children's exposure to unhealthy food marketing and poor diets and overweight, and drive the research agenda. We reviewed studies assessing the impact of food promotions on children from MEDLINE, Web of Science, ABI Inform, World Health Organization library database, and The Gray Literature Report. We included articles published in English from 2009 to 2013, with earlier articles from a 2009 systematic review. We grouped articles by outcome of exposure and assessed outcomes within a framework depicting a hierarchy of effects of marketing exposures. Evidence supports a logical sequence of effects linking food promotions to individual-level weight outcomes. Future studies should demonstrate the sustained effects of marketing exposure, and exploit variations in exposures to assess differences in outcomes longitudinally.

  3. Analysis of Critical Infrastructure Dependencies and Interdependencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petit, Frederic; Verner, Duane; Brannegan, David

    2015-06-01

    The report begins by defining dependencies and interdependencies and exploring basic concepts of dependencies in order to facilitate a common understanding and consistent analytical approaches. Key concepts covered include: characteristics of dependencies (upstream dependencies, internal dependencies, and downstream dependencies); classes of dependencies (physical, cyber, geographic, and logical); and dimensions of dependencies (operating environment, coupling and response behavior, type of failure, infrastructure characteristics, and state of operations). From there, the report proposes a multi-phase roadmap to support dependency and interdependency assessment activities nationwide, identifying a range of data inputs, analysis activities, and potential products for each phase, as well as key steps needed to progress from one phase to the next. The report concludes by outlining a comprehensive, iterative, and scalable framework for analyzing dependencies and interdependencies that stakeholders can integrate into existing risk and resilience assessment efforts.

  4. The range of options for handling plane angle and solid angle within a system of units

    NASA Astrophysics Data System (ADS)

    Quincey, Paul

    2016-04-01

    The radian and steradian are unusual units within the SI, originally belonging to their own category of ‘supplementary units’, with this status being changed to dimensionless ‘derived units’ in 1995. Recent papers have suggested that angles could be handled in two different ways within the SI, both differing from the present system. The purpose of this paper is to provide a framework for putting such suggestions into context, outlining the range of options that is available, together with the advantages and disadvantages of these options. Although less rigorously logical than some alternatives, the present SI approach is generally supported, but with some changes to the SI brochure to make the position clearer, in particular with regard to the designation of the radian and steradian as derived units.
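
    For context, the definitional relations behind the debate: both quantities are ratios of like dimensions, which is why the present SI can treat them as dimensionless derived units.

    $$\theta = \frac{s}{r} \;\; \text{rad} \qquad \Omega = \frac{A}{r^{2}} \;\; \text{sr} \qquad \Omega_{\text{sphere}} = \frac{4\pi r^{2}}{r^{2}} = 4\pi \;\; \text{sr}$$

    where $s$ is the arc length subtended at radius $r$ and $A$ is the subtended spherical surface area.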

  5. Development of Algorithms for Control of Humidity in Plant Growth Chambers

    NASA Technical Reports Server (NTRS)

    Costello, Thomas A.

    2003-01-01

    Algorithms were developed to control humidity in plant growth chambers used for research on bioregenerative life support at Kennedy Space Center. The algorithms used the computed water vapor pressure (based on measured air temperature and relative humidity) as the process variable, with time-proportioned outputs to operate the humidifier and de-humidifier. Algorithms were based upon proportional-integral-derivative (PID) and fuzzy logic schemes and were implemented using I/O Control software (OPTO-22) to define and download the control logic to an autonomous programmable logic controller (PLC, ultimate ethernet brain and assorted input-output modules, OPTO-22), which performed the monitoring and control logic processing, as well as the physical control of the devices that effected the targeted environment in the chamber. During limited testing, the PLCs successfully implemented the intended control schemes and attained a control resolution for humidity of less than 1%. The algorithms have potential to be used not only with autonomous PLCs but could also be implemented within network-based supervisory control programs. This report documents unique control features that were implemented within the OPTO-22 framework and makes recommendations regarding future uses of the hardware and software for biological research by NASA.
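
    A sketch of the two computational pieces described: deriving the process variable (vapor pressure) from measured temperature and relative humidity, here via a Magnus-type approximation whose constants vary by source, and converting a control output into a time-proportioned relay on-time. Simplified to a proportional term only; the gains and window length are hypothetical:

    ```python
    from math import exp

    def vapor_pressure_kpa(temp_c: float, rh_percent: float) -> float:
        """Actual vapor pressure from air temperature and relative humidity,
        using a Magnus-type saturation formula (constants vary by source)."""
        e_sat = 0.6112 * exp(17.62 * temp_c / (243.12 + temp_c))  # kPa
        return rh_percent / 100.0 * e_sat

    def on_time_s(error_kpa: float, gain: float, window_s: float = 10.0) -> float:
        """Time-proportioned output: seconds the humidifier relay stays ON
        within each fixed window, clamped to 0-100% duty."""
        duty = max(0.0, min(1.0, gain * error_kpa))
        return duty * window_s

    setpoint = vapor_pressure_kpa(23.0, 70.0)   # target vapor pressure
    actual = vapor_pressure_kpa(23.0, 55.0)
    print(on_time_s(setpoint - actual, gain=5.0))
    ```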

  6. LEGO-MM: LEarning structured model by probabilistic loGic Ontology tree for MultiMedia.

    PubMed

    Tang, Jinhui; Chang, Shiyu; Qi, Guo-Jun; Tian, Qi; Rui, Yong; Huang, Thomas S

    2016-09-22

    Recent advances in multimedia ontology have resulted in a number of concept models, e.g., LSCOM and Mediamill 101, which are accessible and public to other researchers. However, most current research effort still focuses on building new concepts from scratch; very little work explores appropriate methods to construct new concepts upon the existing models already in the warehouse. To address this issue, we propose a new framework in this paper, termed LEGO-MM, which can seamlessly integrate both the new target training examples and the existing primitive concept models to infer more complex concept models. LEGO-MM treats the primitive concept models as Lego pieces with which to construct a potentially unlimited vocabulary of new concepts. Specifically, we first formulate the logic operations to be the Lego connectors that combine existing concept models hierarchically in probabilistic logic ontology trees. Then, we incorporate new target training information simultaneously to efficiently disambiguate the underlying logic tree and correct error propagation. Extensive experiments are conducted on a large vehicle-domain data set from ImageNet. The results demonstrate that LEGO-MM has significantly superior performance over existing state-of-the-art methods, which build new concept models from scratch.
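
    The composition idea (primitive concept scores joined by logic connectors in an ontology tree) can be sketched by propagating probabilities through AND/OR nodes under an independence assumption. Structure and scores are hypothetical, and this omits the paper's disambiguation step:

    ```python
    def p_and(probs):
        # Probability that all child concepts hold (independence assumed).
        out = 1.0
        for p in probs:
            out *= p
        return out

    def p_or(probs):
        # Probability that at least one child concept holds.
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out

    # New concept built from existing primitive detectors' scores:
    p_siren, p_truck, p_car = 0.7, 0.6, 0.9
    p_emergency_vehicle = p_and([p_or([p_truck, p_car]), p_siren])
    print(round(p_emergency_vehicle, 3))   # -> 0.672
    ```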

  7. Design of Learning Model of Logic and Algorithms Based on APOS Theory

    ERIC Educational Resources Information Center

    Hartati, Sulis Janu

    2014-01-01

    The research questions were "what are the characteristics of a learning model of logic and algorithms according to APOS theory" and "whether or not this learning model can improve students' learning outcomes". The research was conducted by exploration and a quantitative approach. Exploration was used in constructing theory about the…

  8. Professional Learning: A Fuzzy Logic-Based Modelling Approach

    ERIC Educational Resources Information Center

    Gravani, M. N.; Hadjileontiadou, S. J.; Nikolaidou, G. N.; Hadjileontiadis, L. J.

    2007-01-01

    Studies have suggested that professional learning is influenced by two key parameters, i.e., climate and planning, and their associated variables (mutual respect, collaboration, mutual trust, supportiveness, openness). In this paper, we applied analysis of the relationships between the proposed quantitative, fuzzy logic-based model and a series of…

  9. Graphene-based aptamer logic gates and their application to multiplex detection.

    PubMed

    Wang, Li; Zhu, Jinbo; Han, Lei; Jin, Lihua; Zhu, Chengzhou; Wang, Erkang; Dong, Shaojun

    2012-08-28

    In this work, a GO/aptamer system was constructed to create multiplex logic operations and enable sensing of multiplex targets. 6-Carboxyfluorescein (FAM)-labeled adenosine triphosphate binding aptamer (ABA) and FAM-labeled thrombin binding aptamer (TBA) were first adsorbed onto graphene oxide (GO) to form a GO/aptamer complex, leading to the quenching of the fluorescence of FAM. We demonstrated that the unique GO/aptamer interaction and the specific aptamer-target recognition in the target/GO/aptamer system were programmable and could be utilized to regulate the fluorescence of FAM via OR and INHIBIT logic gates. The fluorescence changed according to different input combinations, and the integration of OR and INHIBIT logic gates provided an interesting approach for logic sensing applications where multiple target molecules were present. High-throughput fluorescence imaging that enabled the simultaneous processing of many samples by using the combinatorial logic gates was realized. The developed logic gates may find applications in further development of DNA circuits and advanced sensors for the identification of multiple targets in complex chemical environments.
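
    The two gate types realized by the target/GO/aptamer system have simple Boolean definitions; a sketch of their truth tables, with the chemical inputs abstracted to bits:

    ```python
    def or_gate(a: int, b: int) -> int:
        # Output 1 if either target is present.
        return a | b

    def inhibit_gate(a: int, b: int) -> int:
        # AND with one inverted input: 1 only when the first input is
        # present and the inhibiting input is absent.
        return a & (1 - b)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "OR:", or_gate(a, b), "INHIBIT:", inhibit_gate(a, b))
    ```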

  10. An integrated modeling approach to support management decisions of coupled groundwater-agricultural systems under multiple uncertainties

    NASA Astrophysics Data System (ADS)

    Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens

    2015-04-01

    The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges, and these complexities are further compounded by multiple actors frequently with conflicting interests and by multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrates physical process-based models, fuzzy logic, expert involvement and stochastic simulation within a general framework. Subsequently, the proposed new approach is applied to a water-scarce coastal arid region water management problem in northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected the aquifer's sustainability, endangering associated socio-economic conditions as well as the traditional social structure. Results from the developed method have provided key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, this approach has made it possible to systematically quantify both probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis applied within the developed tool has shown that the decision makers' risk-averse or risk-taking attitude may yield different rankings of decision alternatives. The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.

  11. [Sociopolitical determinants of international health policies].

    PubMed

    De Vos, Pol; Van der Stuyft, Patrick

    2013-04-01

    For decades, two opposing logics have dominated the health policy debate: a comprehensive health care approach, with the 1978 Alma Ata Declaration as its cornerstone, and a private competition logic emphasizing the role of the private sector. We present this debate and its influence on international health policies in the context of changing global economic and sociopolitical power relations. The neoliberal approach is illustrated by Chile's health sector reform in the 1980s and the Colombian reform since 1993. The comprehensive 'public logic' is shown through the social insurance models in Costa Rica and Brazil, and through the national public health systems in Cuba since 1959 and in Nicaragua during the 1980s. These experiences emphasize that health (care) systems do not naturally gravitate towards greater fairness and efficiency, but require deliberate policy decisions.

  12. Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications

    DTIC Science & Technology

    1992-09-01

    STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a... contrasts with a Fault Tree Analysis (FTA), which utilizes deductive logic in a "top down" approach. In FTA, a system failure is assumed and traced down... Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment. Utilizing a pictorial approach

  13. Classical Limit and Quantum Logic

    NASA Astrophysics Data System (ADS)

    Losada, Marcelo; Fortin, Sebastian; Holik, Federico

    2018-02-01

    The analysis of the classical limit of quantum mechanics usually focuses on the state of the system. The general idea is to explain the disappearance of the interference terms of quantum states appealing to the decoherence process induced by the environment. However, in these approaches it is not explained how the structure of quantum properties becomes classical. In this paper, we consider the classical limit from a different perspective. We consider the set of properties of a quantum system and we study the quantum-to-classical transition of its logical structure. The aim is to open the door to a new study based on dynamical logics, that is, logics that change over time. In particular, we appeal to the notion of hybrid logics to describe semiclassical systems. Moreover, we consider systems with many characteristic decoherence times, whose sublattices of properties become distributive at different times.

  14. Boolean Logic Tree of Label-Free Dual-Signal Electrochemical Aptasensor System for Biosensing, Three-State Logic Computation, and Keypad Lock Security Operation.

    PubMed

    Lu, Jiao Yang; Zhang, Xin Xing; Huang, Wei Tao; Zhu, Qiu Yan; Ding, Xue Zhi; Xia, Li Qiu; Luo, Hong Qun; Li, Nian Bing

    2017-09-19

    The most serious and yet unsolved problems of molecular logic computing consist in how to connect molecular events in complex systems into a usable device with specific functions and how to selectively control branchy logic processes from cascading logic systems. This report demonstrates that a Boolean logic tree is utilized to organize and connect "plug and play" chemical events (DNA, nanomaterials, organic dye, biomolecule, and denaturant) for developing a dual-signal electrochemical evolution aptasensor system with good resettability for amplification detection of thrombin, controllable and selectable three-state logic computation, and keypad lock security operation. The aptasensor system combines the merits of a DNA-functionalized nanoamplification architecture and the simple dual-signal electroactive dye brilliant cresyl blue for sensitive and selective detection of thrombin, with a wide linear response range of 0.02-100 nM and a detection limit of 1.92 pM. By using these aforementioned chemical events as inputs and the differential pulse voltammetry current changes at different voltages as dual outputs, a resettable three-input biomolecular keypad lock based on sequential logic is established. Moreover, the first example of controllable and selectable three-state molecular logic computation with active-high and active-low logic functions can be implemented and allows the output ports to assume a high-impedance (Z) state in addition to the 0 and 1 logic levels, effectively controlling subsequent branchy logic computation processes. Our approach is helpful in developing advanced controllable and selectable logic computing and sensing systems in large-scale integration circuits for application in biomedical engineering, intelligent sensing, and control.
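
    Two of the behaviours described, a keypad lock whose output depends on the order of the inputs and a three-state output that adds a high-impedance level Z to the Boolean 0 and 1, can be sketched abstractly (class and input names hypothetical, standing in for the chemical implementation):

    ```python
    class KeypadLock:
        """Sequential logic: only the correct input sequence unlocks."""
        def __init__(self, secret):
            self.secret, self.pos = secret, 0

        def press(self, key) -> bool:
            if self.pos == len(self.secret):
                return True                       # already unlocked
            self.pos = self.pos + 1 if key == self.secret[self.pos] else 0
            return self.pos == len(self.secret)

    def three_state(value: int, enable: bool):
        # Tri-state output: drives 0/1 only when enabled, else 'Z'
        # (high impedance), blocking downstream logic stages.
        return value if enable else "Z"

    lock = KeypadLock(["DNA", "dye", "denaturant"])
    for key in ["DNA", "dye", "denaturant"]:
        unlocked = lock.press(key)
    print(unlocked, three_state(1, enable=False))   # -> True Z
    ```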

  15. Multi-enzyme logic network architectures for assessing injuries: digital processing of biomarkers.

    PubMed

    Halámek, Jan; Bocharova, Vera; Chinnapareddy, Soujanya; Windmiller, Joshua Ray; Strack, Guinevere; Chuang, Min-Chieh; Zhou, Jian; Santhosh, Padmanabhan; Ramirez, Gabriela V; Arugula, Mary A; Wang, Joseph; Katz, Evgeny

    2010-12-01

    A multi-enzyme biocatalytic cascade processing simultaneously five biomarkers characteristic of traumatic brain injury (TBI) and soft tissue injury (STI) was developed. The system operates as a digital biosensor based on concerted function of 8 Boolean AND logic gates, resulting in the decision about the physiological conditions based on the logic analysis of complex patterns of the biomarkers. The system represents the first example of a multi-step/multi-enzyme biosensor with the built-in logic for the analysis of complex combinations of biochemical inputs. The approach is based on recent advances in enzyme-based biocomputing systems and the present paper demonstrates the potential applicability of biocomputing for developing novel digital biosensor networks.
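
    The digital-biosensor logic (threshold each biomarker to a bit, then AND the bits into condition flags) can be sketched as follows, with an illustrative subset of markers and hypothetical thresholds, not the paper's calibrated values:

    ```python
    def to_bit(concentration: float, threshold: float) -> int:
        return int(concentration > threshold)

    def injury_flags(levels: dict) -> dict:
        ck = to_bit(levels["CK"], 200.0)           # creatine kinase
        ldh = to_bit(levels["LDH"], 250.0)         # lactate dehydrogenase
        enolase = to_bit(levels["enolase"], 12.0)  # neuron-specific enolase
        s100 = to_bit(levels["S100B"], 0.1)
        return {"STI": ck & ldh,        # soft tissue injury: both elevated
                "TBI": enolase & s100}  # traumatic brain injury: both elevated

    print(injury_flags({"CK": 500, "LDH": 300, "enolase": 5, "S100B": 0.05}))
    ```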

  16. N7 logic via patterning using templated DSA: implementation aspects

    NASA Astrophysics Data System (ADS)

    Bekaert, J.; Doise, J.; Gronheid, R.; Ryckaert, J.; Vandenberghe, G.; Fenger, G.; Her, Y. J.; Cao, Y.

    2015-07-01

    In recent years, major advancements have been made in the directed self-assembly (DSA) of block copolymers (BCP). Insertion of DSA for IC fabrication is seriously considered for the 7 nm node. At this node the DSA technology could alleviate costs for multiple patterning and limit the number of masks that would be required per layer. At imec, multiple approaches for inserting DSA into the 7 nm node are considered. One of the most straightforward approaches for implementation would be for via patterning through templated DSA; a grapho-epitaxy flow using cylindrical phase BCP material resulting in contact hole multiplication within a litho-defined pre-pattern. To be implemented for 7 nm node via patterning, not only the appropriate process flow needs to be available, but also DSA-aware mask decomposition is required. In this paper, several aspects of the imec approach for implementing templated DSA will be discussed, including experimental demonstration of density effect mitigation, DSA hole pattern transfer and double DSA patterning, creation of a compact DSA model. Using an actual 7 nm node logic layout, we derive DSA-friendly design rules in a logical way from a lithographer's view point. A concrete assessment is provided on how DSA-friendly design could potentially reduce the number of Via masks for a place-and-routed N7 logic pattern.

  17. Competing and coexisting logics in the changing field of English general medical practice.

    PubMed

    McDonald, Ruth; Cheraghi-Sohi, Sudeh; Bayes, Sara; Morriss, Richard; Kai, Joe

    2013-09-01

    Recent reforms, which change incentive and accountability structures in the English National Health Service, can be conceptualised as trying to shift the dominant institutional logic in the field of primary medical care (general medical practice) away from medical professionalism towards a logic of "population based medicine". This paper draws on interviews with primary care doctors, conducted during 2007-2009 and 2011-2012. It contrasts the approach of active management of populations, in line with recent reforms with responses to patients with medically unexplained symptoms. Our data suggest that rather than one logic becoming dominant, different dimensions of organisational activity reflect different logics. Although some aspects of organisational life are relatively untouched by the reforms, this is not due to 'resistance' on the part of staff within these organisations to attempts to 'control' them. We suggest that a more helpful way of understanding the data is to see these different aspects of work as governed by different institutional logics. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Fundamental physics issues of multilevel logic in developing a parallel processor.

    NASA Astrophysics Data System (ADS)

    Bandyopadhyay, Anirban; Miki, Kazushi

    2007-06-01

    In the last century, On and Off physical switches were equated with the two decisions 0 and 1 to express all information in binary digits and realize it physically as switches connected in a circuit. Beyond a significant increase in memory density, having more possible choices in a given space makes pattern-logic a reality, and manipulating the pattern would allow the logic to be controlled, generating a new kind of processor. Von Neumann's computer is based on sequential logic, processing bits one by one; but because pattern-logic is generated on a surface, viewing the whole pattern at once is truly parallel processing. Following von Neumann's and Shannon's fundamental thermodynamic approaches, we have built a compatible model based on a series of single-molecule-based multibit logic systems of 4-12 bits in a UHV-STM. Multilevel communication and pattern formation on their monolayer are experimentally verified. Furthermore, the developed intelligent monolayer is trained by an artificial neural network. Fundamental weak interactions for building a truly parallel processor are thus explored here both physically and theoretically.

  19. Integrated System Modeling for Nuclear Thermal Propulsion (NTP)

    NASA Technical Reports Server (NTRS)

    Ryan, Stephen W.; Borowski, Stanley K.

    2014-01-01

    Nuclear thermal propulsion (NTP) has long been identified as a key enabling technology for space exploration beyond LEO. From Wernher Von Braun's early concepts for crewed missions to the Moon and Mars to the current Mars Design Reference Architecture (DRA) 5.0 and recent lunar and asteroid mission studies, the high thrust and specific impulse of NTP opens up possibilities such as reusability that are just not feasible with competing approaches. Although NTP technology was proven in the Rover / NERVA projects in the early days of the space program, an integrated spacecraft using NTP has never been developed. Such a spacecraft presents a challenging multidisciplinary systems integration problem. The disciplines that must come together include not only nuclear propulsion and power, but also thermal management, power, structures, orbital dynamics, etc. Some of this integration logic was incorporated into a vehicle sizing code developed at NASA's Glenn Research Center (GRC) in the early 1990s called MOMMA, and later into an Excel-based tool called SIZER. Recently, a team at GRC has developed an open source framework for solving Multidisciplinary Design, Analysis and Optimization (MDAO) problems called OpenMDAO. A modeling approach is presented that builds on previous work in NTP vehicle sizing and mission analysis by making use of the OpenMDAO framework to enable modular and reconfigurable representations of various NTP vehicle configurations and mission scenarios. This approach is currently applied to vehicle sizing, but is extensible to optimization of vehicle and mission designs. The key features of the code will be discussed and examples of NTP transfer vehicles and candidate missions will be presented.
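
    OpenMDAO is the real open-source framework named above; the sketch below shows its component idiom with a hypothetical sizing discipline (a rocket-equation propellant estimate), not the actual GRC vehicle models:

    ```python
    import math
    import openmdao.api as om

    class PropellantMass(om.ExplicitComponent):
        """Hypothetical discipline: ideal-rocket-equation propellant sizing."""
        def setup(self):
            self.add_input("delta_v", units="m/s")   # mission delta-V
            self.add_input("isp", units="s")         # NTP specific impulse
            self.add_input("dry_mass", units="kg")
            self.add_output("prop_mass", units="kg")

        def compute(self, inputs, outputs):
            g0 = 9.80665
            ratio = math.exp(inputs["delta_v"][0] / (g0 * inputs["isp"][0]))
            outputs["prop_mass"] = inputs["dry_mass"][0] * (ratio - 1.0)

    prob = om.Problem()
    prob.model.add_subsystem("sizing", PropellantMass(), promotes=["*"])
    prob.setup()
    prob.set_val("delta_v", 4000.0)
    prob.set_val("isp", 900.0)        # representative NTP specific impulse
    prob.set_val("dry_mass", 40000.0)
    prob.run_model()
    print(prob.get_val("prop_mass"))
    ```

    Swapping in other discipline components (thermal, structures, trajectory) and connecting their variables is the modular, reconfigurable aspect the abstract describes.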

  20. Property Specification Patterns for intelligence building software

    NASA Astrophysics Data System (ADS)

    Chun, Seungsu

    2018-03-01

    In this paper, through research on property specification patterns for the modal mu (μ) calculus, we present a single framework for pattern-based construction of intelligent software. Following Dwyer's classification, property specification patterns are broken down into state (S) and action (A) patterns, each subdivided again into strong (A) and weak (E) variants. By means of this hierarchical pattern classification, the μ-calculus analysis of the patterns was applied to classify the examples used in an actual model checker. As a result, the proposed classification is not only more accurate than existing classification systems, but the specified properties are also easier to create and understand.
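
    For reference, conventional μ-calculus encodings of two classic specification patterns, written with least (μ) and greatest (ν) fixpoints over box and diamond modalities; a standard rendering, not necessarily the paper's exact notation:

    $$\mathrm{AG}\,p \;\equiv\; \nu Z.\; p \wedge [-]Z \qquad \text{(invariance: } p \text{ holds in every reachable state)}$$

    $$\mathrm{AG}(p \rightarrow \mathrm{AF}\,q) \;\equiv\; \nu Z.\; \bigl(p \rightarrow \mu Y.\, (q \vee (\langle - \rangle \top \wedge [-]Y))\bigr) \wedge [-]Z \qquad \text{(response: every } p \text{ is eventually followed by } q\text{)}$$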

  1. Framework for analysis of guaranteed QOS systems

    NASA Astrophysics Data System (ADS)

    Chaudhry, Shailender; Choudhary, Alok

    1997-01-01

    Multimedia data is isochronous in nature and entails managing and delivering high volumes of data. Multiprocessors, with their large processing power, vast memory, and fast interconnects, are an ideal candidate for the implementation of multimedia applications. Initially, multiprocessors were designed to execute scientific programs and thus their architecture was optimized to provide low message latency and efficiently support regular communication patterns. Hence, they have a regular network topology and most use wormhole routing. The design offers the benefits of a simple router, small buffer size, and network latency that is almost independent of path length. Among the various multimedia applications, a video on demand (VOD) server is well suited for implementation using parallel multiprocessors. Logical models for VOD servers are then mapped onto multiprocessors. Our paper provides a framework for calculating bounds on the utilization of system resources with which QoS parameters for each isochronous stream can be guaranteed. The effects of multiprocessor architecture, and the efficiency of various logical models and mappings on particular architectures, can be investigated within our framework. Our framework is based on rigorous proofs and provides tight bounds. The results obtained may be used as the basis for admission control tests. To illustrate the versatility of our framework, we provide bounds on utilization for various logical models applied to mesh-connected architectures for a video on demand server. Our results show that wormhole routing can lead to packets waiting for transmission of other packets that apparently share no common resources, a situation analogous to head-of-the-line blocking. We find that the provision of multiple VCs per link and multiple flit buffers improves utilization (even under guaranteed QoS parameters); this is analogous to parallel iterative matching.

  2. Intervention planning for a digital intervention for self-management of hypertension: a theory-, evidence- and person-based approach.

    PubMed

    Band, Rebecca; Bradbury, Katherine; Morton, Katherine; May, Carl; Michie, Susan; Mair, Frances S; Murray, Elizabeth; McManus, Richard J; Little, Paul; Yardley, Lucy

    2017-02-23

    This paper describes the intervention planning process for the Home and Online Management and Evaluation of Blood Pressure (HOME BP), a digital intervention to promote hypertension self-management. It illustrates how a Person-Based Approach can be integrated with theory- and evidence-based approaches. The Person-Based Approach to intervention development emphasises the use of qualitative research to ensure that the intervention is acceptable, persuasive, engaging and easy to implement. Our intervention planning process comprised two parallel, integrated work streams, which combined theory-, evidence- and person-based elements. The first work stream involved collating evidence from a mixed methods feasibility study, a systematic review and a synthesis of qualitative research. This evidence was analysed to identify likely barriers and facilitators to uptake and implementation as well as design features that should be incorporated in the HOME BP intervention. The second work stream used three complementary approaches to theoretical modelling: developing brief guiding principles for intervention design, causal modelling to map behaviour change techniques in the intervention onto the Behaviour Change Wheel and Normalisation Process Theory frameworks, and developing a logic model. The different elements of our integrated approach to intervention planning yielded important, complementary insights into how to design the intervention to maximise acceptability and ease of implementation by both patients and health professionals. From the primary and secondary evidence, we identified key barriers to overcome (such as patient and health professional concerns about side effects of escalating medication) and effective intervention ingredients (such as providing in-person support for making healthy behaviour changes). Our guiding principles highlighted unique design features that could address these issues (such as online reassurance and procedures for managing concerns). Causal modelling ensured that all relevant behavioural determinants had been addressed, and provided a complete description of the intervention. Our logic model linked the hypothesised mechanisms of action of our intervention to existing psychological theory. Our integrated approach to intervention development, combining theory-, evidence- and person-based approaches, increased the clarity, comprehensiveness and confidence of our theoretical modelling and enabled us to ground our intervention in an in-depth understanding of the barriers and facilitators most relevant to this specific intervention and user population.

  3. A Semantic Transformation Methodology for the Secondary Use of Observational Healthcare Data in Postmarketing Safety Studies.

    PubMed

    Pacaci, Anil; Gonul, Suat; Sinaci, A Anil; Yuksel, Mustafa; Laleci Erturkmen, Gokce B

    2018-01-01

    Background: Utilization of available observational healthcare datasets is key to complementing and strengthening postmarketing safety studies. Use of common data models (CDM) is the predominant approach to enabling large-scale systematic analyses across disparate data models and vocabularies. Current CDM transformation practices depend on proprietarily developed Extract-Transform-Load (ETL) procedures, which require knowledge of both the semantics and the technical characteristics of the source datasets and the target CDM. Purpose: In this study, our aim is to develop a modular but coordinated transformation approach that separates the semantic and technical steps of the transformation process, which have no strict separation in traditional ETL approaches. Such an approach discretizes the operations into extracting data from source electronic health record systems, aligning the source and target models at the semantic level, and populating the target common data repositories. Approach: To separate the activities required to transform heterogeneous data sources to a target CDM, we introduce a semantic transformation approach composed of three steps: (1) transformation of source datasets to Resource Description Framework (RDF) format, (2) application of semantic conversion rules to obtain the data as instances of an ontological model of the target CDM, and (3) population of repositories that comply with the specifications of the CDM by processing the RDF instances from step 2. The proposed approach has been implemented in real healthcare settings, where the Observational Medical Outcomes Partnership (OMOP) CDM was chosen as the common data model, and a comprehensive comparative analysis between the native and transformed data has been conducted. Results: Health records of ~1 million patients have been successfully transformed from the source database to an OMOP CDM-based database. Descriptive statistics obtained from the source and target databases present analogous and consistent results. Discussion and Conclusion: Our method goes beyond traditional ETL approaches by being more declarative and rigorous: declarative, because the use of RDF-based mapping rules makes each mapping transparent and understandable to humans while retaining logic-based computability; rigorous, because the mappings are based on computer-readable semantics amenable to validation through logic-based inference methods.
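
    A minimal sketch of the three-step flow in Python with rdflib (6+), using hypothetical namespaces and a single made-up conversion rule; the real OMOP ontology and rule set are far richer.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF

    # Hypothetical namespaces for the source EHR export and the target OMOP model
    SRC = Namespace("http://example.org/ehr/")
    OMOP = Namespace("http://example.org/omop/")

    # Step 1: lift a source record into RDF
    source = Graph()
    patient = SRC["patient/42"]
    source.add((patient, RDF.type, SRC.Patient))
    source.add((patient, SRC.birthYear, Literal(1980)))

    # Step 2: a semantic conversion rule expressed as SPARQL CONSTRUCT
    rule = """
    PREFIX src:  <http://example.org/ehr/>
    PREFIX omop: <http://example.org/omop/>
    CONSTRUCT { ?p a omop:Person ; omop:year_of_birth ?y . }
    WHERE     { ?p a src:Patient ; src:birthYear ?y . }
    """

    # Step 3: the constructed instances would populate the OMOP repository
    target = Graph()
    for triple in source.query(rule):
        target.add(triple)
    print(target.serialize(format="turtle"))
    ```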

  4. A Grey Fuzzy Logic Approach for Cotton Fibre Selection

    NASA Astrophysics Data System (ADS)

    Chakraborty, Shankar; Das, Partha Protim; Kumar, Vidyapati

    2017-06-01

    It is well known that the quality of ring-spun yarn predominantly depends on various physical properties of the cotton fibre. Any variation in these fibre properties may affect the strength and unevenness of the final yarn. Thus, to achieve the desired yarn quality and characteristics, it becomes imperative for spinning industry personnel to identify the most suitable cotton fibre from a set of feasible alternatives in the presence of several conflicting properties/attributes. This cotton fibre selection process can be modelled as a Multi-Criteria Decision Making (MCDM) problem. In this paper, a grey fuzzy logic-based approach is proposed for selecting the most apposite cotton fibre from 17 alternatives evaluated on six important fibre properties. It is observed that the preference order of the top-ranked cotton fibres derived using the grey fuzzy logic approach closely matches that obtained by past researchers, which demonstrates the potential of this method for solving a variety of MCDM problems in the textile industry.
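
    The paper couples grey relational analysis with fuzzy logic; the sketch below covers only the grey relational grade under equal criteria weights, with made-up fibre data, to show the shape of the ranking computation.

    ```python
    import numpy as np

    def grey_relational_grade(X, benefit, zeta=0.5):
        """Grade alternatives (rows) over criteria (columns); `benefit` marks
        larger-the-better criteria, the rest are treated as smaller-the-better."""
        lo, hi = X.min(axis=0), X.max(axis=0)
        norm = np.where(benefit, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
        delta = np.abs(1.0 - norm)          # deviation from the ideal sequence
        coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        return coeff.mean(axis=1)           # equal weights across criteria

    # Four hypothetical fibres scored on strength (benefit) and unevenness (cost)
    X = np.array([[30.0, 12.0], [28.0, 10.5], [32.0, 13.5], [29.0, 11.0]])
    grades = grey_relational_grade(X, benefit=np.array([True, False]))
    print(grades.argsort()[::-1])  # fibre indices ranked best first
    ```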

  5. A new approach to telemetry data processing. Ph.D. Thesis - Maryland Univ.

    NASA Technical Reports Server (NTRS)

    Broglio, C. J.

    1973-01-01

    An approach for a preprocessing system for telemetry data processing was developed. The philosophy of the approach is the development of a preprocessing system to interface with the main processor and relieve it of the burden of stripping information from a telemetry data stream. To accomplish this task, a telemetry preprocessing language was developed. Also, a hardware device for implementing the operation of this language was designed using a cellular logic module concept. In the development of the hardware device and the cellular logic module, a distributed form of control was implemented. This is accomplished by a technique of one-to-one intermodule communication and a set of privileged communication operations. By transferring the control state from module to module, the control function is dispersed through the system. A compiler for translating the preprocessing language statements into an operations table for the hardware device was also developed. Finally, to complete the system design and verify it, a simulator for the cellular logic module was written using the APL/360 system.

  6. A guide to phylogenetic metrics for conservation, community ecology and macroecology.

    PubMed

    Tucker, Caroline M; Cadotte, Marc W; Carvalho, Silvia B; Davies, T Jonathan; Ferrier, Simon; Fritz, Susanne A; Grenyer, Rich; Helmus, Matthew R; Jin, Lanna S; Mooers, Arne O; Pavoine, Sandrine; Purschke, Oliver; Redding, David W; Rosauer, Dan F; Winter, Marten; Mazel, Florent

    2017-05-01

    The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub-disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub-disciplines hampers potential meta-analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo-diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information. Generally, questions about phylogenetic relationships within or between assemblages tend to ask three types of question: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo-diversity metrics based on their mathematical form within these three dimensions and identify 'anchor' representatives: for α-diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and using simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices. © 2016 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.
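
    As a small illustration of two of the anchor metrics named above, the sketch below computes MPD and VPD from a pairwise phylogenetic distance matrix and a presence/absence vector; the toy distances are made up.

    ```python
    import numpy as np

    def mpd_vpd(dist, present):
        """Mean (MPD) and variance (VPD) of pairwise phylogenetic distances
        among the species present in one assemblage."""
        idx = np.flatnonzero(present)
        pairs = dist[np.ix_(idx, idx)][np.triu_indices(len(idx), k=1)]
        return pairs.mean(), pairs.var()

    # Toy 4-species distance matrix; the assemblage contains species 0, 1 and 3
    d = np.array([[0, 2, 6, 6], [2, 0, 6, 6], [6, 6, 0, 2], [6, 6, 2, 0]], float)
    print(mpd_vpd(d, present=np.array([1, 1, 0, 1], bool)))  # (~4.67, ~3.56)
    ```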

  7. Pragmatic service development and customisation with the CEDA OGC Web Services framework

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Stephens, Ag; Lowe, Dominic

    2010-05-01

    The CEDA OGC Web Services framework (COWS) emphasises rapid service development by providing a lightweight layer of OGC web service logic on top of Pylons, a mature web application framework for the Python language. This approach gives developers a flexible web service development environment without compromising access to the full range of web application tools and patterns: Model-View-Controller paradigm, XML templating, Object-Relational-Mapper integration and authentication/authorization. We have found this approach useful for exploring evolving standards and implementing protocol extensions to meet the requirements of operational deployments. This paper outlines how COWS is being used to implement customised WMS, WCS, WFS and WPS services in a variety of web applications from experimental prototypes to load-balanced cluster deployments serving 10-100 simultaneous users. In particular we will cover 1) The use of Climate Science Modeling Language (CSML) in complex-feature aware WMS, WCS and WFS services, 2) Extending WMS to support applications with features specific to earth system science and 3) A cluster-enabled Web Processing Service (WPS) supporting asynchronous data processing. The COWS WPS underpins all backend services in the UK Climate Projections User Interface where users can extract, plot and further process outputs from a multi-dimensional probabilistic climate model dataset. The COWS WPS supports cluster job execution, result caching, execution time estimation and user management. The COWS WMS and WCS components drive the project-specific NCEO and QESDI portals developed by the British Atmospheric Data Centre. These portals use CSML as a backend description format and implement features such as multiple WMS layer dimensions and climatology axes that are beyond the scope of general purpose GIS tools and yet vital for atmospheric science applications.
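
    COWS itself layers this logic over Pylons, but the general flavour of a thin OGC key-value-pair dispatch layer on a Python web stack can be sketched with a bare WSGI handler. The parameter handling below is a hypothetical, minimal rendering of WMS GetMap dispatch, not the COWS API.

    ```python
    from urllib.parse import parse_qs
    from wsgiref.simple_server import make_server

    def ows_app(environ, start_response):
        # Parse OGC-style KVP parameters (keys are case-insensitive)
        qs = parse_qs(environ.get('QUERY_STRING', ''))
        params = {k.upper(): v[0] for k, v in qs.items()}
        if params.get('SERVICE') == 'WMS' and params.get('REQUEST') == 'GetMap':
            layers, bbox = params.get('LAYERS', ''), params.get('BBOX', '')
            start_response('200 OK', [('Content-Type', 'text/plain')])
            return [f'would render layers [{layers}] over bbox [{bbox}]'.encode()]
        start_response('400 Bad Request', [('Content-Type', 'text/plain')])
        return [b'OGC service exception: unsupported request']

    # make_server('localhost', 8080, ows_app).serve_forever()
    ```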

  8. Tailoring of the Tell-us Card communication tool for nurses to increase patient participation using Intervention Mapping.

    PubMed

    van Belle, Elise; Zwakhalen, Sandra M G; Caris, Josien; Van Hecke, Ann; Huisman-de Waal, Getty; Heinen, Maud

    2018-02-01

    To describe the tailoring of the Tell-us Card intervention for enhanced patient participation to the Dutch hospital setting, using Intervention Mapping as a systematic approach. Even though patient participation is essential in any patient-to-nurse encounter, care plans often fail to take patients' preferences into account. The Tell-us Card intervention seems promising, but needs to be tailored and tested before implementation in a different setting or on a large scale. Description of the Intervention Mapping framework to systematically tailor the Tell-us Card intervention to the Dutch hospital setting. Intervention Mapping consists of: (i) identification of the problem through needs assessment and determination of fit, based on patient and nurse interviews and focus group interviews; (ii) developing a logic model of change and matrices, based on literature and interviews; (iii) selection of theory-based methods and practical applications; (iv) producing programme components and piloting; (v) planning for adoption, implementation and sustainability; and (vi) preparing for programme evaluation. Knowledge, attitude, outcome expectations, self-efficacy and skills were identified as the main determinants influencing the use of the Tell-us Card. Linking the identified determinants and performance objectives with behaviour change techniques from the literature resulted in a well-defined and tailored intervention and evaluation plan. The Tell-us Card intervention was adapted to fit the Dutch hospital setting and prepared for evaluation. The Medical Research Council framework was followed, and the Intervention Mapping approach was used to prepare a pilot study to confirm feasibility and relevant outcomes. This article shows how Intervention Mapping is applied within the Medical Research Council framework to adapt the Tell-us Card intervention, which could serve as a guide for the tailoring of similar interventions. © 2017 John Wiley & Sons Ltd.

  9. A guide to phylogenetic metrics for conservation, community ecology and macroecology

    PubMed Central

    Cadotte, Marc W.; Carvalho, Silvia B.; Davies, T. Jonathan; Ferrier, Simon; Fritz, Susanne A.; Grenyer, Rich; Helmus, Matthew R.; Jin, Lanna S.; Mooers, Arne O.; Pavoine, Sandrine; Purschke, Oliver; Redding, David W.; Rosauer, Dan F.; Winter, Marten; Mazel, Florent

    2016-01-01

    The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub-disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub-disciplines hampers potential meta-analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo-diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information. Generally, questions about phylogenetic relationships within or between assemblages tend to ask three types of question: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo-diversity metrics based on their mathematical form within these three dimensions and identify 'anchor' representatives: for α-diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and using simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices. PMID:26785932

  10. Systematic development and implementation of interventions to OPtimise Health Literacy and Access (Ophelia).

    PubMed

    Beauchamp, Alison; Batterham, Roy W; Dodson, Sarity; Astbury, Brad; Elsworth, Gerald R; McPhee, Crystal; Jacobson, Jeanine; Buchbinder, Rachelle; Osborne, Richard H

    2017-03-03

    The need for healthcare strengthening to enhance equity is critical, requiring systematic approaches that focus on those experiencing lesser access and outcomes. This project developed and tested the Ophelia (OPtimising HEalth LIteracy and Access) approach for co-design of interventions to improve health literacy and equity of access. Eight principles guided this development: Outcomes focused, Equity driven, Needs diagnosis, Co-design, Driven by local wisdom, Sustainable, Responsive and Systematically applied. We report the application of the Ophelia process, where proof-of-concept was defined as successful application of the principles. Nine sites were briefed on the aims of the project around health literacy, co-design and quality improvement. The sites were rural/metropolitan, small/large hospitals, community health centres or municipalities. Each site identified its own priorities for improvement; collected health literacy data using the Health Literacy Questionnaire (HLQ) within the identified priority groups; engaged staff in co-design workshops to generate ideas for improvement; developed program-logic models; and implemented their projects using Plan-Do-Study-Act (PDSA) cycles. Evaluation included assessment of impacts on organisations, practitioners and service users, and of whether the principles were applied. Sites undertook co-design workshops involving discussion of service user needs informed by HLQ (n = 813) and interview data. Sites generated between 21 and 78 intervention ideas and then planned their selected interventions through program-logic models. Sites successfully implemented interventions and refined them progressively with PDSA cycles. Interventions generally involved one of four pathways: development of clinician skills and resources for health literacy, engagement of community volunteers to disseminate health promotion messages, direct impact on consumers' health literacy, and redesign of existing services. Evidence of application of the principles was found in all sites. The Ophelia approach guided identification of health literacy issues at each participating site and the development and implementation of locally appropriate solutions. The eight principles provided a framework that allowed flexible application of the Ophelia approach and generation of a diverse set of interventions. Changes were observed at organisational, staff and community member levels. The Ophelia approach can be used to generate health service improvements that enhance health outcomes and address inequity of access to healthcare.

  11. Engineered modular biomaterial logic gates for environmentally triggered therapeutic delivery

    NASA Astrophysics Data System (ADS)

    Badeau, Barry A.; Comerford, Michael P.; Arakawa, Christopher K.; Shadish, Jared A.; Deforest, Cole A.

    2018-03-01

    The successful transport of drug- and cell-based therapeutics to diseased sites represents a major barrier in the development of clinical therapies. Targeted delivery can be mediated through degradable biomaterial vehicles that utilize disease biomarkers to trigger payload release. Here, we report a modular chemical framework for imparting hydrogels with precise degradative responsiveness by using multiple environmental cues to trigger reactions that operate user-programmable Boolean logic. By specifying the molecular architecture and connectivity of orthogonal stimuli-labile moieties within material cross-linkers, we show selective control over gel dissolution and therapeutic delivery. To illustrate the versatility of this methodology, we synthesized 17 distinct stimuli-responsive materials that collectively yielded all possible YES/OR/AND logic outputs from input combinations involving enzyme, reductant and light. Using these hydrogels we demonstrate the first sequential and environmentally stimulated release of multiple cell lines in well-defined combinations from a material. We expect these platforms will find utility in several diverse fields including drug delivery, diagnostics and regenerative medicine.
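
    The material logic maps directly onto a software truth table: stimuli-labile units placed in series must all cleave before the crosslink fails (AND), while parallel strands fail on any single cleavage (OR). The sketch below is a toy model of that logic under two inputs, not the authors' chemistry.

    ```python
    from itertools import product

    def and_gate(*unit_cleaved):
        # Series arrangement: every labile unit must cleave to sever the strand
        return all(unit_cleaved)

    def or_gate(*unit_cleaved):
        # Parallel arrangement: any single cleavage severs the crosslink
        return any(unit_cleaved)

    print('enzyme  reductant  AND    OR')
    for enzyme, reductant in product([False, True], repeat=2):
        print(f'{enzyme!s:7} {reductant!s:10} '
              f'{and_gate(enzyme, reductant)!s:6} {or_gate(enzyme, reductant)}')
    ```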

  12. Software Process Assurance for Complex Electronics

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

    Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs, such as incorrect design, faulty logic and unexpected interactions within the logic, is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized, with slight modifications, in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized software assurance and engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that was more easily maintained, consistent and configurable based on the device used.

  13. Engineered modular biomaterial logic gates for environmentally triggered therapeutic delivery.

    PubMed

    Badeau, Barry A; Comerford, Michael P; Arakawa, Christopher K; Shadish, Jared A; DeForest, Cole A

    2018-03-01

    The successful transport of drug- and cell-based therapeutics to diseased sites represents a major barrier in the development of clinical therapies. Targeted delivery can be mediated through degradable biomaterial vehicles that utilize disease biomarkers to trigger payload release. Here, we report a modular chemical framework for imparting hydrogels with precise degradative responsiveness by using multiple environmental cues to trigger reactions that operate user-programmable Boolean logic. By specifying the molecular architecture and connectivity of orthogonal stimuli-labile moieties within material cross-linkers, we show selective control over gel dissolution and therapeutic delivery. To illustrate the versatility of this methodology, we synthesized 17 distinct stimuli-responsive materials that collectively yielded all possible YES/OR/AND logic outputs from input combinations involving enzyme, reductant and light. Using these hydrogels we demonstrate the first sequential and environmentally stimulated release of multiple cell lines in well-defined combinations from a material. We expect these platforms will find utility in several diverse fields including drug delivery, diagnostics and regenerative medicine.

  14. The logical foundations of forensic science: towards reliable knowledge

    PubMed Central

    Evett, Ian

    2015-01-01

    The generation of observations is a technical process and the advances that have been made in forensic science techniques over the last 50 years have been staggering. But science is about reasoning—about making sense from observations. For the forensic scientist, this is the challenge of interpreting a pattern of observations within the context of a legal trial. Here too, there have been major advances over recent years and there is a broad consensus among serious thinkers, both scientific and legal, that the logical framework is furnished by Bayesian inference (Aitken et al. Fundamentals of Probability and Statistical Evidence in Criminal Proceedings). This paper shows how the paradigm has matured, centred on the notion of the balanced scientist. Progress through the courts has not been always smooth and difficulties arising from recent judgments are discussed. Nevertheless, the future holds exciting prospects, in particular the opportunities for managing and calibrating the knowledge of the forensic scientists who assign the probabilities that are at the foundation of logical inference in the courtroom. PMID:26101288
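
    The inferential core of that framework is the odds form of Bayes' theorem: posterior odds equal the likelihood ratio times the prior odds, where the scientist assigns the likelihood ratio and the court owns the priors. A minimal sketch with made-up numbers:

    ```python
    def update_odds(prior_odds, likelihood_ratio):
        """Bayes' theorem in odds form: posterior odds = LR x prior odds."""
        return likelihood_ratio * prior_odds

    # Hypothetical case: the evidence is 100 times more probable under the
    # prosecution proposition (Hp) than under the defence proposition (Hd)
    prior_odds = 0.01 / 0.99              # court's prior odds on Hp vs Hd
    post_odds = update_odds(prior_odds, 100.0)
    print(post_odds / (1.0 + post_odds))  # posterior P(Hp) of about 0.50
    ```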

  15. On Childhood and the Logic of Difference: Some Empirical Examples

    ERIC Educational Resources Information Center

    Dahlbeck, Johan

    2012-01-01

    This article argues that universal documents on children's rights can provide illustrative examples as to how childhood is identified as a unity using difference as an instrument. Using Gilles Deleuze's theorising on difference and sameness as a framework, the article seeks to relate the children's rights project with a critique of representation.…

  16. Understanding Understanding Mathematics. Artificial Intelligence Memo No. 488.

    ERIC Educational Resources Information Center

    Michener, Edwina Rissland

    This document is concerned with the important extra-logical knowledge that is often outside of traditional discussions in mathematics, and looks at some of the ingredients and processes involved in the understanding of mathematics. The goal is to develop a conceptual framework in which to talk about mathematical knowledge and to understand the…

  17. Balancing Detailed Comprehensiveness with a Big Vision: A Suggested Conceptual Framework for Teacher Education Courses

    ERIC Educational Resources Information Center

    Ormond, Christine A.

    2012-01-01

    Current Australian teacher accreditation processes are impacting significantly on the expectations of teacher education courses, particularly in relation to graduate resilience, flexibility and capability. This paper uses a logical conceptual format to explain how writers at a Western Australian university prepared a new Secondary Degree course,…

  18. Identifying and Prioritizing Critical Hardwood Resources

    Treesearch

    Sam C. Doak; Sharon Johnson; Marlyce Myers

    1991-01-01

    A logical framework is required to provide a focus for the implementation of a variety of landowner incentive techniques in accordance with existing goals to protect and enhance hardwood resources. A system is presented for identifying and prioritizing critical hardwood resources for site specific conservation purposes. Flexibility is built into this system so that...

  19. Sense, Nonsense, and Violence: Levinas and the Internal Logic of School Shootings

    ERIC Educational Resources Information Center

    Keehn, Gabriel; Boyles, Deron

    2015-01-01

    Utilizing a broadly Levinasian framework, specifically the interplay among his ideas of possession, violence, and negation, Gabriel Keehn and Deron Boyles illustrate how the relatively recent sharp turn toward the hypercorporatized school and the concomitant transition of the student from simple (potential) customer to a type of hybrid…

  20. Muslim Schools in Britain: Challenging Mobilisations or Logical Developments?

    ERIC Educational Resources Information Center

    Meer, Nasar

    2007-01-01

    There are currently over 100 independent and seven state-funded Muslim schools in Britain yet their place within the British education system remains a hotly debated issue. This article argues that Muslim mobilisations for the institutional and financial incorporation of more Muslim schools into the national framework are best understood as an…
