Sample records for logic flowgraph methodology

  1. Development of a methodology for assessing the safety of embedded software systems

    NASA Technical Reports Server (NTRS)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM), an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety, is discussed. DFM extends the Logic Flowgraph Methodology to incorporate state transition models. System models, which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules, are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees, which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.
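
    To make the backtracking idea concrete, the sketch below derives, for a tiny invented two-variable causal model (not taken from the paper), the combinations of prior states that reach a given top event across discretized time steps, the raw material of a timed fault tree.

    ```python
    # Minimal DFM-style backtracking sketch. The (pressure, valve) model,
    # its state sets, and the transition rule are hypothetical.
    from itertools import product

    STATES = {"pressure": ["low", "ok", "high"], "valve": ["open", "closed"]}

    def next_pressure(pressure, valve):
        """Toy causal model: a closed valve drives pressure up one level."""
        order = STATES["pressure"]
        i = order.index(pressure)
        return order[min(i + 1, 2)] if valve == "closed" else order[max(i - 1, 0)]

    def backtrack(top_event, steps):
        """All prior-state combinations reaching `top_event` in `steps` steps."""
        if steps == 0:
            return [[top_event]]
        causes = [(p, v) for p, v in product(STATES["pressure"], STATES["valve"])
                  if next_pressure(p, v) == top_event]
        paths = []
        for p, v in causes:
            for tail in backtrack(p, steps - 1):
                paths.append(tail + [f"pressure={p}, valve={v} -> {top_event}"])
        return paths

    for path in backtrack("high", 2):   # static trees chained over two time points
        print(path)
    ```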

  2. Logic flowgraph methodology - A tool for modeling embedded systems

    NASA Technical Reports Server (NTRS)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  3. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches, which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time-dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time-consuming. A possible solution is suggested: use a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
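
    The reverse-solving suggestion can be illustrated with a short sketch: treat a subroutine as a function y = f(x) and apply Newton-Raphson to find the input that yields a target output. The cubic f below is a hypothetical stand-in for a control-law subroutine, not code from the Titan 2 system.

    ```python
    # Hedged sketch: run a subroutine y = f(x) "in reverse" with Newton-Raphson.
    def f(x):
        return x**3 + 2.0 * x - 1.0   # hypothetical subroutine output

    def f_prime(x, h=1e-6):
        return (f(x + h) - f(x - h)) / (2.0 * h)   # central-difference derivative

    def solve_reverse(y_target, x0=0.0, tol=1e-10, max_iter=50):
        """Newton-Raphson on g(x) = f(x) - y_target; converges in a few steps."""
        x = x0
        for _ in range(max_iter):
            g = f(x) - y_target
            if abs(g) < tol:
                return x
            x -= g / f_prime(x)
        raise RuntimeError("no convergence")

    x = solve_reverse(5.0)
    print(x, f(x))   # f(x) should be ~5.0
    ```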

  4. A flowgraph model for bladder carcinoma

    PubMed Central

    2014-01-01

    Background: Superficial bladder cancer has been the subject of numerous studies for many years, but the evolution of the disease remains poorly understood. After the tumor has been surgically removed, it may reappear at a similar level of malignancy or progress to a higher level. The process may be reasonably modeled by means of a Markov process, but this approach is insufficient for modeling the evolution of the disease more completely. The semi-Markov framework allows a more realistic approach, but its calculations frequently become intractable. In this context, flowgraph models provide an efficient approach to successfully managing the evolution of superficial bladder carcinoma. Our aim is to test this methodology in this particular case. Results: We have built a successful model for a simple but representative case. Conclusion: The flowgraph approach is suitable for modeling of superficial bladder cancer. PMID:25080066
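
    For a sense of how a flowgraph model computes, the sketch below composes branch transmittances (transition probability times the moment generating function of the waiting time) for a three-state toy version of a recurrence/progression process; the probabilities and exponential rates are invented, not estimates from the paper.

    ```python
    # Illustrative flowgraph calculation (toy numbers): states 0 = post-surgery,
    # 1 = recurrence, 2 = progression. Each branch carries p_ij * M_ij(s).
    import sympy as sp

    s = sp.symbols("s")

    def mgf(rate):
        return rate / (rate - s)   # MGF of an Exponential(rate) waiting time

    t01 = sp.Rational(7, 10) * mgf(sp.Rational(1, 2))   # relapse at same grade
    t02 = sp.Rational(3, 10) * mgf(sp.Rational(1, 5))   # direct progression
    t12 = mgf(sp.Rational(1, 4))                        # recurrence -> progression

    # Flow-graph reduction: two parallel paths from 0 to 2 (series multiply,
    # parallel add); no feedback loops in this toy graph.
    M02 = t02 + t01 * t12
    mean_time = sp.diff(M02, s).subs(s, 0)   # E[T] = M'(0) for a proper MGF
    print(sp.simplify(mean_time))            # expected time to progression
    ```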

  5. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  6. Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yau, M.; Motamed, M.; Guarro, S.

    2006-07-01

    Current Probabilistic Risk Assessment (PRA) methodology is well established in analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and for accounting for the digital system contribution to the overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on earlier ASCA work with the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)

  7. Flow-graph approach for optical analysis of planar structures.

    PubMed

    Minkov, D

    1994-11-20

    The flow-graph approach (FGA) is applied to optical analysis of isotropic stratified planar structures (ISPS's) at inclined light incidence. Conditions for the presence of coherent and noncoherent light interaction within ISPS's are determined. Examples of the use of FGA for calculation of the transmission and the reflection of two-layer ISPS's for different types of light interaction are given. The advantages of the use of FGA for optical analysis of ISPS's are discussed.
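
    For orientation, the coherent single-layer case at normal incidence reduces to the textbook Airy summation; the sketch below computes reflectance that way (the paper's flow-graph treatment covers general multilayers, inclined incidence, and noncoherent light interaction).

    ```python
    # Standard single-film coherent reflectance (textbook result, shown for
    # orientation; not the paper's general flow-graph formulation).
    import cmath, math

    def reflectance(n0, n1, n2, d_nm, wavelength_nm):
        """Normal-incidence reflectance of one film (index n1, thickness d) on a substrate."""
        r01 = (n0 - n1) / (n0 + n1)          # Fresnel amplitude coefficients
        r12 = (n1 - n2) / (n1 + n2)
        beta = 2 * math.pi * n1 * d_nm / wavelength_nm   # phase thickness
        r = (r01 + r12 * cmath.exp(-2j * beta)) / (1 + r01 * r12 * cmath.exp(-2j * beta))
        return abs(r) ** 2

    print(reflectance(1.0, 1.38, 1.52, 100.0, 550.0))  # MgF2-like coating on glass
    ```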

  8. DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS

    DTIC Science & Technology

    2017-10-01

    DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS. University of Southern California, October 2017, final technical report; contract number FA8750-15-C-0203. The objective of this project was to investigate the state-of-the-art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ

  9. Application of Fuzzy Logic to Matrix FMECA

    NASA Astrophysics Data System (ADS)

    Shankar, N. Ravi; Prabhu, B. S.

    2001-04-01

    A methodology combining the benefits of Fuzzy Logic and Matrix FMEA is presented in this paper. The presented methodology extends risk prioritization beyond the conventional Risk Priority Number (RPN) method. Fuzzy logic is used to calculate the criticality rank. The matrix approach is also improved further, to develop a pictorial representation that retains all relevant qualitative and quantitative information about the relationships among the FMEA elements. The methodology is demonstrated by application to an illustrative example.

  10. Using logic models to enhance the methodological quality of primary health-care interventions: guidance from an intervention to promote nutrition care by general practitioners and practice nurses.

    PubMed

    Ball, Lauren; Ball, Dianne; Leveritt, Michael; Ray, Sumantra; Collins, Clare; Patterson, Elizabeth; Ambrosini, Gina; Lee, Patricia; Chaboyer, Wendy

    2017-04-01

    The methodological designs underpinning many primary health-care interventions are not rigorous. Logic models can be used to support intervention planning, implementation and evaluation in the primary health-care setting. Logic models provide a systematic and visual way of facilitating shared understanding of the rationale for the intervention, the planned activities, expected outcomes, evaluation strategy and required resources. This article provides guidance for primary health-care practitioners and researchers on the use of logic models for enhancing methodological rigour of interventions. The article outlines the recommended steps in developing a logic model using the 'NutriCare' intervention as an example. The 'NutriCare' intervention is based in the Australian primary health-care setting and promotes nutrition care by general practitioners and practice nurses. The recommended approach involves canvassing the views of all stakeholders who have valuable and informed opinions about the planned project. The following four targeted, iterative steps are recommended: (1) confirm situation, intervention aim and target population; (2) document expected outcomes and outputs of the intervention; (3) identify and describe assumptions, external factors and inputs; and (4) confirm intervention components. Over a period of 2 months, three primary health-care researchers and one health-services consultant led the collaborative development of the 'NutriCare' logic model. Primary health-care practitioners and researchers are encouraged to develop a logic model when planning interventions to maximise the methodological rigour of studies, confirm that data required to answer the question are captured and ensure that the intervention meets the project goals.

  11. Systematic design of membership functions for fuzzy-logic control: A case study on one-stage partial nitritation/anammox treatment systems.

    PubMed

    Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan

    2016-10-01

    A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long-term reachability of the control objectives by the fuzzy-logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen (TN) removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting long-term influent disturbances and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested against measurement noise levels typical of wastewater sensors and showed robustness. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy-logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
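
    A toy analogue of the design idea, built on an invented steady-state surrogate model rather than the paper's reactor model: the critical point of a membership function is obtained by solving a constraint on the control objective, here posed as a root-finding problem.

    ```python
    # Hypothetical example: find the dissolved-oxygen setpoint at which nitrite
    # oxidizers start to grow (objective violated), then anchor the "DO low"
    # membership function at that critical point. The surrogate model is invented.
    import numpy as np
    from scipy.optimize import brentq

    def nob_growth_margin(do_setpoint):
        """Surrogate: > 0 means nitrite oxidizers grow (objective violated)."""
        return 0.8 * do_setpoint / (0.4 + do_setpoint) - 0.45

    do_critical = brentq(nob_growth_margin, 0.01, 2.0)   # critical point

    def mu_do_low(do):
        """Membership 'DO low': 1 below the critical point, ramping to 0 above."""
        return np.clip((1.5 * do_critical - do) / (0.5 * do_critical), 0.0, 1.0)

    print(round(do_critical, 3), mu_do_low(0.2), mu_do_low(1.0))
    ```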

  12. Logic Design Pathology and Space Flight Electronics

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Barto, Rod L.; Erickson, K.

    1997-01-01

    Logic design errors have been observed in space flight missions and in the final stages of ground test. The technologies used by designers and their design/analysis methodologies are analyzed, giving insight into the root causes of the failures. These technologies include discrete integrated-circuit-based systems, systems based on field- and mask-programmable logic, and the use of computer-aided engineering (CAE) systems. State-of-the-art (SOTA) design tools and methodologies are analyzed with respect to high-reliability spacecraft design, and potential pitfalls are discussed. Case studies of faults, from large expensive programs to "smaller, faster, cheaper" missions, are used to explore the fundamental reasons for logic design problems.

  13. Quantum dot ternary-valued full-adder: Logic synthesis by a multiobjective design optimization based on a genetic algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klymenko, M. V.; Remacle, F., E-mail: fremacle@ulg.ac.be

    2014-10-28

    A methodology is proposed for designing a low-energy-consuming ternary-valued full adder based on a quantum dot (QD) electrostatically coupled with a single-electron transistor operating as a charge sensor. The methodology is based on design optimization: the values of the physical parameters of the system required for implementing the logic operations are optimized using a multiobjective genetic algorithm. The search space is determined by elements of the capacitance matrix describing the electrostatic couplings in the entire device. The objective functions are defined as the maximal absolute error of the actual device logic outputs relative to the ideal truth tables for the sum and the carry-out in base 3. The logic units are implemented on the same device: a single dual-gate quantum dot and a charge sensor. Their physical parameters are optimized to compute either the sum or the carry-out outputs and are compatible with current experimental capabilities. The outputs are encoded in the value of the electric current passing through the charge sensor, while the logic inputs are supplied by the voltage levels on the two gate electrodes attached to the QD. The complex ternary logic operations are directly implemented on an extremely simple device, characterized by small size and low energy consumption compared to devices based on switching single-electron transistors. The design methodology is general and provides a rational approach for realizing non-switching logic operations on QD devices.
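
    The optimization setup can be sketched generically. Below, a parametric model stands in for the device response (the paper optimizes capacitance-matrix elements instead), the objective is the maximal absolute error against the ideal base-3 sum truth table, and differential evolution is used in place of the paper's genetic algorithm.

    ```python
    # Generic stand-in for the device-physics model; minimize the worst-case
    # error against the ternary sum truth table (a + b) mod 3.
    import numpy as np
    from scipy.optimize import differential_evolution

    inputs = [(a, b) for a in range(3) for b in range(3)]
    ideal = {(a, b): (a + b) % 3 for a, b in inputs}      # ternary sum table

    def model(params, a, b):
        c0, c1, p1, c2, p2 = params
        x = a + b
        return (c0 + c1 * np.cos(2 * np.pi * x / 3 + p1)
                   + c2 * np.cos(4 * np.pi * x / 3 + p2))

    def max_abs_error(params):
        return max(abs(model(params, a, b) - ideal[(a, b)]) for a, b in inputs)

    bounds = [(-3, 3), (-3, 3), (-np.pi, np.pi), (-3, 3), (-np.pi, np.pi)]
    result = differential_evolution(max_abs_error, bounds, seed=1, tol=1e-8)
    print(result.fun)   # should approach 0: the fitted model reproduces the gate
    ```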

  14. Integrated payload and mission planning, phase 3. Volume 2: Logic/Methodology for preliminary grouping of spacelab and mixed cargo payloads

    NASA Technical Reports Server (NTRS)

    Rodgers, T. E.; Johnson, J. F.

    1977-01-01

    The logic and methodology for a preliminary grouping of Spacelab and mixed-cargo payloads is proposed in a form that can be readily coded into a computer program by NASA. The logic developed for this preliminary cargo grouping analysis is summarized. Principal input data include the NASA Payload Model, payload descriptive data, Orbiter and Spacelab capabilities, and NASA guidelines and constraints. The first step in the process is a launch interval selection, in which the time interval for payload grouping is identified. Logic flow steps are then taken to group payloads and define flight configurations based on criteria that include dedication, volume, area, orbital parameters, pointing, g-level, mass, center of gravity, energy, power, and crew time.

  15. A methodology to migrate the gene ontology to a description logic environment using DAML+OIL.

    PubMed

    Wroe, C J; Stevens, R; Goble, C A; Ashburner, M

    2003-01-01

    The Gene Ontology Next Generation Project (GONG) is developing a staged methodology to evolve the current representation of the Gene Ontology into DAML+OIL in order to take advantage of the richer formal expressiveness and the reasoning capabilities of the underlying description logic. Each stage provides a step level increase in formal explicit semantic content with a view to supporting validation, extension and multiple classification of the Gene Ontology. The paper introduces DAML+OIL and demonstrates the activity within each stage of the methodology and the functionality gained.

  16. Misconceived Relationships between Logical Positivism and Quantitative Research: An Analysis in the Framework of Ian Hacking.

    ERIC Educational Resources Information Center

    Yu, Chong Ho

    Although quantitative research methodology is widely applied by psychological researchers, there is a common misconception that quantitative research is based on logical positivism. This paper examines the relationship between quantitative research and eight major notions of logical positivism: (1) verification; (2) pro-observation; (3)…

  17. A new method for qualitative simulation of water resources systems: 1. Theory

    NASA Astrophysics Data System (ADS)

    Camara, A. S.; Pinheiro, M.; Antunes, M. P.; Seixas, M. J.

    1987-11-01

    A new dynamic modeling methodology, SLIN (Simulação Linguistica), allowing for the analysis of systems defined by linguistic variables, is presented. SLIN applies a set of logical rules avoiding fuzzy theoretic concepts. To make the transition from qualitative to quantitative modes, logical rules are used as well. Extensions of the methodology to simulation-optimization applications and multiexpert system modeling are also discussed.

  18. Fuzzy logic controllers for electrotechnical devices - On-site tuning approach

    NASA Astrophysics Data System (ADS)

    Hissel, D.; Maussion, P.; Faucher, J.

    2001-12-01

    Fuzzy logic nowadays offers an interesting alternative for designers of nonlinear control laws for electrical or electromechanical systems. However, due to the huge number of tuning parameters, this kind of control is used in only a few industrial applications. This paper proposes a new, very simple, on-site tuning strategy for a PID-like fuzzy logic controller. Using the experimental design methodology, we propose sets of optimized pre-established settings for this kind of fuzzy controller. The proposed parameters depend only on a single on-site open-loop identification test. In this way, this on-site tuning methodology is comparable to the Ziegler-Nichols method for conventional controllers. Experimental results (on a permanent-magnet synchronous motor and on a DC/DC converter) demonstrate the effectiveness of this tuning methodology. Finally, the field of validity of the proposed pre-established settings is given.

  19. The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.

    PubMed

    Jobe, Thomas H.; Helgason, Cathy M.

    1998-04-01

    Twentieth-century medical science has embraced nineteenth-century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth-century multi-valued logic, and computational structures that are content-addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm best represented this complex multi-causal clinical phenomenon: stroke. We show that the fuzzy logic paradigm better represented clinical complexity in cerebrovascular disease than current probability-theory-based methodology. We believe this finding is generalizable to all of clinical science, since multiple concomitant causal factors are involved in nearly all known pathological processes.

  20. Conceptual Modeling via Logic Programming

    DTIC Science & Technology

    1990-01-01

    Define User Interface and Query Language; 4. Define Procedures for Specifying Output; 5. Select Logic Programming Language; 6. Develop Methodology for C3I Users. … Conceptual Modeling via Logic Programming. Marina del Rey, Calif.

  1. Logical Modeling and Dynamical Analysis of Cellular Networks

    PubMed Central

    Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine

    2016-01-01

    The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle. PMID:27303434
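
    A minimal instance of the logical formalism: exhaustive synchronous simulation of a three-gene toy network (the rules are invented here, not one of the published T-helper or cell-cycle models the review discusses) enumerates all attractors.

    ```python
    # Synchronous Boolean network: iterate every initial state to its attractor.
    from itertools import product

    def step(state):
        a, b, c = state
        return (int(not c),        # A is repressed by C
                a,                 # B copies A with one step of delay
                int(a and b))      # C needs both A and B

    def attractors():
        found = set()
        for start in product((0, 1), repeat=3):
            seen, s = [], start
            while s not in seen:
                seen.append(s)
                s = step(s)
            cycle = tuple(seen[seen.index(s):])       # the periodic part
            rotations = {cycle[i:] + cycle[:i] for i in range(len(cycle))}
            found.add(min(rotations))                 # canonical form
        return found

    for cyc in attractors():
        print(cyc)   # fixed points are cycles of length 1
    ```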

  2. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

    In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy-logic-based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures, and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach, we present a case study which uses it to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research highlights: a fuzzy logic analysis technique capable of characterizing AFM images of thin films; the technique is applicable to different surfaces regardless of their densities; it does not require manual adjustment of the algorithm parameters; it can quantitatively capture differences between surfaces; and it yields more realistic structure boundaries compared to other methods.

  3. Critical Analysis of the Mathematical Formalism of Theoretical Physics. II. Foundations of Vector Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2014-03-01

    A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that the vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis, the unity of formal logic and of rational dialectics; (b) it does not contain correct definitions of "movement," "direction," and "vector"; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of "physical vector," and therefore it has no natural-scientific meaning; (d) operations on "physical vectors" and the vector calculus propositions relating to "physical vectors" are contrary to formal logic.

  4. Fuzzy logic modeling of high performance rechargeable batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, P.; Fennie, C. Jr.; Reisner, D.E.

    1998-07-01

    Accurate battery state-of-charge (SOC) measurements are critical in many portable electronic device applications. Yet conventional techniques for battery SOC estimation are limited in their accuracy, reliability, and flexibility. In this paper the authors present a powerful new approach to estimate battery SOC using a fuzzy logic-based methodology. This approach provides a universally applicable, accurate method for battery SOC estimation either integrated within, or as an external monitor to, an electronic device. The methodology is demonstrated in modeling impedance measurements on Ni-MH cells and discharge voltage curves of Li-ion cells.

  5. A Longitudinal Study of the Effects of Undergraduate Training on Reasoning.

    ERIC Educational Resources Information Center

    Lehman, Darrin R.; Nisbett, Richard E.

    1990-01-01

    Effects of undergraduate training on inductive reasoning and logic were examined. Social science training produced significant effects on statistical and methodological reasoning. Natural science and humanities training produced significant effects on conditional logic reasoning. Results indicate that reasoning is taught and generalizable. (BC)

  6. The Logic of Evaluation.

    ERIC Educational Resources Information Center

    Welty, Gordon A.

    The logic of the evaluation of educational and other action programs is discussed from a methodological viewpoint. However, no attempt is made to develop methods of evaluating programs. In Part I, the structure of an educational program is viewed as a system with three components--inputs, transformation of inputs into outputs, and outputs. Part II…

  7. (E)pistemological Awareness, Instantiation of Methods, and Uninformed Methodological Ambiguity in Qualitative Research Projects

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka; Yendol-Hoppey, Diane; Smith, Jason Jude; Hayes, Sharon B.

    2009-01-01

    This article explores epistemological awareness and instantiation of methods, as well as uninformed ambiguity, in qualitative methodological decision making and research reporting. The authors argue that efforts should be made to make the research process, epistemologies, values, methodological decision points, and argumentative logic open,…

  8. Risk assessment methodology applied to counter IED research & development portfolio prioritization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shevitz, Daniel W; O'Brien, David A; Zerkle, David K

    2009-01-01

    In an effort to protect the United States from the ever-increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T) has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like a "risk-based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model, using an Excel spreadsheet containing weights of the subjective merits of different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for logic models, combined with LANL software called INFTree for evidential reasoning.

  9. Logic regression and its extensions.

    PubMed

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
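
    The search target can be shown with a brute-force miniature: enumerate two-variable Boolean combinations of binary predictors and score them by misclassification. Real logic regression searches much larger logic trees with simulated annealing inside a generalized linear model; the data and the AND ground truth below are simulated.

    ```python
    # Simulated binary predictors (SNP indicators, say) with a planted AND rule.
    from itertools import combinations
    import random

    random.seed(0)
    n, p = 500, 6
    X = [[random.randint(0, 1) for _ in range(p)] for _ in range(n)]
    # ground truth: risk comes from (X1 AND X3), plus 10% label noise
    y = [int((row[1] and row[3]) != (random.random() < 0.1)) for row in X]

    OPS = {"AND": lambda a, b: a and b, "OR": lambda a, b: a or b}

    best = min(
        ((sum(OPS[op](row[i], row[j]) != yi for row, yi in zip(X, y)), i, j, op)
         for i, j in combinations(range(p), 2) for op in OPS),
        key=lambda t: t[0],
    )
    err, i, j, op = best
    print(f"best rule: X{i} {op} X{j}, misclassification {err / n:.2%}")
    ```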

  10. The logic of syntactic priming and acceptability judgments.

    PubMed

    Gaston, Phoebe; Huang, Nick; Phillips, Colin

    2017-01-01

    A critical flaw in Branigan & Pickering's (B&P's) advocacy of structural priming is the absence of a theory of priming. This undermines their claims about the value of priming as a methodology. In contrast, acceptability judgments enable clearer inferences about structure. It is important to engage thoroughly with the logic behind different structural diagnostics.

  11. Brain Activity Associated with Logical Inferences in Geometry: Focusing on Students with Different Levels of Ability

    ERIC Educational Resources Information Center

    Waisman, Ilana; Leikin, Mark; Leikin, Roza

    2016-01-01

    Mathematical processing associated with solving short geometry problems requiring logical inference was examined among students who differ in their levels of general giftedness (G) and excellence in mathematics (EM) using ERP research methodology. Sixty-seven male adolescents formed four major research groups designed according to various…

  12. Logic-based models in systems biology: a predictive and parameter-free network analysis method

    PubMed Central

    Wynn, Michelle L.; Consul, Nikita; Merajver, Sofia D.

    2012-01-01

    Highly complex molecular networks, which play fundamental roles in almost all cellular processes, are known to be dysregulated in a number of diseases, most notably in cancer. As a consequence, there is a critical need to develop practical methodologies for constructing and analysing molecular networks at a systems level. Mathematical models built with continuous differential equations are an ideal methodology because they can provide a detailed picture of a network’s dynamics. To be predictive, however, differential equation models require that numerous parameters be known a priori and this information is almost never available. An alternative dynamical approach is the use of discrete logic-based models that can provide a good approximation of the qualitative behaviour of a biochemical system without the burden of a large parameter space. Despite their advantages, there remains significant resistance to the use of logic-based models in biology. Here, we address some common concerns and provide a brief tutorial on the use of logic-based models, which we motivate with biological examples. PMID:23072820

  13. Integrating end-to-end threads of control into object-oriented analysis and design

    NASA Technical Reports Server (NTRS)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than the view obtained by looking at individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and that timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole, with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread-of-control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  14. Dewey's Logic as a Methodological Grounding Point for Practitioner-Based Inquiry

    ERIC Educational Resources Information Center

    Demetrion, George

    2012-01-01

    The purpose of this essay is to draw out key insights from Dewey's important text "Logic: The Theory of Inquiry" to provide theoretical and practical support for the emergent field of teacher research. The specific focal point is the argument in Cochran-Smith and Lytle's "Inside/Outside: Teacher Research and Knowledge" on the significance of…

  15. Developmental and Methodological Issues in the Growth of Logical Thinking in Adolescence.

    ERIC Educational Resources Information Center

    Weybright, Loren Dean

    The 30 sixth-grade and 30 ninth-grade students, all of whom attended one rural school district in central Illinois, not only displayed a wider variety of behaviors than reported in the Inhelder and Piaget study (1958) investigating the development of logical thinking in children, but the behavioral components served to fill significant gaps in the…

  16. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.

  17. Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems

    NASA Technical Reports Server (NTRS)

    Song, Lixia; Kuchar, James K.

    2003-01-01

    Alerting systems are becoming pervasive in process operations, which may result in the potential for dissonance or conflict in information from different alerting systems that suggests different threat levels and/or actions to resolve hazards. Little is currently available to help in predicting or solving the dissonance problem. This thesis presents a methodology to model and analyze dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A state-space representation of multiple alerting system operation is generalized that can be tailored across a variety of applications. Based on the representation, two major causes of dissonance are identified: logic differences and sensor error. Additionally, several possible types of dissonance are identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences. A probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error, and to compare the relative contribution to dissonance of sensor error against the contribution from logic differences. A hybrid model, which describes the dynamic behavior of the process with multiple alerting systems, is developed to identify dangerous dissonance space, from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples are used to demonstrate the application of the methodology. First, a conceptual In-Trail Spacing example is presented. The methodology is applied to identify the conditions for possible dissonance, to identify relative contribution of logic difference and sensor error, and to identify dangerous dissonance space. Several proposed mitigation methods are demonstrated in this example. In the second example, the methodology is applied to address the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) vs. the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified to avoid dynamic dissonance between TCAS and ACM. Also included in this report is an Appendix written by Lee Winder about recent and continuing work on alerting systems design. The application of Markov Decision Process (MDP) theory to complex alerting problems is discussed and illustrated with an abstract example system.

  18. The method of belief scales as a means for dealing with uncertainty in tough regulatory decisions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilch, Martin M.

    Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.

  19. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  20. Design and implementation of fuzzy logic controllers. Thesis Final Report, 27 Jul. 1992 - 1 Jan. 1993

    NASA Technical Reports Server (NTRS)

    Abihana, Osama A.; Gonzalez, Oscar R.

    1993-01-01

    The main objectives of our research are to present a self-contained overview of fuzzy sets and fuzzy logic, develop a methodology for control system design using fuzzy logic controllers, and to design and implement a fuzzy logic controller for a real system. We first present the fundamental concepts of fuzzy sets and fuzzy logic. Fuzzy sets and basic fuzzy operations are defined. In addition, for control systems, it is important to understand the concepts of linguistic values, term sets, fuzzy rule base, inference methods, and defuzzification methods. Second, we introduce a four-step fuzzy logic control system design procedure. The design procedure is illustrated via four examples, showing the capabilities and robustness of fuzzy logic control systems. This is followed by a tuning procedure that we developed from our design experience. Third, we present two Lyapunov based techniques for stability analysis. Finally, we present our design and implementation of a fuzzy logic controller for a linear actuator to be used to control the direction of the Free Flight Rotorcraft Research Vehicle at LaRC.
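
    A compact sketch of the four design steps for a single-input controller follows (the thesis's actual rule base and scalings for the linear actuator are not reproduced; the memberships, rules, and ranges here are illustrative).

    ```python
    # Fuzzification, rule base, inference, and defuzzification in one function.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def fuzzy_control(error):
        # Step 1: fuzzify the error into three linguistic terms
        mu = {"neg": tri(error, -2, -1, 0), "zero": tri(error, -1, 0, 1),
              "pos": tri(error, 0, 1, 2)}
        # Step 2: rule base -- IF error is NEG THEN output NEG, etc.
        rules = {"neg": -1.0, "zero": 0.0, "pos": 1.0}   # output singletons
        # Steps 3-4: inference with singleton consequents, defuzzified as a
        # weighted average (centre of gravity)
        num = sum(mu[t] * rules[t] for t in rules)
        den = sum(mu[t] for t in rules)
        return num / den if den > 0 else 0.0

    for e in (-1.5, -0.3, 0.0, 0.7):
        print(e, round(fuzzy_control(e), 3))
    ```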

  1. Fuzzy logic applied to prospecting for areas for installation of wood panel industries.

    PubMed

    Dos Santos, Alexandre Rosa; Paterlini, Ewerthon Mattos; Fiedler, Nilton Cesar; Ribeiro, Carlos Antonio Alvares Soares; Lorenzon, Alexandre Simões; Domingues, Getulio Fonseca; Marcatti, Gustavo Eduardo; de Castro, Nero Lemos Martins; Teixeira, Thaisa Ribeiro; Dos Santos, Gleissy Mary Amaral Dino Alves; Juvanhol, Ronie Silva; Branco, Elvis Ricardo Figueira; Mota, Pedro Henrique Santos; da Silva, Lilianne Gomes; Pirovani, Daiani Bernardo; de Jesus, Waldir Cintra; Santos, Ana Carolina de Albuquerque; Leite, Helio Garcia; Iwakiri, Setsuo

    2017-05-15

    Prospecting for suitable areas for forestry operations, where the objective is a reduction in production and transportation costs as well as the maximization of profits and available resources, constitutes an optimization problem, and fuzzy logic is an alternative method for solving it. In the context of prospecting for suitable areas for the installation of wood panel industries, we propose applying fuzzy logic analysis to simulate the planting of different species and eucalyptus hybrids in Espírito Santo State, Brazil. The necessary methodological steps for this study are as follows: a) agriclimatological zoning of the different species and eucalyptus hybrids; b) selection of the vector variables; c) application of the Euclidean distance to the vector variables; d) application of fuzzy logic to the matrix variables of the Euclidean distance; and e) application of overlap fuzzy logic to locate areas for installation of wood panel industries. Among all the species and hybrids, Corymbia citriodora showed the highest percentage values for the combined very good and good classes, with 8.60%, followed by Eucalyptus grandis with 8.52%, Eucalyptus urophylla with 8.35% and Urograndis with 8.34%. The fuzzy logic analysis afforded flexibility in prospecting for suitable areas for the installation of wood panel industries in Espírito Santo State; such installations can bring great economic and social benefits to the local population through the generation of jobs, income, tax revenues, and GDP increase for the State and the municipalities involved. The proposed methodology can be adapted to other areas and agricultural crops. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Problem Solving in Biology: A Methodology

    ERIC Educational Resources Information Center

    Wisehart, Gary; Mandell, Mark

    2008-01-01

    A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…

  3. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. Likely and by default are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  4. An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence

    NASA Technical Reports Server (NTRS)

    Lindley, Craig A.

    1993-01-01

    This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the European Space Research and Technology Centre (ESTEC) automation and robotics methodology and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low-level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.

  5. Fuzzy Logic-Based Audio Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Malcangi, M.

    2008-11-01

    Audio and audio-pattern recognition is becoming one of the most important technologies for automatically controlling embedded systems. Fuzzy logic may be the most important enabling methodology due to its ability to rapidly and economically model such applications. An audio and audio-pattern recognition engine based on fuzzy logic has been developed for use in very low-cost and deeply embedded systems to automate human-to-machine and machine-to-machine interaction. This engine consists of simple digital signal-processing algorithms for feature extraction and normalization, and a set of pattern-recognition rules manually tuned or automatically tuned by a self-learning process.

  6. Actualizacion linguistica, AL-1 (Current Linguistics, AL-1).

    ERIC Educational Resources Information Center

    Penaloza, Miguel

    This document, the first in a series called "Actualizacion Linguistica," seeks to establish the bases for testing a new methodology for teaching Spanish to Colombia beginning at the preschool and primary levels. The methodology initially uses a system of "logic blocks" of differing size, color, shape, and weight to devise games…

  7. Evaluation of Model-Based Training for Vertical Guidance Logic

    NASA Technical Reports Server (NTRS)

    Feary, Michael; Palmer, Everett; Sherry, Lance; Polson, Peter; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper summarizes the results of a study which introduces a structured, model-based approach to learning how the automated vertical guidance system works on a modern commercial air transport. The study proposes a framework to provide accurate and complete information in an attempt to eliminate confusion about 'what the system is doing'. It examines a structured methodology for organizing the ideas on which the system was designed, communicating this information through the training material, and displaying it in the airplane. Previous research on model-based, computer-aided instructional technology has shown reductions in the amount of time to reach a specified level of competence. The lessons learned from the development of these technologies are well suited for use with the design methodology which was used to develop the vertical guidance logic for a large commercial air transport. The design methodology presents the model from which to derive the training material, and the content of information to be displayed to the operator. The study consists of a 2 x 2 factorial experiment comparing a new method of training vertical guidance logic and a new type of display. The format of the material used to derive both the training and the display is provided by the Operational Procedure Methodology. The training condition compares current training material to the new structured format. The display condition involves a change in the content of the information displayed into pieces that agree with the concepts with which the system was designed.

  8. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.

  9. Generalized serial search code acquisition - The equivalent circular state diagram approach

    NASA Technical Reports Server (NTRS)

    Polydoros, A.; Simon, M. K.

    1984-01-01

    A transform-domain method for deriving the generating function of the acquisition process resulting from an arbitrary serial search strategy is presented. The method relies on equivalent circular state diagrams, uses Mason's formula from flow-graph theory, and employs a minimum number of required parameters. The transform-domain approach is briefly described and the concept of equivalent circular state diagrams is introduced and exploited to derive the generating function and resulting mean acquisition time for three particular cases of interest, the continuous/center Z search, the broken/center Z search, and the expanding window search. An optimization of the latter technique is performed whereby the number of partial windows which minimizes the mean acquisition time is determined. The numerical results satisfy certain intuitive predictions and provide useful design guidelines for such systems.
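
    In the spirit of the paper, the sketch below builds the generating function for a straight serial search with a worst-case starting cell and reads the mean acquisition time off the derivative at z = 1; the detection, false-alarm, and penalty values are illustrative.

    ```python
    # Each dwell is one unit of z; a false alarm costs k extra dwells. The
    # correct cell is assumed to be the last of N tested (worst case).
    import sympy as sp

    z = sp.symbols("z")
    pd, pf, k, N = sp.Rational(9, 10), sp.Rational(1, 100), 10, 100

    H0 = (1 - pf) * z + pf * z**(k + 1)       # non-sync cell transmittance
    HD = pd * z                               # detection branch
    HM = (1 - pd) * z                         # miss branch

    # Flow-graph reduction: pass N-1 wrong cells, test the correct cell,
    # and on a miss loop around the circle again.
    G = H0**(N - 1) * HD / (1 - HM * H0**(N - 1))
    mean_dwells = sp.diff(G, z).subs(z, 1)    # E[T] = G'(1), in dwell units
    print(float(mean_dwells))
    ```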

  10. Measuring Structural Gender Equality in Mexico: A State Level Analysis

    ERIC Educational Resources Information Center

    Frias, Sonia M.

    2008-01-01

    The main goal of this article is to assess the level of gender equality across the 32 Mexican states. After reviewing conceptual and methodological issues related to previous measures of structural inequality I detail the logic and methodology involved in the construction of a composite and multidimensional measure of gender equality, at the…

  11. Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach

    ERIC Educational Resources Information Center

    Stevenson, Glenn A.

    2012-01-01

    For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…

  12. Practicing the Four Seasons of Ethnography Methodology while Searching for Identity in Mexico

    ERIC Educational Resources Information Center

    Pitts, Margaret Jane

    2012-01-01

    This narrative is an account of my field experiences and challenges practicing Gonzalez's (2000) Four Seasons of Ethnography methodology in Mexico City. I describe the complexities and tensions inherent in managing two scientific paradigms: Western scientific logic vs. a more organic ontology. The experiential knowledge produced in this text is…

  13. A Physics-Based Engineering Methodology for Calculating Soft Error Rates of Bulk CMOS and SiGe Heterojunction Bipolar Transistor Integrated Circuits

    NASA Astrophysics Data System (ADS)

    Fulkerson, David E.

    2010-02-01

    This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new and simple methodology handles the problem with ease by simple SPICE simulations. The methodology accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross section vs. frequency behavior and other subtle effects are also accurately predicted.
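
    The abstract does not spell out how ion strikes enter the SPICE simulations; a common stand-in is a double-exponential current source attached to the struck node. A minimal sketch of such a pulse, with illustrative charge and time constants (all values are assumptions, not the paper's):

        import numpy as np

        def ion_strike_current(t, q_coll=150e-15, tau_r=5e-12, tau_f=200e-12):
            # Double-exponential current pulse; q_coll is the collected charge (C),
            # tau_r / tau_f are illustrative rise and fall time constants.
            return q_coll / (tau_f - tau_r) * (np.exp(-t / tau_f) - np.exp(-t / tau_r))

        t = np.linspace(0.0, 2e-9, 20001)
        i = ion_strike_current(t)
        q = float(((i[:-1] + i[1:]) / 2).sum() * (t[1] - t[0]))   # trapezoid rule
        print(f"peak current = {i.max() * 1e3:.2f} mA")
        print(f"integrated charge = {q * 1e15:.1f} fC")           # recovers ~q_coll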

  14. Using the Abstraction Network in Complement to Description Logics for Quality Assurance in Biomedical Terminologies - A Case Study in SNOMED CT

    PubMed Central

    Wei, Duo; Bodenreider, Olivier

    2015-01-01

    Objectives: To investigate errors identified in SNOMED CT by human reviewers with help from the Abstraction Network methodology and examine why they had escaped detection by the Description Logic (DL) classifier. Design: Case study; two examples of errors are presented in detail (one missing IS-A relation and one duplicate concept). After correction, SNOMED CT is reclassified to ensure that no new inconsistency was introduced. Conclusions: DL-based auditing techniques built into terminology development environments ensure the logical consistency of the terminology. However, complementary approaches are needed for identifying and addressing other types of errors. PMID:20841848

  15. Using the abstraction network in complement to description logics for quality assurance in biomedical terminologies - a case study in SNOMED CT.

    PubMed

    Wei, Duo; Bodenreider, Olivier

    2010-01-01

    To investigate errors identified in SNOMED CT by human reviewers with help from the Abstraction Network methodology and examine why they had escaped detection by the Description Logic (DL) classifier. Case study; Two examples of errors are presented in detail (one missing IS-A relation and one duplicate concept). After correction, SNOMED CT is reclassified to ensure that no new inconsistency was introduced. DL-based auditing techniques built in terminology development environments ensure the logical consistency of the terminology. However, complementary approaches are needed for identifying and addressing other types of errors.

  16. A controlled genetic algorithm by fuzzy logic and belief functions for job-shop scheduling.

    PubMed

    Hajri, S; Liouane, N; Hammadi, S; Borne, P

    2000-01-01

    Most scheduling problems are highly complex combinatorial problems. However, stochastic methods such as genetic algorithms yield good solutions. In this paper, we present a controlled genetic algorithm (CGA) based on fuzzy logic and belief functions to solve job-shop scheduling problems. For better performance, we propose an efficient representation scheme, heuristic rules for creating the initial population, and a new methodology for mixing and computing genetic operator probabilities.
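
    The abstract names fuzzy control of genetic operator probabilities without giving the rule base; a toy sketch, with assumed triangular memberships, that adapts the mutation probability from population diversity (low diversity drives mutation up):

        import random

        def diversity(pop):
            # Fraction of distinct chromosomes: a crude population-diversity signal.
            return len({tuple(ind) for ind in pop}) / len(pop)

        def fuzzy_mutation_rate(div, low=0.01, high=0.30):
            # Toy rule base: IF diversity LOW THEN mutation HIGH; IF MEDIUM THEN
            # MEDIUM; IF HIGH THEN LOW. Triangular memberships over [0, 1].
            mu_low = max(0.0, 1.0 - 2.0 * div)
            mu_high = max(0.0, 2.0 * div - 1.0)
            mu_med = 1.0 - mu_low - mu_high
            # Weighted-average defuzzification of the three rule consequents.
            return mu_low * high + mu_med * (low + high) / 2.0 + mu_high * low

        pop = [[random.randint(0, 4) for _ in range(6)] for _ in range(20)]
        d = diversity(pop)
        print(f"diversity={d:.2f} -> mutation probability={fuzzy_mutation_rate(d):.3f}")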

  17. A biomimetic colorimetric logic gate system based on multi-functional peptide-mediated gold nanoparticle assembly

    NASA Astrophysics Data System (ADS)

    Li, Yong; Li, Wang; He, Kai-Yu; Li, Pei; Huang, Yan; Nie, Zhou; Yao, Shou-Zhuo

    2016-04-01

    In natural biological systems, proteins exploit various functional peptide motifs to exert target response and activity switch, providing a functional and logic basis for complex cellular activities. Building biomimetic peptide-based bio-logic systems is highly intriguing but remains relatively unexplored due to limited logic recognition elements and complex signal outputs. In this proof-of-principle work, we attempted to address these problems by utilizing multi-functional peptide probes and the peptide-mediated nanoparticle assembly system. Here, the rationally designed peptide probes function as the dual-target responsive element specifically responsive to metal ions and enzymes as well as the mediator regulating the assembly of gold nanoparticles (AuNPs). Taking advantage of Zn2+ ions and chymotrypsin as the model inputs of metal ions and enzymes, respectively, we constructed the peptide logic system computed by the multi-functional peptide probes and outputted by the readable colour change of AuNPs. In this way, the representative binary basic logic gates (AND, OR, INHIBIT, NAND, IMPLICATION) have been achieved by delicately coding the peptide sequence, demonstrating the versatility of our logic system. Additionally, we demonstrated that the three-input combinational logic gate (INHIBIT-OR) could also be successfully integrated and applied as a multi-tasking biosensor for colorimetric detection of dual targets. This nanoparticle-based peptide logic system presents a valid strategy to illustrate peptide information processing and provides a practical platform for executing peptide computing or peptide-related multiplexing sensing, implying that the controllable nanomaterial assembly is a promising and potent methodology for the advancement of biomimetic bio-logic computation.
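
    Purely as an illustration of the reported input/output behaviour (the gates themselves are implemented chemically), a truth-table sketch mapping the two model inputs (Zn2+, chymotrypsin) to a colour readout; the red/blue assignment to the logic levels is an assumption:

        # Illustrative truth tables for some of the gates reported in the paper.
        # Inputs: zn (Zn2+ present), chy (chymotrypsin present).
        GATES = {
            "AND":         lambda zn, chy: zn and chy,
            "OR":          lambda zn, chy: zn or chy,
            "INHIBIT":     lambda zn, chy: zn and not chy,
            "NAND":        lambda zn, chy: not (zn and chy),
            "IMPLICATION": lambda zn, chy: (not zn) or chy,
        }

        for name, gate in GATES.items():
            rows = [(zn, chy, gate(zn, chy)) for zn in (0, 1) for chy in (0, 1)]
            table = "  ".join(f"{zn}{chy}->{'blue' if out else 'red'}"
                              for zn, chy, out in rows)
            print(f"{name:12s} {table}")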

  18. Linear Temporal Logic (LTL) Based Monitoring of Smart Manufacturing Systems.

    PubMed

    Heddy, Gerald; Huzaifa, Umer; Beling, Peter; Haimes, Yacov; Marvel, Jeremy; Weiss, Brian; LaViers, Amy

    2015-01-01

    The vision of Smart Manufacturing Systems (SMS) includes collaborative robots that can adapt to a range of scenarios. This vision requires a classification of multiple system behaviors, or sequences of movement, that can achieve the same high-level tasks. Likewise, this vision presents unique challenges regarding the management of environmental variables in concert with discrete, logic-based programming. Overcoming these challenges requires targeted performance and health monitoring of both the logical controller and the physical components of the robotic system. Prognostics and health management (PHM) defines a field of techniques and methods that enable condition-monitoring, diagnostics, and prognostics of physical elements, functional processes, overall systems, etc. PHM is warranted in this effort given that the controller is vulnerable to program changes, which propagate in unexpected ways, logical runtime exceptions, sensor failure, and even bit rot. The physical component's health is affected by the wear and tear experienced by machines constantly in motion. The controller's source of faults is inherently discrete, while the latter occurs in a manner that builds up continuously over time. Such a disconnect poses unique challenges for PHM. This paper presents a robotic monitoring system that captures and resolves this disconnect. This effort leverages supervisory robotic control and model checking with linear temporal logic (LTL), presenting them as a novel monitoring system for PHM. This methodology has been demonstrated in a MATLAB-based simulator for an industry inspired use-case in the context of PHM. Future work will use the methodology to develop adaptive, intelligent control strategies to evenly distribute wear on the joints of the robotic arms, maximizing the life of the system.
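
    The abstract does not list the monitored properties; a minimal sketch of two common finite-trace LTL patterns, a safety invariant G p and a response property G(p -> F q), checked over a hypothetical robot-cell log:

        def always(trace, pred):
            # G pred: the predicate holds in every logged state (finite-trace reading).
            return all(pred(s) for s in trace)

        def responds(trace, trigger, response):
            # G(trigger -> F response): on a finite log, no trigger is left unanswered.
            pending = False
            for s in trace:
                if trigger(s):
                    pending = True
                if response(s):
                    pending = False
            return not pending

        # Hypothetical log: each state records the active command and a fault bit.
        trace = [{"cmd": "move", "fault": 0}, {"cmd": "grip", "fault": 0},
                 {"cmd": "move", "fault": 0}, {"cmd": "idle", "fault": 0}]
        print(always(trace, lambda s: s["fault"] == 0))    # safety: no fault ever logged
        print(responds(trace, lambda s: s["cmd"] == "grip",
                       lambda s: s["cmd"] == "idle"))      # every grip is followed by idle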

  19. Toward a Methodology for Evaluating the Impact of Technologies on Infantry Situation Awareness

    DTIC Science & Technology

    2004-10-01

    enhanced free-play exercise was conducted to investigate the effect of the ISR on SA. Ten Air Field Defence Guards participated in two vignettes, loosely...post-experiment interview), this result again seems logical. The DQT and free-play methodology was able to successfully discriminate between baseline...

  20. Critical Reflections on Realist Review: Insights from Customizing the Methodology to the Needs of Participatory Research Assessment

    ERIC Educational Resources Information Center

    Jagosh, Justin; Pluye, Pierre; Wong, Geoff; Cargo, Margaret; Salsberg, Jon; Bush, Paula L.; Herbert, Carol P.; Green, Lawrence W.; Greenhalgh, Trish; Macaulay, Ann C.

    2014-01-01

    Realist review has increased in popularity as a methodology for complex intervention assessment. Our experience suggests that the process of designing a realist review requires its customization to areas under investigation. To elaborate on this idea, we first describe the logic underpinning realist review and then present critical reflections on…

  1. Logic-based assessment of the compatibility of UMLS ontology sources

    PubMed Central

    2011-01-01

    Background: The UMLS Metathesaurus (UMLS-Meta) is currently the most comprehensive effort for integrating independently-developed medical thesauri and ontologies. UMLS-Meta is being used in many applications, including PubMed and ClinicalTrials.gov. The integration of new sources combines automatic techniques, expert assessment, and auditing protocols. The automatic techniques currently in use, however, are mostly based on lexical algorithms and often disregard the semantics of the sources being integrated. Results: In this paper, we argue that UMLS-Meta's current design and auditing methodologies could be significantly enhanced by taking into account the logic-based semantics of the ontology sources. We provide empirical evidence suggesting that UMLS-Meta in its 2009AA version contains a significant number of errors; these errors become immediately apparent if the rich semantics of the ontology sources is taken into account, manifesting themselves as unintended logical consequences that follow from the ontology sources together with the information in UMLS-Meta. We then propose general principles and specific logic-based techniques to effectively detect and repair such errors. Conclusions: Our results suggest that the methodologies employed in the design of UMLS-Meta are not only very costly in terms of human effort, but also error-prone. The techniques presented here can be useful for both reducing human effort in the design and maintenance of UMLS-Meta and improving the quality of its contents. PMID:21388571
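
    The kind of unintended consequence the authors describe arises when source hierarchies are combined through Metathesaurus synonymy. A toy sketch, with entirely hypothetical concepts and CUIs, that flags a two-element subsumption cycle introduced by the mappings (which would force an unintended equivalence):

        # Hypothetical source IS-A edges (child, parent); the second edge is erroneous.
        src1 = {("Amputation_of_finger", "Amputation_of_hand")}
        src2 = {("Hand_excision", "Finger_excision")}

        # Hypothetical UMLS-Meta synonymy: source atom -> Metathesaurus concept (CUI).
        cui = {"Amputation_of_finger": "C1", "Finger_excision": "C1",
               "Amputation_of_hand": "C2", "Hand_excision": "C2"}

        # Translate source edges into subsumptions over CUIs; a 2-cycle (C1 < C2 and
        # C2 < C1 from different sources) is an unintended equivalence to be repaired.
        edges = {(cui[c], cui[p]) for c, p in src1 | src2}
        conflicts = {(a, b) for (a, b) in edges if (b, a) in edges and a < b}
        print("unintended equivalences:", conflicts)       # -> {('C1', 'C2')}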

  2. Army Training Study: Battalion Training Survey. Volumes 1 and 2.

    DTIC Science & Technology

    1978-08-08

    mathematical logic in the methodology. II. MAGNITUDE-ESTIMATION SCALING. A. General Description. A unique methodology, Magnitude-Estimation...to 142.) The base condition (represented in Tables IA, IIA, and IIIA

  3. Incompatible Systems of Logic: Why Design Should Integrate the Mechanistic, Reductionist, and Linear Logic of Military Detailed Planning

    DTIC Science & Technology

    2011-05-19

    24 Eva Boxenbaum, Linda Rouleau, New Knowledge Products as Bricolage: Metaphors and Scripts in Organizational Theory, (Academy of Management Review...marginalize it by reducing it into a supplement to detailed planning methodology. This process of knowledge production, defined as "bricolage" in...Boxenbaum, Rouleau, 280-281. A 'bricoleur' is a person that conducts 'bricolage' with new knowledge production. 48 Field Manual 5-0. 1-5, 1-6, 3-1

  4. Convolutional Neural Network on Embedded Linux(trademark) System-on-Chip: A Methodology and Performance Benchmark

    DTIC Science & Technology

    2016-05-01

    A9 CPU and 15 W for the i7 CPU. A method of accelerating this computation is by using a customized hardware unit called a field-programmable gate...implementation of custom logic to accelerate computational workloads. This FPGA fabric, in addition to the standard programmable logic, contains 220...chip; field-programmable gate array

  5. Convolutional Neural Network on Embedded Linux System-on-Chip: A Methodology and Performance Benchmark

    DTIC Science & Technology

    2016-05-01

    A9 CPU and 15 W for the i7 CPU. A method of accelerating this computation is by using a customized hardware unit called a field-programmable gate...implementation of custom logic to accelerate computational workloads. This FPGA fabric, in addition to the standard programmable logic, contains 220...chip; field-programmable gate array

  6. A biomimetic colorimetric logic gate system based on multi-functional peptide-mediated gold nanoparticle assembly.

    PubMed

    Li, Yong; Li, Wang; He, Kai-Yu; Li, Pei; Huang, Yan; Nie, Zhou; Yao, Shou-Zhuo

    2016-04-28

    In natural biological systems, proteins exploit various functional peptide motifs to exert target response and activity switch, providing a functional and logic basis for complex cellular activities. Building biomimetic peptide-based bio-logic systems is highly intriguing but remains relatively unexplored due to limited logic recognition elements and complex signal outputs. In this proof-of-principle work, we attempted to address these problems by utilizing multi-functional peptide probes and the peptide-mediated nanoparticle assembly system. Here, the rationally designed peptide probes function as the dual-target responsive element specifically responsive to metal ions and enzymes as well as the mediator regulating the assembly of gold nanoparticles (AuNPs). Taking advantage of Zn2+ ions and chymotrypsin as the model inputs of metal ions and enzymes, respectively, we constructed the peptide logic system computed by the multi-functional peptide probes and outputted by the readable colour change of AuNPs. In this way, the representative binary basic logic gates (AND, OR, INHIBIT, NAND, IMPLICATION) have been achieved by delicately coding the peptide sequence, demonstrating the versatility of our logic system. Additionally, we demonstrated that the three-input combinational logic gate (INHIBIT-OR) could also be successfully integrated and applied as a multi-tasking biosensor for colorimetric detection of dual targets. This nanoparticle-based peptide logic system presents a valid strategy to illustrate peptide information processing and provides a practical platform for executing peptide computing or peptide-related multiplexing sensing, implying that the controllable nanomaterial assembly is a promising and potent methodology for the advancement of biomimetic bio-logic computation.

  7. Linear Temporal Logic (LTL) Based Monitoring of Smart Manufacturing Systems

    PubMed Central

    Heddy, Gerald; Huzaifa, Umer; Beling, Peter; Haimes, Yacov; Marvel, Jeremy; Weiss, Brian; LaViers, Amy

    2017-01-01

    The vision of Smart Manufacturing Systems (SMS) includes collaborative robots that can adapt to a range of scenarios. This vision requires a classification of multiple system behaviors, or sequences of movement, that can achieve the same high-level tasks. Likewise, this vision presents unique challenges regarding the management of environmental variables in concert with discrete, logic-based programming. Overcoming these challenges requires targeted performance and health monitoring of both the logical controller and the physical components of the robotic system. Prognostics and health management (PHM) defines a field of techniques and methods that enable condition-monitoring, diagnostics, and prognostics of physical elements, functional processes, overall systems, etc. PHM is warranted in this effort given that the controller is vulnerable to program changes, which propagate in unexpected ways, logical runtime exceptions, sensor failure, and even bit rot. The physical component’s health is affected by the wear and tear experienced by machines constantly in motion. The controller’s source of faults is inherently discrete, while the latter occurs in a manner that builds up continuously over time. Such a disconnect poses unique challenges for PHM. This paper presents a robotic monitoring system that captures and resolves this disconnect. This effort leverages supervisory robotic control and model checking with linear temporal logic (LTL), presenting them as a novel monitoring system for PHM. This methodology has been demonstrated in a MATLAB-based simulator for an industry inspired use-case in the context of PHM. Future work will use the methodology to develop adaptive, intelligent control strategies to evenly distribute wear on the joints of the robotic arms, maximizing the life of the system. PMID:28730154

  8. A Mode of Combined ERP and KMS Knowledge Management System Construction

    NASA Astrophysics Data System (ADS)

    Yuena, Kang; Yangeng, Wen; Qun, Zhou

    The core ideas of ERP and knowledge management are quite similar: both aim to deliver the appropriate knowledge (goods, funds) to the right people (positions) at the right time. It is therefore reasonable to believe that adding a knowledge management system to ERP will help companies achieve their goals better. This paper compares, at the methodology level, the knowledge management system and ERP from the logical viewpoint of Hall's three-dimensional structure, and finds that they are very similar in the time dimension, the logic dimension and the knowledge dimension. This lays the methodological basis for their simultaneous planning, implementation and application. A knowledge-based ERP multi-agent management system model is then proposed. Finally, the paper describes the process from planning to implementation of a knowledge-management ERP system with multi-agent interaction from three perspectives: management thinking, software and systems.

  9. Human action quality evaluation based on fuzzy logic with application in underground coal mining.

    PubMed

    Ionica, Andreea; Leba, Monica

    2015-01-01

    The work system is defined by its components, their roles and the relationships between them. Any work system gravitates around the human resource and the interdependencies between the human factor and the other components of the system. Researchers in this field agree that the human factor and its actions are difficult to quantify and predict. The objective of this paper is to apply a method of human action evaluation in order to estimate possible risks and prevent possible system faults, both at the human factor level and at the equipment level. In order to point out the importance of the human factor's influence on all the elements of working systems, we propose a fuzzy logic based methodology for the quality evaluation of human actions. This methodology has a multidisciplinary character, as it gathers ideas and methods from quality management, ergonomics, work safety and artificial intelligence. The results presented refer to a work system with a high degree of specificity, namely underground coal mining, and are valuable for a human resources risk evaluation pattern. The fuzzy logic evaluation of human actions leads to early detection of possible dangerous evolutions of the work system and alerts the persons in charge.

  10. Design and implementation of the tree-based fuzzy logic controller.

    PubMed

    Liu, B D; Huang, C Y

    1997-01-01

    In this paper, a tree-based approach is proposed to design the fuzzy logic controller. Based on the proposed methodology, the fuzzy logic controller has the following merits: the fuzzy control rules can be extracted automatically from the input-output data of the system, and the extraction process can be done in one pass; owing to the fuzzy tree inference structure, the search spaces of the fuzzy inference process are largely reduced; the operation of the inference process can be simplified to a one-dimensional matrix operation because of the fuzzy tree approach; and the controller has regular and modular properties, so it is easy to implement in hardware. Furthermore, the proposed fuzzy tree approach has been applied to design a color reproduction system for verifying the proposed methodology. The color reproduction system is mainly used to obtain a color image through the printer that is identical to the original one. In addition to the software simulation, an FPGA is used to implement the prototype hardware system for real-time application. Experimental results show that the effect of color correction is quite good and that the prototype hardware system can operate correctly at a 30 MHz clock rate.

  11. Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems

    NASA Astrophysics Data System (ADS)

    Abeynayake, Canicious; Tran, Minh D.

    2015-05-01

    Vehicle Mounted Metal Detector (VMMD) systems are widely used for detection of threat objects in humanitarian demining and military route clearance scenarios. Due to the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining the capability boundaries of any sensor-based demining equipment. Evaluation of sensor-based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders having different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD. This data evaluation methodology has been developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by implementing the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.

  12. The Otto-engine-equivalent vehicle concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowdy, M.W.; Couch, M.D.

    A vehicle comparison methodology based on the Otto-Engine Equivalent (OEE) vehicle concept is described. As an illustration of this methodology, the concept is used to make projections of the fuel economy potential of passenger cars using various alternative power systems. Sensitivities of OEE vehicle results to assumptions made in the calculational procedure are discussed. Factors considered include engine torque boundary, rear axle ratio, performance criteria, engine transient response, and transmission shift logic.

  13. Keys to Successful Implementation and Sustainment of Managed Maintenance for Healthcare Facilities

    DTIC Science & Technology

    2004-03-23

    second they involve studying those phenomena in all their complexity (Leedy and Ormrod, 2001). According to Denzin and Lincoln (1994), qualitative...people being studied (Leedy and Ormrod, 2001). Research Design: Methodological Triangulation. Denzin and Lincoln (1994) suggest because different...the setting. This dual view is referred to as methodological triangulation (Denzin and Lincoln, 1994). A research design develops a logical plan for

  14. Mapping the Mixed Methods–Mixed Research Synthesis Terrain

    PubMed Central

    Sandelowski, Margarete; Voils, Corrine I.; Leeman, Jennifer; Crandell, Jamie L.

    2012-01-01

    Mixed methods–mixed research synthesis is a form of systematic review in which the findings of qualitative and quantitative studies are integrated via qualitative and/or quantitative methods. Although methodological advances have been made, efforts to differentiate research synthesis methods have been too focused on methods and not focused enough on the defining logics of research synthesis—each of which may be operationalized in different ways—or on the research findings themselves that are targeted for synthesis. The conduct of mixed methods–mixed research synthesis studies may more usefully be understood in terms of the logics of aggregation and configuration. Neither logic is preferable to the other nor tied exclusively to any one method or to any one side of the qualitative/quantitative binary. PMID:23066379

  15. Complete all-optical processing polarization-based binary logic gates and optical processors.

    PubMed

    Zaghloul, Y A; Zaghloul, A R M

    2006-10-16

    We present a complete all-optical-processing polarization-based binary-logic system, by which any logic gate or processor can be implemented. Following the new polarization-based logic presented in [Opt. Express 14, 7253 (2006)], we develop a new parallel processing technique that allows for the creation of all-optical-processing gates that produce a unique output, either logic 1 or 0, only once in a truth table, and those that do not. This representation allows for the implementation of simple unforced OR, AND, XOR, XNOR, inverter, and, more importantly, NAND and NOR gates that can be used independently to represent any Boolean expression or function. In addition, the concept of a generalized gate is presented, which opens the door for reconfigurable optical processors and programmable optical logic gates. Furthermore, the new design is completely compatible with the old one presented in [Opt. Express 14, 7253 (2006)] and with current semiconductor-based devices. The gates can be cascaded, where the information is always on the laser beam. The polarization of the beam, and not its intensity, carries the information. The new methodology allows for the creation of multiple-input-multiple-output processors that implement, by themselves, any Boolean function, such as specialized or non-specialized microprocessors. Three all-optical architectures are presented: an orthoparallel optical logic architecture for all known and unknown binary gates, a single-branch architecture for only XOR and XNOR gates, and the railroad (RR) architecture for polarization optical processors (POP). All the control inputs are applied simultaneously, leading to a single time lag and hence a very fast and glitch-immune POP. A simple and easy-to-follow step-by-step algorithm is provided for the POP, and design reduction methodologies are briefly discussed. The algorithm lends itself systematically to software programming and computer-assisted design. As examples, designs of all binary gates, multiple-input gates, and sequential and non-sequential Boolean expressions are presented and discussed. The operation of each design is simply understood by a bullet train traveling at the speed of light on a railroad system preconditioned by the crossover states predetermined by the control inputs. The presented designs allow for optical processing of the information, eliminating the need to convert it back and forth to an electronic signal for processing purposes. All gates with a truth table, including for example Fredkin, Toffoli, testable reversible logic, and threshold logic gates, can be designed and implemented using the railroad architecture. That includes any future gates not known today. Those designs and the quantum gates are not discussed in this paper.
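
    As an illustration only of logic carried by polarization rather than intensity, a Jones-calculus sketch that encodes 0/1 as horizontal/vertical polarization and realizes NOT with a half-wave plate at 45 degrees; the encoding and the element choice are assumptions, not the paper's architecture:

        import numpy as np

        H = np.array([1.0, 0.0])    # assumed encoding: logic 0 as horizontal polarization
        V = np.array([0.0, 1.0])    # assumed encoding: logic 1 as vertical polarization

        # Jones matrix of a half-wave plate with its fast axis at 45 degrees: swaps H and V.
        HWP45 = np.array([[0.0, 1.0],
                          [1.0, 0.0]])

        def read_bit(jones):
            # Decode by polarization, not intensity: compare projections onto V and H.
            return int(abs(jones @ V) > abs(jones @ H))

        for bit, state in ((0, H), (1, V)):
            print(f"NOT {bit} = {read_bit(HWP45 @ state)}")   # NOT 0 = 1, NOT 1 = 0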

  16. The Methodology for Developing Mobile Agent Application for Ubiquitous Environment

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Kazutaka; Yoshioka, Nobukazu; Honiden, Shinichi

    This study provides a methodology enabling flexible and reusable development of mobile agent applications for a mobility-aware indoor environment. The methodology, named the Workflow-awareness model, is based on the concept of a pair of mobile agents cooperating to perform a given task. A monolithic mobile agent application with numerous concerns in a mobility-aware setting is divided into a master agent (MA) and a shadow agent (SA) according to the type of task. The MA executes the main application logic, which includes monitoring the user's physical movement and coordinating various services. The SA performs additional environment-dependent tasks to help the MA achieve efficient execution without losing application logic. "Workflow-awareness" (WFA) means that the SA knows the MA's execution state transitions, so that the SA can provide the proper task at the proper timing. A prototype implementation of the methodology makes practical use of AspectJ, which is used to automate WFA by weaving communication modules into both the MA and the SA. The usefulness of this methodology is analyzed with respect to its efficiency and software engineering aspects. As for efficiency, the overhead of WFA is small relative to the total execution time. From the software engineering view, WFA can provide a mechanism to deploy one application in various situations.

  17. The Otto-engine-equivalent vehicle concept

    NASA Technical Reports Server (NTRS)

    Dowdy, M. W.; Couch, M. D.

    1978-01-01

    A vehicle comparison methodology based on the Otto-Engine Equivalent (OEE) vehicle concept is described. As an illustration of this methodology, the concept is used to make projections of the fuel economy potential of passenger cars using various alternative power systems. Sensitivities of OEE vehicle results to assumptions made in the calculational procedure are discussed. Factors considered include engine torque boundary, rear axle ratio, performance criteria, engine transient response, and transmission shift logic.

  18. [A functional analysis of healthcare auditors' skills in Venezuela, 2008].

    PubMed

    Chirinos-Muñoz, Mónica S

    2010-10-01

    To use functional analysis to identify the basic, working, specific and generic skills and values which a health service auditor must have. The functional analysis technique was implemented with 10 experts, identifying specific, basic and generic skills and values by means of deductive logic. A functional map was obtained which started by establishing a key purpose based on improving healthcare and service quality, from which three key functions emerged. The main functions and skills units were then broken down into the competence elements defining what a health service auditor is able to do. This functional map (following functional analysis methodology) shows in detail the simple and complex tasks which a healthcare auditor should perform in the workplace, adopting a forward management approach for improving healthcare and health service quality. This methodology, based on logical-deductive consensus-building among experts, provides validated information for each element of the overall skills.

  19. [Research on the Application of Fuzzy Logic to Systems Analysis and Control]

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Research conducted with the support of NASA Grant NCC2-275 has been focused in the main on the development of fuzzy logic and soft computing methodologies and their applications to systems analysis and control, with emphasis on problem areas which are of relevance to NASA's missions. One of the principal results of our research has been the development of a new methodology called Computing with Words (CW). Basically, in CW words drawn from a natural language are employed in place of numbers for computing and reasoning. There are two major imperatives for computing with words. First, computing with words is a necessity when the available information is too imprecise to justify the use of numbers, and second, when there is a tolerance for imprecision which can be exploited to achieve tractability, robustness, low solution cost, and better rapport with reality. Exploitation of the tolerance for imprecision is an issue of central importance in CW.

  20. Reconciling pairs of concurrently used clinical practice guidelines using Constraint Logic Programming.

    PubMed

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken

    2011-01-01

    This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack.
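
    The constraint model itself is not reproduced in the abstract; a brute-force toy with hypothetical treatment variables shows the idea of excluding a point of contention from the joint solution space:

        from itertools import product

        # Hypothetical action variables for the two concurrently applied guidelines.
        ulcer_tx = ["PPI", "NSAID_stop"]        # duodenal ulcer guideline branches
        tia_tx = ["aspirin", "clopidogrel"]     # transient ischemic attack branches

        def no_contention(u, t):
            # Assumed point of contention: aspirin is adverse for an ulcer patient
            # unless the ulcer guideline's PPI branch is taken.
            return not (t == "aspirin" and u != "PPI")

        plans = [(u, t) for u, t in product(ulcer_tx, tia_tx) if no_contention(u, t)]
        print(plans)    # only the consistent joint treatment plans survive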

  1. Tracking and Control of Gas Turbine Engine Component Damage/Life

    NASA Technical Reports Server (NTRS)

    Jaw, Link C.; Wu, Dong N.; Bryg, David J.

    2003-01-01

    This paper describes damage mechanisms and methods of controlling damage to extend the on-wing life of critical gas turbine engine components. In particular, two types of damage mechanisms are discussed: creep/rupture and thermo-mechanical fatigue. To control these damage mechanisms and extend the life of engine hot-section components, we have investigated two methodologies to be implemented as additional control logic for the on-board electronic control unit. This new logic, the life-extending control (LEC), interacts with the engine control and monitoring unit and modifies the fuel flow to reduce component damage in a flight mission. The LEC methodologies were demonstrated in a real-time, hardware-in-the-loop simulation. The results show that LEC is not only a new paradigm for engine control design, but also a promising technology for extending the service life of engine components, hence reducing the life-cycle cost of the engine.

  2. Improving Unipolar Resistive Switching Uniformity with Cone-Shaped Conducting Filaments and Its Logic-In-Memory Application.

    PubMed

    Gao, Shuang; Liu, Gang; Chen, Qilai; Xue, Wuhong; Yang, Huali; Shang, Jie; Chen, Bin; Zeng, Fei; Song, Cheng; Pan, Feng; Li, Run-Wei

    2018-02-21

    Resistive random access memory (RRAM) with inherent logic-in-memory capability exhibits great potential for constructing beyond-von-Neumann computers. Unipolar RRAM is particularly promising because its single-polarity operation enables large-scale crossbar logic-in-memory circuits with the highest integration density and simpler peripheral control circuits. However, unipolar RRAM usually exhibits poor switching uniformity because of random activation of conducting filaments and consequently cannot meet the strict uniformity requirement for logic-in-memory application. In this contribution, a new methodology that constructs cone-shaped conducting filaments by using a chemically active metal cathode is proposed to improve unipolar switching uniformity. Such a metal cathode reacts spontaneously with the oxide switching layer to form an interfacial layer, which together with the metal cathode itself can act as a load resistor to prevent the overgrowth of conducting filaments and thus make them more cone-like. In this way, the rupture of conducting filaments can be strictly limited to the tip region, making their residual parts favorable locations for subsequent filament growth and thus suppressing their random regeneration. As such, a novel "one switch + one unipolar RRAM cell" hybrid structure is capable of realizing all 16 Boolean logic functions for large-scale logic-in-memory circuits.

  3. Conceptual and logical level of database modeling

    NASA Astrophysics Data System (ADS)

    Hunka, Frantisek; Matula, Jiri

    2016-06-01

    Conceptual and logical levels form the topmost levels of database modeling. Usually, ORM (Object Role Modeling) and ER diagrams are utilized to capture the corresponding schema. The final aim of business process modeling is to store its results in the form of a database solution. For this reason, value-oriented business process modeling, which utilizes ER diagrams to express the modeled entities and the relationships between them, is used. However, ER diagrams form the logical level of a database schema. To extend the possibilities of different business process modeling methodologies, the conceptual level of database modeling is needed. The paper deals with the REA value modeling approach to business process modeling using ER diagrams, and derives a conceptual model utilizing the ORM modeling approach. The conceptual model extends the possibilities for value modeling to other business modeling approaches.

  4. A novel architecture of non-volatile magnetic arithmetic logic unit using magnetic tunnel junctions

    NASA Astrophysics Data System (ADS)

    Guo, Wei; Prenat, Guillaume; Dieny, Bernard

    2014-04-01

    Complementary metal-oxide-semiconductor (CMOS) technology is facing increasingly difficult obstacles such as power consumption and interconnection delay. Novel hybrid technologies and architectures are being investigated with the aim to circumvent some of these limits. In particular, hybrid CMOS/magnetic technology based on magnetic tunnel junctions (MTJs) is considered as a very promising approach thanks to the full compatibility of MTJs with CMOS technology. By tightly merging the conventional electronics with magnetism, both logic and memory functions can be implemented in the same device. As a result, non-volatility is directly brought into logic circuits, yielding significant improvement of device performances and new functionalities as well. We have conceived an innovative methodology to construct non-volatile magnetic arithmetic logic units (MALUs) combining spin-transfer torque MTJs with MOS transistors. The present 4-bit MALU utilizes 4 MTJ pairs to store its operation code (opcode). Its operations and performances have been confirmed and evaluated through electrical simulations.

  5. Freight Transportation Energy Use : Volume 2. Methodology and Program Documentation.

    DOT National Transportation Integrated Search

    1978-07-01

    The structure and logic of the transportation network model component of the TSC Freight Energy Model are presented. The model assigns given origin-destination commodity flows to specific transport modes and routes, thereby determining the traffic lo...

  6. Approximate Reasoning: Past, Present, Future

    DTIC Science & Technology

    1990-06-27

    This note presents a personal view of the state of the art in the representation and manipulation of imprecise and uncertain information by automated ... processing systems. To contrast their objectives and characteristics with the sound deductive procedures of classical logic, methodologies developed

  7. Site selection for MSFC operational tests of solar heating and cooling systems

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The criteria, methodology, and sequence aspects of the site selection process are presented. This report organizes the logical thought process that should be applied to the site selection process, but final decisions are highly selective.

  8. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  9. Multi-criteria decision assessments using Subjective Logic: Methodology and the case of urban water strategies

    NASA Astrophysics Data System (ADS)

    Moglia, Magnus; Sharma, Ashok K.; Maheepala, Shiroma

    2012-07-01

    Summary: Planning of regional and urban water resources, in particular with Integrated Urban Water Management approaches, often considers inter-relationships between human uses of water, the health of the natural environment, and the cost of various management strategies. Decision makers hence typically need to consider a combination of social, environmental and economic goals. The types of strategies employed can include water efficiency measures, water sensitive urban design, stormwater management, or catchment management. Therefore, decision makers need to choose between different scenarios and to evaluate them against a number of criteria. This type of problem has a discipline devoted to it, Multi-Criteria Decision Analysis, which has often been applied in water management contexts. This paper describes the application of Subjective Logic in a basic Bayesian Network to a Multi-Criteria Decision Analysis problem. In doing so, it outlines a novel methodology that explicitly incorporates uncertainty and information reliability. The application of the methodology to a known case study context allows for exploration. By making the uncertainty and reliability of assessments explicit, the methodology allows the risks of various options to be assessed, which may help alleviate cognitive biases and move towards a well-formulated risk management policy.
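
    A minimal sketch of the Subjective Logic machinery the paper builds on: the binomial opinion (b, d, u, a), its probability expectation E = b + a*u, and cumulative fusion of two independent assessments (standard Josang formulas; the numbers are illustrative):

        from dataclasses import dataclass

        @dataclass
        class Opinion:
            # Binomial opinion: belief, disbelief, uncertainty, base rate (b + d + u = 1).
            b: float
            d: float
            u: float
            a: float = 0.5

            def expectation(self):
                # Probability expectation E = b + a * u.
                return self.b + self.a * self.u

        def fuse(o1, o2):
            # Cumulative fusion of two independent opinions (assumes equal base
            # rates and u1 + u2 > 0).
            k = o1.u + o2.u - o1.u * o2.u
            return Opinion((o1.b * o2.u + o2.b * o1.u) / k,
                           (o1.d * o2.u + o2.d * o1.u) / k,
                           (o1.u * o2.u) / k, o1.a)

        # Two illustrative expert assessments of "strategy meets the environmental criterion".
        f = fuse(Opinion(0.6, 0.1, 0.3), Opinion(0.4, 0.2, 0.4))
        print(f"fused: b={f.b:.2f} d={f.d:.2f} u={f.u:.2f} E={f.expectation():.2f}")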

  10. Risk methodology overview. [for carbon fiber release

    NASA Technical Reports Server (NTRS)

    Credeur, K. R.

    1979-01-01

    Some considerations of risk estimation, how risk is measured, and how risk analysis decisions are made are discussed. Specific problems of carbon fiber release are discussed by reviewing the objective, describing the main elements, and giving an example of the risk logic and outputs.

  11. Program logic: a framework for health program design and evaluation - the Pap nurse in general practice program.

    PubMed

    Hallinan, Christine M

    2010-01-01

    In this paper, program logic will be used to 'map out' the planning, development and evaluation of the general practice Pap nurse program in the Australian general practice arena. The incorporation of program logic into the evaluative process supports a greater appreciation of the theoretical assumptions and external influences that underpin general practice Pap nurse activity. The creation of a program logic model is a conscious strategy that results in an explicit understanding of the challenges ahead, the resources available and the time frames for outcomes. Program logic also enables recognition that all players in the general practice arena need to be acknowledged by policy makers, bureaucrats and program designers when addressing, through policy, issues relating to equity and accessibility of health initiatives. Logic modelling allows decision makers to consider the complexities of causal associations when developing health care proposals and programs. It enables the Pap nurse in general practice program to be represented diagrammatically by linking outcomes (short, medium and long term) with both the program activities and program assumptions. The research methodology used in the evaluation of the Pap nurse in general practice program includes a descriptive study design and the incorporation of program logic, with a retrospective analysis of Australian data from 2001 to 2009. For the purposes of gaining both empirical and contextual data for this paper, a data set analysis and literature review were performed. The application of program logic as an evaluative tool for analysis of the Pap PN incentive program facilitates a greater understanding of complex general practice activity triggers, and allows this greater understanding to be incorporated into policy to facilitate Pap PN activity, increase general practice cervical smear rates and ultimately decrease the burden of disease.

  12. Development of Boolean calculus and its applications. [digital systems design

    NASA Technical Reports Server (NTRS)

    Tapia, M. A.

    1980-01-01

    The development of Boolean calculus for its application to digital system design methodologies that would reduce system complexity, size, cost, speed, power requirements, etc., is discussed. Synthesis procedures for logic circuits are examined, particularly for asynchronous circuits using clock-triggered flip-flops.

  13. Development of fuzzy air quality index using soft computing approach.

    PubMed

    Mandal, T; Gorai, A K; Pathak, G

    2012-10-01

    Proper assessment of air quality status in an atmosphere based on limited observations is an essential task for meeting the goals of environmental management. A number of classification methods are available for estimating the changing status of air quality. However, a discrepancy frequently arises from the air quality criteria employed and the vagueness or fuzziness embedded in the decision-making output values. Owing to inherent imprecision, difficulties always exist in some conventional methodologies, like the air quality index, when describing integrated air quality conditions with respect to various pollutant parameters and times of exposure. In recent years, fuzzy logic-based methods have been demonstrated to be appropriate for addressing uncertainty and subjectivity in environmental issues. In the present study, a methodology based on fuzzy inference systems (FIS) to assess air quality is proposed. This paper presents a comparative study assessing the status of air quality using the fuzzy logic technique and the conventional technique. The findings clearly indicate that the FIS may successfully harmonize inherent discrepancies and interpret complex conditions.
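
    A minimal Mamdani-style sketch of such an FIS with a single pollutant input and triangular memberships; all breakpoints and rule consequents are assumptions for illustration:

        def tri(x, a, b, c):
            # Triangular membership function with feet a, c and peak b.
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def fuzzy_aqi(pm25):
            # Toy single-input FIS: PM2.5 (ug/m3) -> air-quality score in [0, 100].
            mu = {"good":     tri(pm25, -1, 0, 35),
                  "moderate": tri(pm25, 25, 60, 100),
                  "poor":     tri(pm25, 80, 150, 1e9)}
            score = {"good": 90.0, "moderate": 55.0, "poor": 15.0}  # rule consequents
            total = sum(mu.values())
            # Weighted-average defuzzification of the fired rules.
            return sum(mu[k] * score[k] for k in mu) / total if total else None

        for x in (10, 50, 120):
            print(f"PM2.5={x:>3} -> air-quality score {fuzzy_aqi(x):.1f}")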

  14. Evaluation models of some morphological characteristics for talent scouting in sport.

    PubMed

    Rogulj, Nenad; Papić, Vladan; Cavala, Marijana

    2009-03-01

    In this paper, for the purpose of expert system evaluation within the scientific project "Talent scouting in sport", two methodological approaches for recognizing an athlete's morphological compatibility with various sports have been presented, evaluated and compared. The first approach is based on fuzzy logic and expert opinion about the compatibility of proposed hypothetical morphological models for 14 different sports which are part of the expert system. The second approach is based on determining the differences between the morphological characteristics of a tested individual and the morphological characteristics of top athletes in a particular sport. The logical and mathematical bases of both methodological approaches have been explained in detail. High prognostic efficiency in recognition of an individual's sport has been determined. Some improvements for further development of both methods have been proposed. The results of the research so far suggest that this or similar approaches can be successfully used for detection of an individual's morphological compatibility with different sports. It is also expected to be useful in the selection of young talents for a particular sport.

  15. Reconciling Pairs of Concurrently Used Clinical Practice Guidelines Using Constraint Logic Programming

    PubMed Central

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken

    2011-01-01

    This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack. PMID:22195153

  16. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery, and has been successfully implemented in the Deep Impact spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notions of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized as no-opinion, acceptable, or unacceptable. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of a Symptom and its mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized step-by-step fashion, relegating more system-level responses to later tier(s). Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, a MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. This methodology is systematic and logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a "top-down" fault-tree analysis and a "bottom-up" functional failure-modes-and-effects analysis. Via this process, the mitigation and recovery strategies per Fault Containment Region scope (in width versus depth) the FP architecture.
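
    A skeletal sketch of the described Monitor -> Symptom/Alarm -> tiered Response flow, with hypothetical telemetry, actions, and thresholds; the MaxRetry gate escalates from a local tier to a system-level tier:

        def monitor_battery(telemetry):
            # Monitor -> opinion: unacceptable if bus voltage sags (threshold assumed).
            return "unacceptable" if telemetry["bus_v"] < 24.0 else "acceptable"

        def execute(action, telemetry):
            # Hypothetical recovery actions; only the backup bus restores voltage.
            if action == "swap_to_backup_bus":
                telemetry = dict(telemetry, bus_v=28.0)
            return telemetry

        def run_fp(telemetry, max_retry=2):
            # Skeleton of the Monitor -> Symptom -> Alarm -> tiered Response flow.
            tiers = ["reset_charger", "swap_to_backup_bus"]  # local tier, then system tier
            retries, tier = 0, 0
            while monitor_battery(telemetry) == "unacceptable":
                alarm = "LOW_BUS_VOLTAGE"                    # symptom mapped to an alarm
                print(f"{alarm}: response tier {tier}: {tiers[tier]}")
                telemetry = execute(tiers[tier], telemetry)
                retries += 1
                if retries > max_retry and tier + 1 < len(tiers):
                    tier, retries = tier + 1, 0              # MaxRetry gate: escalate
            return telemetry

        print(run_fp({"bus_v": 22.5}))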

  17. [Methodologic inconsistency in anamnesis education at medical schools].

    PubMed

    Zago, M A

    1989-01-01

    Some relevant points of the process of obtaining the medical anamnesis and physical examination, and of formulating diagnostic hypotheses, are analyzed. The main methodological features include: preponderance of qualitative data, absence of preselected hypotheses, direct involvement of the observer (physician) with the data source (patient), and selection of hypotheses and changes in the patient during the process. Thus, diagnostic investigation does not follow the paradigm of the quantitative scientific method, rooted in logical positivism, which dominates medical research and education.

  18. Navigating a Mobile Robot Across Terrain Using Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun; Howard, Ayanna; Bon, Bruce

    2003-01-01

    A strategy for autonomous navigation of a robotic vehicle across hazardous terrain involves the use of a measure of terrain traversability within a fuzzy-logic conceptual framework. This navigation strategy requires no a priori information about the environment. Fuzzy logic was selected as a basic element of this strategy because it provides a formal methodology for representing and implementing a human driver's heuristic knowledge and operational experience. Within a fuzzy-logic framework, the attributes of human reasoning and decision-making can be formulated by simple IF (antecedent), THEN (consequent) rules coupled with easily understandable and natural linguistic representations. The linguistic values in the rule antecedents convey the imprecision associated with measurements taken by sensors onboard a mobile robot, while the linguistic values in the rule consequents represent the vagueness inherent in the reasoning processes that generate the control actions. The operational strategies of an expert human driver can be transferred, via fuzzy logic, to a robot-navigation strategy in the form of a set of simple conditional statements composed of linguistic variables. These linguistic variables are defined by fuzzy sets in accordance with user-defined membership functions. The main advantages of a fuzzy navigation strategy lie in the ability to extract heuristic rules from human experience and to obviate the need for an analytical model of the robot navigation process.

  19. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  20. Medical equipment classification: method and decision-making support based on paraconsistent annotated logic.

    PubMed

    Oshiyama, Natália F; Bassani, Rosana A; D'Ottaviano, Itala M L; Bassani, José W M

    2012-04-01

    As technology evolves, the role of medical equipment in the healthcare system, as well as technology management, becomes more important. Although the existence of large databases containing management information is currently common, extracting useful information from them is still difficult. A useful tool for identifying frequently failing equipment, which increases maintenance cost and downtime, would be classification according to corrective maintenance data. Nevertheless, the establishment of classes may create inconsistencies, since an item may be equally close to two classes. Paraconsistent logic might help solve this problem, as it allows the existence of inconsistent (contradictory) information without trivialization. In this paper, a methodology for medical equipment classification based on the ABC analysis of corrective maintenance data is presented, and complemented with a paraconsistent annotated logic analysis, which may enable the decision maker to take into consideration alerts created by the identification of inconsistencies and indeterminacies in the classification.
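
    A rough Python sketch of how such an analysis can be layered on an ABC classification follows. The cut-offs, tolerance, and the use of the standard certainty and contradiction degrees of paraconsistent annotated logic are assumptions for illustration, not the paper's exact procedure.

        # Sketch: ABC class from cumulative maintenance-cost share, plus a
        # paraconsistent-style annotation (mu = favorable, lam = contrary evidence).
        def abc_class(cost_share):
            if cost_share >= 0.70:
                return "A"
            return "B" if cost_share >= 0.20 else "C"

        def para_analyze(mu, lam, tol=0.25):
            certainty = mu - lam            # > 0 favors membership in the class
            contradiction = mu + lam - 1.0  # near +1: conflict; near -1: too little evidence
            if abs(contradiction) > tol:
                return "alert: inconsistency/indeterminacy, refer to decision maker"
            return "member" if certainty > 0 else "non-member"

        print(abc_class(0.75), para_analyze(0.9, 0.8))  # class A, but contradictory evidence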

  1. Fuzzy set methods for object recognition in space applications

    NASA Technical Reports Server (NTRS)

    Keller, James M.

    1991-01-01

    During the reporting period, the development of the theory and application of methodologies for decision making under uncertainty was addressed. Two subreports are included: the first on properties of general hybrid operators, the second on new research into generalized threshold logic units. In the first part, the properties of the additive gamma-model, where the intersection part is taken to be the product of the input values and the union part is obtained by an extension of De Morgan's law to fuzzy sets, are explored. Then Yager's class of union and intersection is used in the additive gamma-model. The inputs are weighted to some power that represents their importance and thus their contribution to the compensation process. In the second part, the extension of binary logic synthesis methods to multiple-valued logic, enabling the synthesis of decision networks when the input/output variables are not binary, is discussed.
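
    The additive gamma-model described above has a compact closed form; here is a Python sketch using the product for the intersection part, De Morgan's law for the union part, and importance weights as exponents. The Yager-class variant is omitted, and the sample values are invented.

        # Additive gamma-model (compensatory hybrid operator): gamma in [0, 1]
        # blends the intersection (product) and union (De Morgan) parts.
        import math

        def gamma_model(memberships, weights, gamma):
            inter = math.prod(m ** w for m, w in zip(memberships, weights))
            union = 1.0 - math.prod((1.0 - m) ** w for m, w in zip(memberships, weights))
            return (1.0 - gamma) * inter + gamma * union

        # With gamma = 0.4 the result (0.656) lies between pure AND (0.48) and pure OR (0.92).
        print(gamma_model([0.8, 0.6], [1.0, 1.0], 0.4))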

  2. Assurance of Complex Electronics. What Path Do We Take?

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

    Many of the methods used to develop software bear a close resemblance to Complex Electronics (CE) development. CE are now programmed to perform tasks that were previously handled in software, such as communication protocols. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of "software-like" bugs such as incorrect design, faulty logic, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications to develop these devices. By using standardized S/W Engineering methods such as checklists, missing requirements and "bugs" can be detected earlier in the development cycle, thus creating a development process for CE that is easily maintained and configurable based on the device used.

  3. Fuzzy logic and image processing techniques for the interpretation of seismic data

    NASA Astrophysics Data System (ADS)

    Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Urrutia-Fucugauchi, J.; Rodríguez-Castellanos, A.

    2011-06-01

    Since interpretation of seismic data is usually a tedious and repetitive task, the ability to do so automatically or semi-automatically has become an important objective of recent research. We believe that the vagueness and uncertainty in the interpretation process make fuzzy logic an appropriate tool for dealing with seismic data. In this work we developed a semi-automated fuzzy inference system to detect the internal architecture of a mass transport complex (MTC) in seismic images. We propose that the observed characteristics of an MTC can be expressed as fuzzy if-then rules consisting of linguistic values associated with fuzzy membership functions. The construction of the fuzzy inference system and the various image processing techniques employed are presented. We conclude that this is a well-suited problem for fuzzy logic, since the application of the proposed methodology yields a semi-automatically interpreted MTC which closely resembles the MTC from expert manual interpretation.

  4. Put Me in Coach: A Commentary on the RPSD Exchange.

    ERIC Educational Resources Information Center

    Hardman, Michael L.

    2003-01-01

    This commentary discusses principles concerning requirements for scientifically based research under No Child Left Behind: scientific inquiry begins with important research questions, not specific methodologies; the logic that scientifically based research equates with randomized controlled trials will result in research and practice disconnects;…

  5. Following Watery Relations in Early Childhood Pedagogies

    ERIC Educational Resources Information Center

    Pacini-Ketchabaw, Veronica; Clark, Vanessa

    2016-01-01

    Working methodologically and theoretically with the hydro-logics of bodies of water, this article addresses the limitations of humanistic perspectives on water play in early childhood classrooms, and proposes pedagogies of watery relations. The article traces the fluid, murky, surging, creative, unpredictable specificities of bodies of water that…

  6. Facilitating Coherence across Qualitative Research Papers

    ERIC Educational Resources Information Center

    Chenail, Ronald J.; Duffy, Maureen; St. George, Sally; Wulff, Dan

    2011-01-01

    Bringing the various elements of qualitative research papers into coherent textual patterns presents challenges for authors and editors alike. Although individual sections such as presentation of the problem, review of the literature, methodology, results, and discussion may each be constructed in a sound logical and structural sense, the…

  7. A Graph Summarization Algorithm Based on RFID Logistics

    NASA Astrophysics Data System (ADS)

    Sun, Yan; Hu, Kongfa; Lu, Zhipeng; Zhao, Li; Chen, Ling

    Radio Frequency Identification (RFID) applications are set to play an essential role in object tracking and supply chain management systems. The volume of data generated by a typical RFID application will be enormous, as each item generates a complete history of all the individual locations that it occupied at every point in time. The movement trails of such RFID data form a gigantic commodity flowgraph representing the locations and durations of the path stages traversed by each item. In this paper, we use a graph to construct a warehouse of RFID commodity flows and introduce a database-style operation to summarize graphs, which produces a summary graph by grouping nodes based on user-selected node attributes and further allows users to control the hierarchy of summaries. This cuts down the size of the graphs and lets users study only the condensed graph they are interested in. Through extensive experiments, we demonstrate the effectiveness and efficiency of the proposed method.
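
    A minimal Python sketch of the grouping operation, using networkx, is given below. The node attribute and the toy graph are invented, and the hierarchical drill-down over summaries that the paper supports is omitted.

        # Sketch: summarize a flow graph by grouping nodes on a user-selected attribute;
        # edges between groups carry the aggregated item-flow counts.
        import networkx as nx
        from collections import Counter

        def summarize(G, attr):
            group = {n: d.get(attr, "?") for n, d in G.nodes(data=True)}
            edge_counts = Counter((group[u], group[v]) for u, v in G.edges())
            S = nx.DiGraph()
            for (gu, gv), c in edge_counts.items():
                S.add_edge(gu, gv, weight=c)
            return S

        G = nx.DiGraph()
        G.add_node("loc1", kind="warehouse")
        G.add_node("loc2", kind="store")
        G.add_node("loc3", kind="store")
        G.add_edge("loc1", "loc2")
        G.add_edge("loc1", "loc3")
        print(list(summarize(G, "kind").edges(data=True)))  # [('warehouse', 'store', {'weight': 2})]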

  8. Intelligent control of a multi-degree-of freedom reaction compensating platform system using fuzzy logic

    NASA Technical Reports Server (NTRS)

    Choi, Benjamin B.; Lawrence, Charles; Lin, Yueh-Jaw

    1994-01-01

    This paper presents the development of a general-purpose fuzzy logic (FL) control methodology for isolating the external vibratory disturbances of space-based devices. According to the desired performance specifications, a full investigation of the development of an FL controller was carried out using different scenarios, such as variations of the passive reaction-compensating components and of the external disturbance load. It was shown that the proposed FL controller is robust in that the FL-controlled system closely follows the prespecified ideal reference model. The comparative study also reveals that the FL-controlled system achieves significant improvement in reducing vibrations over passive systems.

  9. A new methodology to integrate planetary quarantine requirements into mission planning, with application to a Jupiter orbiter

    NASA Technical Reports Server (NTRS)

    Howard, R. A.; North, D. W.; Pezier, J. P.

    1975-01-01

    A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
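
    The sequential-decision reformulation can be illustrated with a textbook expected-loss rollback of a small decision tree. The probabilities and loss values below are invented for illustration; the paper's actual allocation involves far richer outcome structures.

        # Toy rollback of a decision tree: chance nodes average over outcomes,
        # decision nodes pick the branch with the lowest expected loss.
        def rollback(node):
            kind, payload = node
            if kind == "leaf":
                return payload  # terminal loss value
            if kind == "chance":
                return sum(p * rollback(child) for p, child in payload)
            return min(rollback(child) for child in payload)  # decision node

        # "Contamination" is one possible outcome, with a large loss attached.
        sterilize = ("chance", [(0.999, ("leaf", 10)), (0.001, ("leaf", 1000))])
        no_steril = ("chance", [(0.990, ("leaf", 0)), (0.010, ("leaf", 1000))])
        print(rollback(("decision", [sterilize, no_steril])))  # 10.0: lower expected loss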

  10. Mutual Relevance of Mainstream and Cross-Cultural Psychology.

    ERIC Educational Resources Information Center

    Clark, Lee Anna

    1987-01-01

    Asserts that mainstream and cross-cultural psychology address many of the same basic issues and that cross-cultural studies may be a direct and logical extension of the search for causes of variation in human psychology and psychopathology. Discusses differences in theoretical orientation and methodological approach and barriers to communication…

  11. Resignifying the Negative Space: Troubling the Representation of Learning

    ERIC Educational Resources Information Center

    Fendler, Rachel

    2017-01-01

    Informed by the results of a collaborative project carried out with six secondary school students, this paper reflects on the methodological and epistemological issues related to the representation of informal learning practices. Borrowing a concept from the arts, I suggest that a representationalist logic in both schooling and educational…

  12. The institutional logic of integrated care: an ethnography of patient transitions.

    PubMed

    Shaw, James A; Kontos, Pia; Martin, Wendy; Victor, Christina

    2017-03-20

    Purpose: The purpose of this paper is to use theories of institutional logics and institutional entrepreneurship to examine how and why macro-, meso-, and micro-level influences inter-relate in the implementation of integrated transitional care out of hospital in the English National Health Service. Design/methodology/approach: The authors conducted an ethnographic case study of a hospital and surrounding services within a large urban centre in England. Specific methods included qualitative interviews with patients/caregivers, health/social care providers, and organizational leaders; observations of hospital transition planning meetings, community "hub" meetings, and other instances of transition planning; reviews of patient records; and analysis of key policy documents. Analysis was iterative and informed by theory on institutional logics and institutional entrepreneurship. Findings: Organizational leaders at the meso-level of health and social care promoted a partnership logic of integrated care in response to conflicting institutional ideas found within a key macro-level policy enacted in 2003 (The Community Care (Delayed Discharges) Act). Through institutional entrepreneurship at the micro-level, the partnership logic became manifest in the form of relationship work among health and social care providers; they sought to build strong interpersonal relationships to enact more integrated transitional care. Originality/value: This study has three key implications. First, efforts to promote integrated care should strategically include institutional entrepreneurs at the organizational and clinical levels. Second, integrated care initiatives should emphasize relationship-building among health and social care providers. Finally, theoretical development on institutional logics should further examine the role of interpersonal relationships in facilitating the "spread" of logics between macro-, meso-, and micro-level influences on inter-organizational change.

  13. A new systematic and quantitative approach to characterization of surface nanostructures using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Al-Mousa, Amjed A.

    Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the characteristics of the devices in which they are used has drawn substantial attention to advanced film-characterization techniques. In this work, we present a new approach to characterizing the surface nanostructures of thin films by isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as the Atomic Force Microscopy (AFM) data presented here. The methodology starts by compensating the AFM data for some specific classes of measurement artifacts. After that, it employs two distinct techniques. The first, which we call the overlay technique, systematically processes the raster data that constitute the scanning probe image in both the vertical and horizontal directions, classifying points in each direction separately. The results from the horizontal and vertical subsets are then overlaid, and a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. We also present a case study demonstrating the effectiveness of our methodology in quantitatively identifying the particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface. A comparison with other techniques, such as thresholding, watershed, and edge detection, is presented next. Finally, we present a systematic study of the fuzzy logic technique by experimenting with synthetic data. These results are discussed and compared, along with the challenges of the two techniques.
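
    The overlay idea, classifying points along the horizontal and vertical scan directions independently and then combining the two decisions, can be sketched as follows; simple per-direction thresholding stands in for the per-direction classification rules, which the abstract does not detail.

        # Sketch of the overlay technique: a horizontal pass, a vertical pass,
        # and a final decision where both directions agree.
        import numpy as np

        def overlay_classify(height, margin):
            rows = height > (height.mean(axis=1, keepdims=True) + margin)  # horizontal pass
            cols = height > (height.mean(axis=0, keepdims=True) + margin)  # vertical pass
            return rows & cols  # overlay: keep points flagged in both directions

        z = np.zeros((5, 5))
        z[2, 2] = 1.0  # a single bump on a flat surface
        print(overlay_classify(z, 0.1).astype(int))  # only the (2, 2) point is flagged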

  14. De/colonizing methodologies in science education: rebraiding research theory-practice-ethics with Indigenous theories and theorists

    NASA Astrophysics Data System (ADS)

    Higgins, Marc; Kim, Eun-Ji Amy

    2018-02-01

    The purpose of this article is to differentially engage in the work of thinking with Indigenous theorists and theories with decolonizing science education research methodologies in mind. As a rejoinder to Tracey McMahon, Emily Griese, and DenYelle Baete Kenyon's Cultivating Native American scientists: An application of an Indigenous model to an undergraduate research experience, we extend the notion of educationally centering Indigenous processes, pedagogies, and protocols by considering methodology a site in which (neo-)colonial logics often linger. We suggest that (re)designing methodology with Indigenous theorists and theories is an important act of resistance, refusal, and resignification; we demonstrate this significance through braiding together narratives of our engagement in this task and provide insights as to what is produced or producible.

  15. An experimental comparison of fuzzy logic and analytic hierarchy process for medical decision support systems.

    PubMed

    Uzoka, Faith-Michael Emeka; Obot, Okure; Barker, Ken; Osuji, J

    2011-07-01

    The task of medical diagnosis is a complex one, considering the level of vagueness and uncertainty that must be managed, especially when the disease has multiple symptoms. A number of researchers have utilized the fuzzy-analytic hierarchy process (fuzzy-AHP) methodology in handling imprecise data in medical diagnosis and therapy. Fuzzy logic is able to handle vagueness and unstructuredness in decision making, while the AHP has the ability to carry out pairwise comparison of decision elements in order to determine their importance in the decision process. This study attempts a case comparison of the fuzzy and AHP methods in the development of a medical diagnosis system, which involves basic symptom elicitation and analysis. Data collected from 30 malaria patients were diagnosed using AHP and fuzzy logic independently of one another. The results were compared and found to covary strongly; it was also discovered that the fuzzy logic diagnosis results covary somewhat more strongly with the conventional diagnosis results than those of the AHP. Overall, the study indicates a non-statistically-significant relative superiority of the fuzzy technology over the AHP technology. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
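
    On the AHP side of such a comparison, priority weights are conventionally obtained from a pairwise-comparison matrix via its principal eigenvector. A Python sketch with an invented three-symptom judgment matrix (the fuzzy side is not shown):

        # AHP priority weights by power iteration, plus Saaty's consistency index.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],   # invented pairwise judgments among
                      [1/3, 1.0, 2.0],   # three symptoms
                      [1/5, 1/2, 1.0]])

        w = np.ones(A.shape[0])
        for _ in range(50):          # power iteration converges to the principal
            w = A @ w                # eigenvector for a positive matrix
            w /= w.sum()
        lam_max = (A @ w / w).mean()                    # lambda_max estimate
        CI = (lam_max - A.shape[0]) / (A.shape[0] - 1)  # consistency index
        print(np.round(w, 3), round(CI, 4))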

  16. Reconfigurable logic in nanosecond Cu/GeTe/TiN filamentary memristors for energy-efficient in-memory computing.

    PubMed

    Jin, Miaomiao; Cheng, Long; Li, Yi; Hu, Siyu; Lu, Ke; Chen, Jia; Duan, Nian; Wang, Zhuorui; Zhou, Yaxiong; Chang, Ting-Chang; Miao, Xiangshui

    2018-06-27

    Owing to the capability of integrating information storage and computing in the same physical location, in-memory computing with memristors has become a research hotspot as a promising route to a non-von Neumann architecture. However, it is still a challenge to develop high-performance devices as well as optimized logic methodologies to realize energy-efficient computing. Herein, a filamentary Cu/GeTe/TiN memristor is reported that shows satisfactory properties, with nanosecond switching speed (<60 ns), low-voltage operation (<2 V), high endurance (>10⁴ cycles) and good retention (>10⁴ s at 85 °C). It is revealed that the charge-carrier conduction mechanisms in the high-resistance and low-resistance states are Schottky emission and hopping transport between adjacent Cu clusters, respectively, based on the analysis of current-voltage behaviors and resistance-temperature characteristics. An intuitive picture is given to describe the dynamic processes of resistive switching. Moreover, based on the basic material implication (IMP) logic circuit, we proposed a reconfigurable logic method and experimentally implemented IMP, NOT, OR, and COPY logic functions. The design of a one-bit full adder with a reduction in computational sequences, and its validation in simulation, further demonstrate the potential for practical application. The results provide important progress towards understanding the resistive switching mechanism and realizing an energy-efficient in-memory computing architecture. © 2018 IOP Publishing Ltd.
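
    At the Boolean level, material implication together with a FALSE operation is functionally complete, which is what makes IMP-based circuits reconfigurable into the other gates. A Python sketch of the identities follows; the stateful memristor implementation, in which the result overwrites one operand, is abstracted away.

        # Building NOT, OR and COPY from material implication (IMP).
        def IMP(p, q):   # p -> q  ==  (not p) or q
            return (not p) or q

        def NOT(p):      # not p  ==  p -> FALSE
            return IMP(p, False)

        def OR(p, q):    # p or q  ==  (not p) -> q
            return IMP(NOT(p), q)

        def COPY(p):     # copy p  ==  not (not p)
            return NOT(NOT(p))

        for p in (False, True):
            assert NOT(p) == (not p) and COPY(p) == p
            for q in (False, True):
                assert OR(p, q) == (p or q)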

  17. A discriminative method for family-based protein remote homology detection that combines inductive logic programming and propositional models

    PubMed Central

    2011-01-01

    Background: Remote homology detection is a hard computational problem. Most approaches have trained computational models by using either full protein sequences or multiple sequence alignments (MSA), including all positions. However, when we deal with proteins in the "twilight zone" we can observe that only some segments of sequences (motifs) are conserved. We introduce a novel logical representation that allows us to represent physico-chemical properties of sequences, conserved amino acid positions and conserved physico-chemical positions in the MSA. From this, Inductive Logic Programming (ILP) finds the most frequent patterns (motifs) and uses them to train propositional models, such as decision trees and support vector machines (SVM). Results: We use the SCOP database to perform our experiments by evaluating protein recognition within the same superfamily. Our results show that our methodology, when using SVM, performs significantly better than some of the state-of-the-art methods, and comparably to others. However, our method provides a comprehensible set of logical rules that can help to understand what determines a protein function. Conclusions: The strategy of selecting only the most frequent patterns is effective for remote homology detection. This is possible through a suitable first-order logical representation of homologous properties, and through a set of frequent patterns, found by an ILP system, that summarizes essential features of protein functions. PMID:21429187

  18. A discriminative method for family-based protein remote homology detection that combines inductive logic programming and propositional models.

    PubMed

    Bernardes, Juliana S; Carbone, Alessandra; Zaverucha, Gerson

    2011-03-23

    Remote homology detection is a hard computational problem. Most approaches have trained computational models by using either full protein sequences or multiple sequence alignments (MSA), including all positions. However, when we deal with proteins in the "twilight zone" we can observe that only some segments of sequences (motifs) are conserved. We introduce a novel logical representation that allows us to represent physico-chemical properties of sequences, conserved amino acid positions and conserved physico-chemical positions in the MSA. From this, Inductive Logic Programming (ILP) finds the most frequent patterns (motifs) and uses them to train propositional models, such as decision trees and support vector machines (SVM). We use the SCOP database to perform our experiments by evaluating protein recognition within the same superfamily. Our results show that our methodology, when using SVM, performs significantly better than some of the state-of-the-art methods, and comparably to others. However, our method provides a comprehensible set of logical rules that can help to understand what determines a protein function. The strategy of selecting only the most frequent patterns is effective for remote homology detection. This is possible through a suitable first-order logical representation of homologous properties, and through a set of frequent patterns, found by an ILP system, that summarizes essential features of protein functions.

  19. Fuzzy Current-Mode Control and Stability Analysis

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2000-01-01

    In this paper a current-mode control (CMC) methodology is developed for a buck converter by using a fuzzy logic controller. Conventional CMC methodologies are based on lead-lag compensation with voltage and inductor current feedback. In this paper the converter lead-lag compensation will be substituted with a fuzzy controller. A small-signal model of the fuzzy controller will also be developed in order to examine the stability properties of this buck converter control system. The paper develops an analytical approach, introducing fuzzy control into the area of CMC.

  20. Nanowire nanocomputer as a finite-state machine.

    PubMed

    Yao, Jun; Yan, Hao; Das, Shamik; Klemic, James F; Ellenbogen, James C; Lieber, Charles M

    2014-02-18

    Implementation of complex computer circuits assembled from the bottom up and integrated on the nanometer scale has long been a goal of electronics research. It requires a design and fabrication strategy that can address individual nanometer-scale electronic devices, while enabling large-scale assembly of those devices into highly organized, integrated computational circuits. We describe how such a strategy has led to the design, construction, and demonstration of a nanoelectronic finite-state machine. The system was fabricated using a design-oriented approach enabled by a deterministic, bottom-up assembly process that does not require individual nanowire registration. This methodology allowed construction of the nanoelectronic finite-state machine through modular design using a multitile architecture. Each tile/module consists of two interconnected crossbar nanowire arrays, with each cross-point consisting of a programmable nanowire transistor node. The nanoelectronic finite-state machine integrates 180 programmable nanowire transistor nodes in three tiles or six total crossbar arrays, and incorporates both sequential and arithmetic logic, with extensive intertile and intratile communication that exhibits rigorous input/output matching. Our system realizes the complete 2-bit logic flow and clocked control over state registration that are required for a finite-state machine or computer. The programmable multitile circuit was also reprogrammed to a functionally distinct 2-bit full adder with 32-set matched and complete logic output. These steps forward and the ability of our unique design-oriented deterministic methodology to yield more extensive multitile systems suggest that proposed general-purpose nanocomputers can be realized in the near future.

  1. Nanowire nanocomputer as a finite-state machine

    PubMed Central

    Yao, Jun; Yan, Hao; Das, Shamik; Klemic, James F.; Ellenbogen, James C.; Lieber, Charles M.

    2014-01-01

    Implementation of complex computer circuits assembled from the bottom up and integrated on the nanometer scale has long been a goal of electronics research. It requires a design and fabrication strategy that can address individual nanometer-scale electronic devices, while enabling large-scale assembly of those devices into highly organized, integrated computational circuits. We describe how such a strategy has led to the design, construction, and demonstration of a nanoelectronic finite-state machine. The system was fabricated using a design-oriented approach enabled by a deterministic, bottom–up assembly process that does not require individual nanowire registration. This methodology allowed construction of the nanoelectronic finite-state machine through modular design using a multitile architecture. Each tile/module consists of two interconnected crossbar nanowire arrays, with each cross-point consisting of a programmable nanowire transistor node. The nanoelectronic finite-state machine integrates 180 programmable nanowire transistor nodes in three tiles or six total crossbar arrays, and incorporates both sequential and arithmetic logic, with extensive intertile and intratile communication that exhibits rigorous input/output matching. Our system realizes the complete 2-bit logic flow and clocked control over state registration that are required for a finite-state machine or computer. The programmable multitile circuit was also reprogrammed to a functionally distinct 2-bit full adder with 32-set matched and complete logic output. These steps forward and the ability of our unique design-oriented deterministic methodology to yield more extensive multitile systems suggest that proposed general-purpose nanocomputers can be realized in the near future. PMID:24469812

  2. Formalization of software requirements for information systems using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Yegorov, Y. S.; Milov, V. R.; Kvasov, A. S.; Sorokoumova, S. N.; Suvorova, O. V.

    2018-05-01

    The paper considers an approach to the design of information systems based on flexible software development methodologies. The possibility of improving the management of the life cycle of information systems by assessing the functional relationship between requirements and business objectives is described. An approach is proposed to establish the relationship between the degree of achievement of business objectives and the fulfillment of requirements for the projected information system. It describes solutions that allow one to formalize the process of forming functional and non-functional requirements with the help of the fuzzy logic apparatus. The form of the objective function is constructed on the basis of expert knowledge and refined via learning from a very small data set.

  3. A fuzzy logic approach to modeling a vehicle crash test

    NASA Astrophysics Data System (ADS)

    Pawlus, Witold; Karimi, Hamid Reza; Robbersmyr, Kjell G.

    2013-03-01

    This paper presents an application of the fuzzy approach to vehicle crash modeling. A typical vehicle-to-pole collision is described and the kinematics of a car involved in this type of crash event is thoroughly characterized. The basics of fuzzy set theory and modeling principles based on the fuzzy logic approach are presented. In particular, exceptional attention is paid to explaining the methodology for creating a fuzzy model of a vehicle collision. Furthermore, the simulation results are presented and compared to the original vehicle's kinematics. It is concluded which factors influence the accuracy of the fuzzy model's output and how they can be adjusted to improve the model's fidelity.

  4. Logic of discovery or psychology of invention?

    NASA Astrophysics Data System (ADS)

    Woodward, James F.

    1992-02-01

    It is noted that Popper separates the creation of concepts, conjectures, hypotheses and theories—the context of invention—from the testing thereof—the context of justification—arguing that only the latter is susceptible of rigorous logical analysis. Efforts on the part of others to shift or eradicate the demarcation established by this distinction are discussed and the relationship of these considerations to the claims of “strong artificial intelligence” is pointed out. It is argued that the mode of education of scientists, as well as reports of celebrated scientists, support Popper's judgement in this matter. An historical episode from Faraday's later career is used to illustrate the historiographical strength of Lakatos' “methodology of research programs.”

  5. Constraint Logic Programming approach to protein structure prediction.

    PubMed

    Dal Palù, Alessandro; Dovier, Agostino; Fogolari, Federico

    2004-11-30

    The protein structure prediction problem is one of the most challenging problems in the biological sciences. Many approaches have been proposed using database information and/or simplified protein models. The protein structure prediction problem can be cast in the form of an optimization problem. Notwithstanding its importance, the problem has very seldom been tackled by Constraint Logic Programming, a declarative programming paradigm suitable for solving combinatorial optimization problems. Constraint Logic Programming techniques have been applied to the protein structure prediction problem on the face-centered cubic lattice model. Molecular dynamics techniques, endowed with the notion of constraint, have also been exploited. Even using a very simplified model, Constraint Logic Programming on the face-centered cubic lattice model allowed us to obtain acceptable results for a few small proteins. As a test implementation, their (known) secondary structure and the presence of disulfide bridges are used as constraints. Simplified structures obtained in this way have been converted to all-atom models with plausible structure. Results have been compared with a similar approach using a well-established technique such as molecular dynamics. The results obtained on small proteins show that Constraint Logic Programming techniques can be employed for studying simplified protein models, which can be converted into realistic all-atom models. The advantage of Constraint Logic Programming over other, much more explored, methodologies resides in rapid software prototyping, in the easy encoding of heuristics, and in the ability to exploit all the advances made in this research area, e.g. in constraint propagation and its use for pruning the huge search space.

  6. Visual unit analysis: a descriptive approach to landscape assessment

    Treesearch

    R. J. Tetlow; S. R. J. Sheppard

    1979-01-01

    Analysis of the visible attributes of landscapes is an important component of the planning process. When landscapes are at regional scale, economical and effective methodologies are critical. The Visual Unit concept appears to offer a logical and useful framework for description and evaluation. The concept subdivides landscape into coherent, spatially-defined units....

  7. Maturity Curve of Systems Engineering

    DTIC Science & Technology

    2008-12-01

    …the collection and analysis of data (Hart, 1998). A qualitative approach to acquiring and managing the data was used for this analysis. A quantitative tool was used to examine and evaluate the data. The qualitative approach was intended to sort the acquired traits…

  8. The PLATO System: A Study in the Diffusion of an Innovation.

    ERIC Educational Resources Information Center

    Driscoll, Francis D.; Wolf, W. C., Jr.

    This study was designed to ascertain the relationships between the steps of a tool designed to link knowledge production and the needs of knowledge users (the Wolf-Welsh Linkage Methodology or WWLM) with milestones in the evolution of an innovative computer-assisted instructional system called PLATO (Programming Logic for Advanced Teaching…

  9. Fifteen-Year-Old Pupils' Variable Handling Performance in the Context of Scientific Investigations.

    ERIC Educational Resources Information Center

    Donnelly, J. F.

    1987-01-01

    Reports findings on variable-handling aspects of pupil performance in investigatory tasks, using data from the British Assessment of Performance Unit (APU) national survey program. Discusses the significance of these findings for assessment methodology and for understanding of 15-year-olds' approaches to the variable-based logic of investigation.…

  10. Chaim Perelman Re-examined: An Application to Classroom Methodology.

    ERIC Educational Resources Information Center

    Clines, Raymond H.

    Rhetoric handbooks and composition texts continue to teach that the main techniques of effective argumentation are based on logic--the use of evidence, deductive and inductive reasoning, definition of terms, and so on. Yet, as Chaim Perelman and L. Olbrechts-Tyteca argue in their book "The New Rhetoric," the most solid beliefs are those…

  11. Contextuality and Cultural Texts: A Case Study of Workplace Learning in Call Centres

    ERIC Educational Resources Information Center

    Crouch, Margaret

    2006-01-01

    Purpose: The paper seeks to show the contextualisation of call centres as a work-specific ethnographically and culturally based community, which, in turn, influences pedagogical practices through the encoding and decoding of cultural texts in relation to two logics: cost-efficiency and customer-orientation. Design/methodology/approach: The paper…

  12. A Practical Methodology for the Systematic Development of Multiple Choice Tests.

    ERIC Educational Resources Information Center

    Blumberg, Phyllis; Felner, Joel

    Using Guttman's facet design analysis, four parallel forms of a multiple-choice test were developed. A mapping sentence, logically representing the universe of content of a basic cardiology course, specified the facets of the course and the semantic structural units linking them. The facets were: cognitive processes, disease priority, specific…

  13. Modeling Spring Mass System with System Dynamics Approach in Middle School Education

    ERIC Educational Resources Information Center

    Nuhoglu, Hasret

    2008-01-01

    System Dynamics is a well-formulated methodology for analyzing the components of a system, including cause-effect relationships and their underlying mathematics and logic, time delays, and feedback loops. It began in the business and manufacturing world, but is now affecting education and many other disciplines. Having been inspired by successful policy…

  14. Modeling Spring Mass System with System Dynamics Approach in Middle School Education

    ERIC Educational Resources Information Center

    Nuhoglu, Hasret

    2008-01-01

    System Dynamics is a well-formulated methodology for analyzing the components of a system, including cause-effect relationships and their underlying mathematics and logic, time delays, and feedback loops. It began in the business and manufacturing world, but is now affecting education and many other disciplines. Having been inspired by successful policy…

  15. Comparing Effects of School Inspections in Sweden and Austria

    ERIC Educational Resources Information Center

    Kemethofer, David; Gustafsson, Jan-Eric; Altrichter, Herbert

    2017-01-01

    In recent years, school inspections have been newly introduced or adapted to the evidence-based governance logic in many European countries. So far, empirical research on the impact of school inspections has produced inconclusive results. Methodologically, it has mainly focussed on analysis of a national inspection model and used cross-sectional…

  16. Global University Rankings--Impacts and Unintended Side Effects

    ERIC Educational Resources Information Center

    Kehm, Barbara M.

    2014-01-01

    In this article, global and other university rankings are critically assessed with regard to their unintended side effects and their impacts on the European and national landscape of universities, as well as on individual institutions. An emphasis is put on the effects of ranking logics rather than on criticising their methodology. Nevertheless,…

  17. Boolean Classes and Qualitative Inquiry. WCER Working Paper No. 2006-3

    ERIC Educational Resources Information Center

    Nathan, Mitchell J.; Jackson, Kristi

    2006-01-01

    The prominent role of Boolean classes in qualitative data analysis software is viewed by some as an encroachment of logical positivism on qualitative research methodology. The authors articulate an embodiment perspective, in which Boolean classes are viewed as conceptual metaphors for apprehending and manipulating data, concepts, and categories in…

  18. Elaborer un exercice de grammaire (Working Out a Grammar Exercise)

    ERIC Educational Resources Information Center

    Principaud, Jeanne-Marie

    1977-01-01

    An elaboration of the official instruction on teaching French to native speakers in elementary school. The topics covered are: Methodological development of exercises; the linguistic ability and milieu of the students; operative criteria; and the question of a logical progression or spontaneous use of grammar exercises. (Text is in French.) (AMH)

  19. The Suffolk County Department of Social Services Performance Study. An Executive Summary.

    ERIC Educational Resources Information Center

    Spottheim, David; Wilson, George R.

    The logic and methodology applied in a management science approach to performance and staff utilization in the Client Benefits (CBA) and Community Service (CSA) divisions of the Suffolk County (New York) Department of Social Services (SCDSS) are described. Using a blend of classical organization theory and management science techniques, the CBA…

  20. Composite and Loose Concepts, Historical Analogies, and the Logic of Control in Comparative Historical Analysis

    ERIC Educational Resources Information Center

    Møller, Jørgen

    2016-01-01

    The use of controlled comparisons pervades comparative historical analysis. Heated debates have surrounded the methodological purchase of such comparisons. However, the quality and validity of the conceptual building blocks on which the comparisons are based have largely been ignored. This article discusses a particular problem pertaining to these…

  1. Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges.

    PubMed

    Chatterji, Madhabi

    2016-12-01

    This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on the intervention's effectiveness. A definition is provided of a CSP drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs, threatening assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically-supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking, systems-informed logic modeling; and use of extended term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question-framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Methodology for designing and implementing a class of service for the transmission of medical images over a common network

    NASA Astrophysics Data System (ADS)

    Dimond, David A.; Burgess, Robert; Barrios, Nolan; Johnson, Neil D.

    2000-05-01

    Traditionally, to guarantee the network performance of medical image data transmission, imaging traffic was isolated on a separate network. Organizations are depending on a new generation of multi-purpose networks to transport both normal information and image traffic as they expand access to images throughout the enterprise. These organizations want to leverage their existing infrastructure for imaging traffic, but are not willing to accept degradations in overall network performance. To guarantee 'on demand' network performance for image transmissions anywhere at any time, networks need to be designed with the ability to 'carve out' bandwidth for specific applications and to minimize the chances of network failures. This paper will present the methodology Cincinnati Children's Hospital Medical Center (CHMC) used to enhance the physical and logical network design of the existing hospital network to guarantee a class of service for imaging traffic. PACS network designs should utilize the existing enterprise local area network (LAN) infrastructure where appropriate. Logical separation or segmentation provides the application independence from other clinical and administrative applications as required, ensuring bandwidth and service availability.

  3. Engineered modular biomaterial logic gates for environmentally triggered therapeutic delivery

    NASA Astrophysics Data System (ADS)

    Badeau, Barry A.; Comerford, Michael P.; Arakawa, Christopher K.; Shadish, Jared A.; Deforest, Cole A.

    2018-03-01

    The successful transport of drug- and cell-based therapeutics to diseased sites represents a major barrier in the development of clinical therapies. Targeted delivery can be mediated through degradable biomaterial vehicles that utilize disease biomarkers to trigger payload release. Here, we report a modular chemical framework for imparting hydrogels with precise degradative responsiveness by using multiple environmental cues to trigger reactions that operate user-programmable Boolean logic. By specifying the molecular architecture and connectivity of orthogonal stimuli-labile moieties within material cross-linkers, we show selective control over gel dissolution and therapeutic delivery. To illustrate the versatility of this methodology, we synthesized 17 distinct stimuli-responsive materials that collectively yielded all possible YES/OR/AND logic outputs from input combinations involving enzyme, reductant and light. Using these hydrogels we demonstrate the first sequential and environmentally stimulated release of multiple cell lines in well-defined combinations from a material. We expect these platforms will find utility in several diverse fields including drug delivery, diagnostics and regenerative medicine.
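
    Reading the cross-linker architectures as Boolean circuits over the three reported inputs gives a simple way to enumerate the achievable release behaviors. The Python sketch below is an abstraction of that idea; the particular circuit combinations are invented, and the chemistry (series linkages cleaving on either cue, parallel linkages requiring both) is reduced to or/and.

        # Cross-linker connectivity as Boolean logic over environmental inputs:
        # gel dissolution (payload release) is the circuit's truth value.
        from itertools import product

        circuits = {
            "YES(enzyme)":           lambda e, r, l: e,
            "OR(enzyme, light)":     lambda e, r, l: e or l,
            "AND(reductant, light)": lambda e, r, l: r and l,
            "AND(enzyme, OR(reductant, light))": lambda e, r, l: e and (r or l),
        }
        for name, dissolves in circuits.items():
            table = [dissolves(*inputs) for inputs in product([False, True], repeat=3)]
            print(name, table)  # truth table over (enzyme, reductant, light)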

  4. Filling the Assurance Gap on Complex Electronics

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

    Many of the methods used to develop software bear a close resemblance to Complex Electronics (CE) development. CE are now programmed to perform tasks that were previously handled by software, such as communication protocols. For example, the James Webb Space Telescope will use Field Programmable Gate Arrays (FPGAs), which can have over a million logic gates, to send telemetry. System-on-chip (SoC) devices, another type of complex electronics, can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, mature software methodologies have been proposed, with slight modifications, to develop these devices. By using standardized S/W Engineering methods such as checklists, missing requirements and bugs can be detected earlier in the development cycle, thus creating a development process for CE that can be easily maintained and configurable based on the device used.

  5. Software Process Assurance for Complex Electronics

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

    Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that was more easily maintained, consistent and configurable based on the device used.

  6. Classification of reflected signals from cavitated tooth surfaces using an artificial intelligence technique incorporating a fiber optic displacement sensor

    NASA Astrophysics Data System (ADS)

    Rahman, Husna Abdul; Harun, Sulaiman Wadi; Arof, Hamzah; Irawati, Ninik; Musirin, Ismail; Ibrahim, Fatimah; Ahmad, Harith

    2014-05-01

    An enhanced dental cavity diameter measurement mechanism using an intensity-modulated fiber optic displacement sensor (FODS) scanning and imaging system, fuzzy logic, and a single-layer perceptron (SLP) neural network is presented. The SLP network was employed for the classification of the reflected signals, which were obtained from the surfaces of teeth samples and captured using the FODS. Two features were used for the classification of the reflected signals, one of them being the output of the fuzzy logic. The test results showed that the combined fuzzy logic and SLP network methodology achieved a 100% classification accuracy. This high classification accuracy demonstrates the suitability of the proposed features and of classification using SLP networks for classifying the reflected signals from teeth surfaces, enabling the sensor to accurately measure small tooth-cavity diameters of up to 0.6 mm. The method remains simple enough to allow its easy integration into existing dental restoration support systems.
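
    A single-layer perceptron of the kind used here takes only a few lines. The sketch below trains one on two features, with one feature standing in for the fuzzy-logic output; the toy data and learning rate are invented.

        # Minimal single-layer perceptron for two-class signal classification.
        import numpy as np

        X = np.array([[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]])  # toy features
        y = np.array([1, 1, 0, 0])  # 1 = cavitated surface, 0 = sound surface

        w, b, lr = np.zeros(2), 0.0, 0.1
        for _ in range(20):  # perceptron learning rule
            for xi, yi in zip(X, y):
                pred = int(w @ xi + b > 0)
                w += lr * (yi - pred) * xi
                b += lr * (yi - pred)

        print([int(w @ xi + b > 0) for xi in X])  # -> [1, 1, 0, 0]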

  7. Engineered modular biomaterial logic gates for environmentally triggered therapeutic delivery.

    PubMed

    Badeau, Barry A; Comerford, Michael P; Arakawa, Christopher K; Shadish, Jared A; DeForest, Cole A

    2018-03-01

    The successful transport of drug- and cell-based therapeutics to diseased sites represents a major barrier in the development of clinical therapies. Targeted delivery can be mediated through degradable biomaterial vehicles that utilize disease biomarkers to trigger payload release. Here, we report a modular chemical framework for imparting hydrogels with precise degradative responsiveness by using multiple environmental cues to trigger reactions that operate user-programmable Boolean logic. By specifying the molecular architecture and connectivity of orthogonal stimuli-labile moieties within material cross-linkers, we show selective control over gel dissolution and therapeutic delivery. To illustrate the versatility of this methodology, we synthesized 17 distinct stimuli-responsive materials that collectively yielded all possible YES/OR/AND logic outputs from input combinations involving enzyme, reductant and light. Using these hydrogels we demonstrate the first sequential and environmentally stimulated release of multiple cell lines in well-defined combinations from a material. We expect these platforms will find utility in several diverse fields including drug delivery, diagnostics and regenerative medicine.

  8. WARP: Weight Associative Rule Processor. A dedicated VLSI fuzzy logic megacell

    NASA Technical Reports Server (NTRS)

    Pagni, A.; Poluzzi, R.; Rizzotto, G. G.

    1992-01-01

    During the last five years, Fuzzy Logic has gained enormous popularity in the academic and industrial worlds. The success of this new methodology has led the microelectronics industry to create a new class of machines, called Fuzzy Machines, to overcome the limitations of traditional computing systems when utilized as Fuzzy Systems. This paper gives an overview of the methods by which Fuzzy Logic data structures are represented in these machines (each with its own advantages and inefficiencies). Next, the paper introduces WARP (Weight Associative Rule Processor), a dedicated VLSI megacell allowing the realization of a fuzzy controller suitable for a wide range of applications. WARP represents an innovative approach to VLSI Fuzzy controllers by utilizing different types of data structures for characterizing the membership functions during the various stages of the Fuzzy processing. WARP's dedicated architecture has been designed to achieve high performance by exploiting the computational advantages offered by the different data representations.

  9. A Sliding Mode Controller Using Nonlinear Sliding Surface Improved With Fuzzy Logic: Application to the Coupled Tanks System

    NASA Astrophysics Data System (ADS)

    Boubakir, A.; Boudjema, F.; Boubakir, C.

    2008-06-01

    This paper proposes a hybrid control approach based on the concept of combining fuzzy logic and the methodology of sliding mode control (SMC). In the present work, a first-order nonlinear sliding surface is presented, on which the developed control law is based. Mathematical proof of the stability and convergence of the system is presented. In order to reduce the chattering in sliding mode control, a fixed boundary layer around the switching surface is used. Within the boundary layer, the fuzzy logic control is applied, so the chattering phenomenon, which is inherent in sliding mode control, is avoided by smoothing the switching signal. Outside the boundary layer, sliding mode control is applied to drive the system states into the boundary layer. Experimental studies carried out on a coupled tanks system indicate that the proposed fuzzy sliding mode control (FSMC) is a good candidate for control applications.
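
    The switching structure can be sketched as follows, with a plain saturation function standing in for the fuzzy rule base inside the boundary layer; the gain and layer width are illustrative values.

        # Sliding mode control with a boundary layer: switch outside, smooth inside.
        import numpy as np

        def control(s, k=2.0, phi=0.1):
            """s: sliding variable; k: switching gain; phi: boundary-layer half-width."""
            if abs(s) > phi:
                return -k * np.sign(s)  # reaching phase: drive states into the layer
            return -k * (s / phi)       # inside the layer: smoothed, chatter-free signal

        for s in (-0.5, -0.05, 0.0, 0.05, 0.5):
            print(s, control(s))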

  10. Classification of reflected signals from cavitated tooth surfaces using an artificial intelligence technique incorporating a fiber optic displacement sensor.

    PubMed

    Rahman, Husna Abdul; Harun, Sulaiman Wadi; Arof, Hamzah; Irawati, Ninik; Musirin, Ismail; Ibrahim, Fatimah; Ahmad, Harith

    2014-05-01

    An enhanced dental cavity diameter measurement mechanism using an intensity-modulated fiber optic displacement sensor (FODS) scanning and imaging system, fuzzy logic, and a single-layer perceptron (SLP) neural network is presented. The SLP network was employed for the classification of the reflected signals, which were obtained from the surfaces of teeth samples and captured using the FODS. Two features were used for the classification of the reflected signals, one of them being the output of the fuzzy logic. The test results showed that the combined fuzzy logic and SLP network methodology achieved a 100% classification accuracy. This high classification accuracy demonstrates the suitability of the proposed features and of classification using SLP networks for classifying the reflected signals from teeth surfaces, enabling the sensor to accurately measure small tooth-cavity diameters of up to 0.6 mm. The method remains simple enough to allow its easy integration into existing dental restoration support systems.

  11. Fuzzy attitude control for a nanosatellite in leo orbit

    NASA Astrophysics Data System (ADS)

    Calvo, Daniel; Laverón-Simavilla, Ana; Lapuerta, Victoria; Aviles, Taisir

    Fuzzy logic controllers are flexible and simple, making them suitable for small satellites' Attitude Determination and Control Subsystems (ADCS). In this work, a tailored fuzzy controller is designed for a nanosatellite and is compared with a traditional Proportional Integral Derivative (PID) controller. Both control methodologies are compared within the same specific mission. The orbit height varies along the mission from injection at around 380 km down to a 200 km orbit, and the mission requires pointing accuracy over the whole time. Due to both the requirements imposed by such a low orbit and the limitations in the power available for attitude control, a robust and efficient ADCS is required. For these reasons a fuzzy logic controller is implemented as the brain of the ADCS, and its performance and efficiency are compared to a traditional PID. The fuzzy controller is designed as three separate controllers, each acting on one of the Euler angles of the satellite in an orbital frame. The fuzzy memberships are constructed taking into account the mission requirements, the physical properties of the satellite and the expected performances. Both methodologies, fuzzy and PID, are fine-tuned using an automated procedure to obtain maximum efficiency at fixed performance. Finally, both methods are tested in different environments to characterize them. The simulations show that the fuzzy controller is much more efficient (up to 65% less power required) in single maneuvers, achieving similar, or even better, precision than the PID. The accuracy and efficiency improvement of the fuzzy controller increases with orbit height because the environmental disturbances decrease, approaching the ideal scenario. A brief mission description is given, as well as the design process of both ADCS controllers. Finally, the validation process and the results obtained during the simulations are described. These results show that the fuzzy logic methodology is valid for small-satellite missions, which benefit from a well-developed artificial intelligence theory.

  12. Design, Specification, and Synthesis of Aircraft Electric Power Systems Control Logic

    NASA Astrophysics Data System (ADS)

    Xu, Huan

    Cyber-physical systems integrate computation, networking, and physical processes. Substantial research challenges exist in the design and verification of such large-scale, distributed sensing, actuation, and control systems. Rapidly improving technology and recent advances in control theory, networked systems, and computer science give us the opportunity to drastically improve our approach to the integrated flow of information and cooperative behavior. Current systems rely on text-based specifications and manual design. Using new technology advances, we can create easier, more efficient, and cheaper ways of developing these control systems. This thesis focuses on design considerations for system topologies, ways to formally and automatically specify requirements, and methods to synthesize reactive control protocols, all within the context of an aircraft electric power system as a representative application area. The thesis consists of three complementary parts: synthesis, specification, and design. The first part focuses on the synthesis of centralized and distributed reactive controllers for an aircraft electric power system. This approach incorporates methodologies from computer science and control. The resulting controllers are correct by construction with respect to system requirements, which are formulated in the specification language of linear temporal logic (LTL). The second part addresses how to formally specify requirements and introduces a domain-specific language for electric power systems. A software tool automatically converts high-level requirements into LTL and synthesizes a controller. The final part focuses on design space exploration. A design methodology is proposed that uses mixed-integer linear programming to obtain candidate topologies, which are then used to synthesize controllers. The discrete-time control logic is verified in real time by two methods: hardware and simulation. Finally, the problem of partial observability and dynamic state estimation is explored. Given a placement of sensors on an electric power system, measurements from these sensors can be used in conjunction with the control logic to infer the state of the system.
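
    The flavor of such requirements is easy to show. The sketch below renders two typical electric-power-system properties as LTL strings; the signal names (c_gen_L, fail_ac_ess, and so on) are hypothetical, and the encoding is a generic bounded-response pattern rather than the thesis's domain-specific language. G means "always" and X means "next".

      def never_paralleled(sources):
          # No two AC sources may ever feed the same bus: G !(c_i & c_j).
          pairs = [(a, b) for i, a in enumerate(sources) for b in sources[i+1:]]
          return " & ".join(f"G !({a} & {b})" for a, b in pairs)

      def eventually_powered(bus, k):
          # Bounded response: the bus is re-powered within k steps of a failure.
          within = " | ".join("X " * i + f"powered_{bus}" for i in range(k + 1))
          return f"G (fail_{bus} -> ({within}))"

      spec = never_paralleled(["c_gen_L", "c_gen_R", "c_apu"]) \
             + " & " + eventually_powered("ac_ess", 2)
      print(spec)
      # G !(c_gen_L & c_gen_R) & ... & G (fail_ac_ess -> (powered_ac_ess | ...))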

  13. Where Have All the Teachers Gone: A Case Study in Transitioning

    ERIC Educational Resources Information Center

    Potgieter, Amanda S.

    2016-01-01

    This paper reports the autobiographical narrative of Mr. L. as a case-in-point example of the thresholding moment and the process of transitioning into Academia. The role of the lecturer-mentor and the multi-logic space that facilitates the process are clarified. I use hermeneutic phenomenology and interpretivism as methodological tools. This ex…

  14. Freud, Plato and Irigaray: A Morpho-Logic of Teaching and Learning

    ERIC Educational Resources Information Center

    Peers, Chris

    2012-01-01

    This article discusses two well-known texts that respectively describe learning and teaching, drawn from the work of Freud and Plato. These texts are considered in psychoanalytic terms using a methodology drawn from the philosophy of Luce Irigaray. In particular the article addresses Irigaray's approach to the analysis of speech and utterance as a…

  15. The Epistemic of Aesthetic Knowledge and Knowing: Implications for Aesthetic Education Curricula and Rational Pedagogy in Nigerian Secondary Schools

    ERIC Educational Resources Information Center

    Aghaosa, Ike P.

    2015-01-01

    Using essentially philosophical and documentary methodologies of language and logical analysis, deduction, analogical inference, and historical inspection of documents, the paper examined the issues and arguments involved in Aesthetics as an epistemological concept. These were in terms of aesthetic knowledge, the faculty of knowing and…

  16. An Analysis of Confederate Subsistence Logistics

    DTIC Science & Technology

    1989-09-01

    [Table-of-contents excerpt; the full abstract was not extracted] Research questions; agricultural production; transportation; administration; justification for research; methodology (the logical argument); the impact of transportation policies; the impact of the Industrial Revolution on warfare; the desolation of northern Virginia; and conclusions on inadequate comprehensive planning and on Confederate agricultural, transportation, administrative, and national policies.

  17. Bring Your Own Device to Language Class--Applying Handheld Devices in Classroom Learning

    ERIC Educational Resources Information Center

    Talmo, Tord; Einum, Even; Støckert, Robin

    2014-01-01

    Language students often struggle to understand the logic in foreign language grammar, reducing their ability to reproduce and create texts on their own. There are several reasons for this; everything from the methodology to lack of motivation might influence the situation. Since the 1980s, Computer Assisted Language Learning (CALL) has become one…

  18. Remote Control Laboratory Using EJS Applets and TwinCAT Programmable Logic Controllers

    ERIC Educational Resources Information Center

    Besada-Portas, E.; Lopez-Orozco, J. A.; de la Torre, L.; de la Cruz, J. M.

    2013-01-01

    This paper presents a new methodology to develop remote laboratories for systems engineering and automation control courses, based on the combined use of TwinCAT, a laboratory Java server application, and Easy Java Simulations (EJS). The TwinCAT system is used to close the control loop for the selected plants by means of programmable logic…

  19. Constructivism and Reflectivism as the Logical Counterparts in TESOL: Learning Theory versus Teaching Methodology

    ERIC Educational Resources Information Center

    al Mahmud, Abdullah

    2013-01-01

    The gist of the entire constructivist learning theory is that learners are self-builders of their learning, which occurs through a mental process in a social context or communication setting; teachers, as facilitators, generate learning by creating the expected environment and/or utilizing the process. This article theoretically proves…

  20. Considerations for the Systematic Analysis and Use of Single-Case Research

    ERIC Educational Resources Information Center

    Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith

    2012-01-01

    Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…

  1. Implementing Motivational Features in Reactive Blended Learning: Application to an Introductory Control Engineering Course

    ERIC Educational Resources Information Center

    Mendez, J. A.; Gonzalez, E. J.

    2011-01-01

    This paper presents a significant advance in a reactive blended learning methodology applied to an introductory control engineering course. This proposal was based on the inclusion of a reactive element (a fuzzy-logic-based controller) designed to regulate the workload for each student according to his/her activity and performance. The…

  2. A Systematic Software, Firmware, and Hardware Codesign Methodology for Digital Signal Processing

    DTIC Science & Technology

    2014-03-01

    [Extraction fragment; the full abstract was not recovered] Software and hardware partitioning is a very difficult challenge in the field of embedded system design; the report maps possible optimal leaf-nodes to design patterns for embedded systems.

  3. Problem Behaviour at Early Age--Basis for Prediction of Asocial Behaviour

    ERIC Educational Resources Information Center

    Krneta, Dragoljub; Ševic, Aleksandra

    2015-01-01

    This paper analyzes the results of the study of prevalence of problem behaviour of students in primary and secondary schools. The starting point is that it is methodologically and logically justified to look for early forms of problem behaviour of students, because it is likely that adult convicted offenders at an early school age manifested forms…

  4. The Use of Metaphors as a Parametric Design Teaching Model: A Case Study

    ERIC Educational Resources Information Center

    Agirbas, Asli

    2018-01-01

    Teaching methodologies for parametric design are being researched all over the world, since there is a growing demand for computer programming logic and its fabrication process in architectural education. The computer programming courses in architectural education are usually done in a very short period of time, and so students have no chance to…

  5. Using logic model methods in systematic review synthesis: describing complex pathways in referral management interventions

    PubMed Central

    2014-01-01

    Background There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond those obtained via conventional review methods. Methods This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. Results The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short-term outcomes, moderating and mediating factors, and long-term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes that may result from these interventions included changed physician or patient knowledge, beliefs, or attitudes, and changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long-term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. Conclusions The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. Trial registration: PROSPERO CRD42013004037. PMID:24885751

  6. Using logic model methods in systematic review synthesis: describing complex pathways in referral management interventions.

    PubMed

    Baxter, Susan K; Blank, Lindsay; Woods, Helen Buckley; Payne, Nick; Rimmer, Melanie; Goyder, Elizabeth

    2014-05-10

    There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond those obtained via conventional review methods. This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short-term outcomes, moderating and mediating factors, and long-term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes that may result from these interventions included changed physician or patient knowledge, beliefs, or attitudes, and changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long-term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. PROSPERO registration number: CRD42013004037.

  7. [Methodologic developmental principles of standardized surveys within the scope of social gerontologic studies].

    PubMed

    Bansemir, G

    1987-01-01

    The conception and evaluation of standardized oral or written questioning as quantifying research instruments are oriented by the basic premises of the Marxist-Leninist theory of cognition and general scientific logic. In the present contribution the socio-gerontological research process is outlined in extracts. By referring to the intrinsic connection between some of its essential components--problem, formation of hypotheses, derivation of indicators/measurement, preliminary examination, evaluation--as well as to typical errors and (fictitious) examples from research practice, this contribution contrasts the seemingly natural, uncomplicated course of structured questioning with its qualitative methodological foundations and demands.

  8. Syntax, concepts, and logic in the temporal dynamics of language comprehension: evidence from event-related potentials.

    PubMed

    Steinhauer, Karsten; Drury, John E; Portner, Paul; Walenski, Matthew; Ullman, Michael T

    2010-05-01

    Logic has been intertwined with the study of language and meaning since antiquity, and such connections persist in present day research in linguistic theory (formal semantics) and cognitive psychology (e.g., studies of human reasoning). However, few studies in cognitive neuroscience have addressed logical dimensions of sentence-level language processing, and none have directly compared these aspects of processing with syntax and lexical/conceptual-semantics. We used ERPs to examine a violation paradigm involving "Negative Polarity Items" or NPIs (e.g., ever/any), which are sensitive to logical/truth-conditional properties of the environments in which they occur (e.g., presence/absence of negation in: John hasn't ever been to Paris, versus: John has *ever been to Paris). Previous studies examining similar types of contrasts found a mix of effects on familiar ERP components (e.g., LAN, N400, P600). We argue that their experimental designs and/or analyses were incapable of separating which effects are connected to NPI-licensing violations proper. Our design enabled statistical analyses teasing apart genuine violation effects from independent effects tied solely to lexical/contextual factors. Here unlicensed NPIs elicited a late P600 followed in onset by a late left anterior negativity (or "L-LAN"), an ERP profile which has also appeared elsewhere in studies targeting logical semantics. Crucially, qualitatively distinct ERP-profiles emerged for syntactic and conceptual semantic violations which we also tested here. We discuss how these findings may be linked to previous findings in the ERP literature. Apart from methodological recommendations, we suggest that the study of logical semantics may aid advancing our understanding of the underlying neurocognitive etiology of ERP components. 2010 Elsevier Ltd. All rights reserved.

  9. SYNTAX, CONCEPTS, AND LOGIC IN THE TEMPORAL DYNAMICS OF LANGUAGE COMPREHENSION: EVIDENCE FROM EVENT RELATED POTENTIALS

    PubMed Central

    Steinhauer, Karsten; Drury, John E.; Portner, Paul; Walenski, Matthew; Ullman, Michael T.

    2010-01-01

    Logic has been intertwined with the study of language and meaning since antiquity, and such connections persist in present day research in linguistic theory (formal semantics) and cognitive psychology (e.g., studies of human reasoning). However, few studies in cognitive neuroscience have addressed logical dimensions of sentence-level language processing, and none have directly compared these aspects of processing with syntax and lexical/conceptual-semantics. We used ERPs to examine a violation paradigm involving “Negative Polarity Items” or NPIs (e.g., ever/any), which are sensitive to logical/truth-conditional properties of the environments in which they occur (e.g., presence/absence of negation in: John hasn’t ever been to Paris, versus: John has *ever been to Paris). Previous studies examining similar types of contrasts found a mix of effects on familiar ERP components (e.g., LAN, N400, P600). We argue that their experimental designs and/or analyses were incapable of separating which effects are connected to NPI-licensing violations proper. Our design enabled statistical analyses teasing apart genuine violation effects from independent effects tied solely to lexical/contextual factors. Here unlicensed NPIs elicited a late P600 followed in onset by a late left anterior negativity (or “L-LAN”), an ERP profile which has also appeared elsewhere in studies targeting logical semantics. Crucially, qualitatively distinct ERP-profiles emerged for syntactic and conceptual semantic violations which we also tested here. We discuss how these findings may be linked to previous findings in the ERP literature. Apart from methodological recommendations, we suggest that the study of logical semantics may aid advancing our understanding of the underlying neurocognitive etiology of ERP components. PMID:20138065

  10. Fuzzy logic and neural networks in artificial intelligence and pattern recognition

    NASA Astrophysics Data System (ADS)

    Sanchez, Elie

    1991-10-01

    With the use of fuzzy logic techniques, neural computing can be integrated into symbolic reasoning to solve complex real-world problems. In fact, artificial neural networks, expert systems, and fuzzy logic systems, in the context of approximate reasoning, share common features and techniques. A model of Fuzzy Connectionist Expert System is introduced, in which an artificial neural network is designed to construct the knowledge base of an expert system from training examples (this model can also be used for the specification of rules in fuzzy logic control). Two types of weights are associated with the synaptic connections in an AND-OR structure: primary linguistic weights, interpreted as labels of fuzzy sets, and secondary numerical weights. Cell activation is computed through min-max fuzzy equations of the weights. Learning consists in finding the (numerical) weights and the network topology. This feedforward network is described and first illustrated in a biomedical application (medical diagnosis assistance from inflammatory-syndromes/protein profiles). Then, it is shown how this methodology can be utilized for handwritten pattern recognition (characters play the role of diagnoses): in a fuzzy neuron describing a number, for example, the linguistic weights represent fuzzy sets on cross-detecting lines and the numerical weights reflect the importance (or weakness) of connections between cross-detecting lines and characters.
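
    A sketch of one common reading of this min-max activation, with invented inputs and weights (Sanchez's exact equations may differ in detail): each AND node takes the minimum over its weighted inputs, and the OR output node takes the maximum over the AND nodes.

      import numpy as np

      def min_max_activation(x, W):
          # x: fuzzy input degrees in [0,1]; W: one row of weights per AND node.
          and_nodes = np.minimum(x, W).min(axis=1)  # min over each rule's inputs
          return and_nodes.max()                    # max over rules (OR layer)

      x = np.array([0.8, 0.3, 0.9])                 # e.g. fuzzified protein levels
      W = np.array([[1.0, 0.2, 0.7],                # rule 1 weights
                    [0.4, 1.0, 1.0]])               # rule 2 weights
      print(min_max_activation(x, W))               # 0.3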

  11. A fuzzy Petri-net-based mode identification algorithm for fault diagnosis of complex systems

    NASA Astrophysics Data System (ADS)

    Propes, Nicholas C.; Vachtsevanos, George

    2003-08-01

    Complex dynamical systems such as aircraft, manufacturing systems, chillers, motor vehicles, submarines, etc. exhibit continuous and event-driven dynamics. These systems undergo several discrete operating modes from startup to shutdown. For example, a certain shipboard system may be operating at half load or full load or may be at start-up or shutdown. Of particular interest are extreme or "shock" operating conditions, which tend to severely impact fault diagnosis or the progression of a fault leading to a failure. Fault conditions are strongly dependent on the operating mode. Therefore, it is essential that in any diagnostic/prognostic architecture, the operating mode be identified as accurately as possible so that such functions as feature extraction, diagnostics, prognostics, etc. can be correlated with the predominant operating conditions. This paper introduces a mode identification methodology that incorporates both time- and event-driven information about the process. A fuzzy Petri net is used to represent the possible successive mode transitions and to detect events from processed sensor signals signifying a mode change. The operating mode is initialized and verified by analysis of the time-driven dynamics through a fuzzy logic classifier. An evidence combiner module is used to combine the results from both the fuzzy Petri net and the fuzzy logic classifier to determine the mode. Unlike most event-driven mode identifiers, this architecture will provide automatic mode initialization through the fuzzy logic classifier and robustness through the combining of evidence of the two algorithms. The mode identification methodology is applied to an AC Plant typically found as a component of a shipboard system.
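
    The abstract does not spell out the evidence combiner, so the sketch below simply averages the mode memberships produced by the two algorithms and picks the strongest mode; the modes and numbers are invented for illustration.

      modes = ["startup", "half_load", "full_load", "shutdown"]
      # Memberships from the event-driven fuzzy Petri net and the
      # time-driven fuzzy classifier (hypothetical values).
      petri_net  = {"startup": 0.1, "half_load": 0.7, "full_load": 0.3, "shutdown": 0.0}
      classifier = {"startup": 0.2, "half_load": 0.8, "full_load": 0.1, "shutdown": 0.0}

      combined = {m: 0.5 * (petri_net[m] + classifier[m]) for m in modes}
      mode = max(combined, key=combined.get)
      print(mode, combined[mode])  # half_load 0.75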

  12. Using soft systems methodology to develop a simulation of out-patient services.

    PubMed

    Lehaney, B; Paul, R J

    1994-10-01

    Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems, which are difficult to address by using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the outpatients' department at a local hospital. The long term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.
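
    Once the system boundaries and activities have been agreed, the simulation itself can start small. Below is a minimal discrete-event sketch of a single-clinician out-patient queue with invented arrival and consultation rates; it is a generic illustration, not the hospital model from the study.

      import heapq, random

      random.seed(1)
      ARRIVAL_MEAN, SERVICE_MEAN, HORIZON = 12.0, 10.0, 8 * 60  # minutes

      # Events are (time, kind) pairs kept on a priority queue.
      events = [(random.expovariate(1 / ARRIVAL_MEAN), "arrival")]
      queue, busy_until, waits = [], 0.0, []

      while events:
          t, kind = heapq.heappop(events)
          if t > HORIZON:
              break
          if kind == "arrival":
              queue.append(t)  # patient joins the queue, next arrival scheduled
              heapq.heappush(events, (t + random.expovariate(1 / ARRIVAL_MEAN), "arrival"))
          # Start the next consultation whenever the clinician is free.
          if queue and busy_until <= t:
              arrived = queue.pop(0)
              waits.append(t - arrived)
              busy_until = t + random.expovariate(1 / SERVICE_MEAN)
              heapq.heappush(events, (busy_until, "service_end"))

      print(f"patients seen: {len(waits)}, mean wait: {sum(waits)/len(waits):.1f} min")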

  13. The work of Galileo and conformation of the experiment in physics

    NASA Astrophysics Data System (ADS)

    Alvarez, J. L.; Posadas, Y.

    2003-02-01

    It is very common to find comments and references to Galileo's work suggesting that he based his affirmations on logical thought rather than on observations. In this paper we present an analysis of some experiments that he carried out, which remained unknown in the XVI and XVII centuries; in them we find a clear description of the methodology that Galileo followed in order to reach the results presented in his formal work, particularly in the Discorsi. In contrast with Aristotelian philosophy, in these manuscripts Galileo adopts a methodology that made great contributions to the modern conformation of the experimental method, thereby founding a methodology for the study of motion. We use this analysis as an example of the difficulties present in the conformation of modern experimentation, and we point out the necessity of stressing the importance of scientific methodology in the teaching of physics.

  14. Can a Classroom Be a Family? Race, Space, and the Labour of Care in Urban Teaching

    ERIC Educational Resources Information Center

    Gallagher, Kathleen

    2016-01-01

    This article reports on findings from a case study of an eighth-grade teacher in an innercity school in downtown Toronto, Canada. It investigates the teacher's pedagogical use of the metaphor of "family," using interview data to underscore the effects produced by such an operating logic in a classroom. Methodologically, the article puts…

  15. Game-Based Learning: Increasing the Logical-Mathematical, Naturalistic, and Linguistic Learning Levels of Primary School Students

    ERIC Educational Resources Information Center

    del Moral Pérez, M. Esther; Duque, Alba P. Guzmán; García, L. Carlota Fernández

    2018-01-01

    Game-based learning is an innovative methodology that takes advantage of the educational potential offered by videogames in general and serious games in particular to boost training processes, thus making it easier for users to achieve motivated learning. The present paper focuses on the description of the Game to Learn Project, which has as its…

  16. Qualitative Research and Educational Leadership: Essential Dynamics to Consider When Designing and Conducting Studies

    ERIC Educational Resources Information Center

    Brooks, Jeffrey S.; Normore, Anthony H.

    2015-01-01

    Purpose: The purpose of this paper is to highlight issues related to the appropriate design and conduct of qualitative studies in educational leadership. Design/Methodology/Approach: The paper is a conceptual/logical argument that centers around the notion that while scholars in the field have at times paid attention to such dynamics, it is important…

  17. Methodological and Epistemological Issues on Linear Regression Applied to Psychometric Variables in Problem Solving: Rethinking Variance

    ERIC Educational Resources Information Center

    Stamovlasis, Dimitrios

    2010-01-01

    The aim of the present paper is two-fold. First, it attempts to support previous findings on the role of some psychometric variables, such as, M-capacity, the degree of field dependence-independence, logical thinking and the mobility-fixity dimension, on students' achievement in chemistry problem solving. Second, the paper aims to raise some…

  18. Multivariable Techniques for High-Speed Research Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Newman, Brett A.

    1999-01-01

    This report describes the activities and findings conducted under contract with NASA Langley Research Center. The subject matter is the investigation of suitable multivariable flight control design methodologies and solutions for large, flexible high-speed vehicles. Specifically, the methodologies address the inner control loops used for stabilization and augmentation of a highly coupled airframe system, possibly involving rigid-body motion, structural vibrations, unsteady aerodynamics, and actuator dynamics. Design and analysis techniques considered in this body of work are both conventional and contemporary, and the vehicle of interest is the High-Speed Civil Transport (HSCT). Major findings include: (1) control architectures based on an aft tail only are not well suited for highly flexible, high-speed vehicles; (2) the theoretical underpinnings of the Wykes structural mode control logic rest on several assumptions concerning vehicle dynamic characteristics, and if these are not satisfied, the control logic can break down, leading to mode destabilization; (3) two-loop control architectures that utilize small forward vanes with the aft tail provide highly attractive and feasible solutions to the longitudinal-axis control challenges; and (4) closed-loop simulation sizing analyses indicate the baseline vane model utilized in this report is most likely oversized for normal loading conditions.

  19. Software Process Assurance for Complex Electronics (SPACE)

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

    Complex Electronics (CE) are now programmed to perform tasks that were previously handled in software, such as communication protocols. Many of the methods used to develop software bear a close resemblance to CE development. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic errors, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized, with slight modifications, in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that looks at using standardized software assurance/engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques can be used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that is more easily maintained, consistent, and configurable based on the device used.

  20. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A service workflow is an aggregation of distributed services that fulfills specific functionalities. With ever-increasing numbers of available services, methodologies for selecting services against given requirements have become main research subjects in multiple disciplines. A few researchers have contributed to formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec is demonstrated by examples addressing soundness, completeness, and consistency.

  1. Representation of molecular structure using quantum topology with inductive logic programming in structure-activity relationships.

    PubMed

    Buttingsrud, Bård; Ryeng, Einar; King, Ross D; Alsberg, Bjørn K

    2006-06-01

    The requirement of aligning each individual molecule in a data set severely limits the type of molecules which can be analysed with traditional structure-activity relationship (SAR) methods. A method which solves this problem by using relations between objects is inductive logic programming (ILP). Another advantage of this methodology is its ability to include background knowledge as first-order logic. However, previous molecular ILP representations have not been effective in describing the electronic structure of molecules. We present a more unified and comprehensive representation based on Richard Bader's quantum topological atoms in molecules (AIM) theory, where critical points in the electron density are connected through a network. AIM theory provides a wealth of chemical information about individual atoms and their bond connections, enabling a more flexible and chemically relevant representation. To obtain even more relevant rules with higher coverage, we apply manual postprocessing and interpretation of ILP rules. We have tested the usefulness of the new representation in SAR modelling on classifying compounds of low/high mutagenicity and on a set of factor Xa inhibitors of high and low affinity.

  2. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  3. Gene Function Hypotheses for the Campylobacter jejuni Glycome Generated by a Logic-Based Approach

    PubMed Central

    Sternberg, Michael J.E.; Tamaddoni-Nezhad, Alireza; Lesk, Victor I.; Kay, Emily; Hitchen, Paul G.; Cootes, Adrian; van Alphen, Lieke B.; Lamoureux, Marc P.; Jarrell, Harold C.; Rawlings, Christopher J.; Soo, Evelyn C.; Szymanski, Christine M.; Dell, Anne; Wren, Brendan W.; Muggleton, Stephen H.

    2013-01-01

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning—the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. PMID:23103756

  4. Gene function hypotheses for the Campylobacter jejuni glycome generated by a logic-based approach.

    PubMed

    Sternberg, Michael J E; Tamaddoni-Nezhad, Alireza; Lesk, Victor I; Kay, Emily; Hitchen, Paul G; Cootes, Adrian; van Alphen, Lieke B; Lamoureux, Marc P; Jarrell, Harold C; Rawlings, Christopher J; Soo, Evelyn C; Szymanski, Christine M; Dell, Anne; Wren, Brendan W; Muggleton, Stephen H

    2013-01-09

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning-the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. A Rule-Based System Implementing a Method for Translating FOL Formulas into NL Sentences

    NASA Astrophysics Data System (ADS)

    Mpagouli, Aikaterini; Hatzilygeroudis, Ioannis

    In this paper, we mainly present the implementation of a system that translates first-order logic (FOL) formulas into natural language (NL) sentences. The motivation comes from an intelligent tutoring system teaching logic as a knowledge representation language, where the translator is used as a means of feedback to the student users. FOL to NL conversion is achieved by using a rule-based approach, where we exploit the pattern matching capabilities of rules. So, the system consists of rule-based modules corresponding to the phases of our translation methodology. Facts are used in a lexicon providing lexical and grammatical information that helps in producing the NL sentences. The whole system is implemented in Jess, a Java-implemented rule-based programming tool. Experimental results confirm the success of our choices.
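
    A toy version of the rule-based core, with two invented rewrite rules (the actual system is a multi-phase Jess rule base backed by a full lexicon):

      import re

      # Each rule pairs a FOL pattern with an English template.
      RULES = [
          (r"forall x \((\w+)\(x\) -> (\w+)\(x\)\)", r"Every \1 is \2."),
          (r"exists x \((\w+)\(x\) & (\w+)\(x\)\)",  r"Some \1 is \2."),
      ]

      def fol_to_nl(formula):
          for pattern, template in RULES:
              m = re.fullmatch(pattern, formula)
              if m:
                  return m.expand(template)
          return "No rule matches."

      print(fol_to_nl("forall x (man(x) -> mortal(x))"))   # Every man is mortal.
      print(fol_to_nl("exists x (dog(x) & friendly(x))"))  # Some dog is friendly.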

  6. Multidimensional Simulation Applied to Water Resources Management

    NASA Astrophysics Data System (ADS)

    Camara, A. S.; Ferreira, F. C.; Loucks, D. P.; Seixas, M. J.

    1990-09-01

    A framework for an integrated decision aiding simulation (IDEAS) methodology using numerical, linguistic, and pictorial entities and operations is introduced. IDEAS relies upon traditional numerical formulations, logical rules to handle linguistic entities with linguistic values, and a set of pictorial operations. Pictorial entities are defined by their shape, size, color, and position. Pictorial operators include reproduction (copy of a pictorial entity), mutation (expansion, rotation, translation, change in color), fertile encounters (intersection, reunion), and sterile encounters (absorption). Interaction between numerical, linguistic, and pictorial entities is handled through logical rules or a simplified vector calculus operation. This approach is shown to be applicable to various environmental and water resources management analyses using a model to assess the impacts of an oil spill. Future developments, including IDEAS implementation on parallel processing machines, are also discussed.

  7. Improved FTA methodology and application to subsea pipeline reliability design.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.

  8. Improved FTA Methodology and Application to Subsea Pipeline Reliability Design

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681
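
    For orientation, a tree of this kind can be evaluated bottom-up once gate types and basic-event probabilities are fixed. The gate structure and the numbers below are invented, and independence of the basic events is assumed.

      from math import prod

      def p_top(node, basic):
          # OR gates use 1 - prod(1 - p); AND gates use prod(p).
          if isinstance(node, str):  # basic event leaf
              return basic[node]
          gate, children = node
          probs = [p_top(c, basic) for c in children]
          return prod(probs) if gate == "AND" else 1 - prod(1 - p for p in probs)

      # Top event = corrosive environment AND (coating defect OR loss of
      # cathodic protection) -- a hypothetical pipeline example.
      tree = ("AND", ["corrosive_env",
                      ("OR", ["coating_defect", "cp_loss"])])
      basic = {"corrosive_env": 0.30, "coating_defect": 0.05, "cp_loss": 0.02}
      print(f"P(top) = {p_top(tree, basic):.4f}")  # 0.30 * (1 - 0.95 * 0.98)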

  9. Results of the Software Process Improvement Efforts of the Early Adopters in NAVAIR 4.0

    DTIC Science & Technology

    2007-12-01

    AIRSpeed utilizes a structured problem-solving methodology called DMAIC (Define, Measure, Analyze, Improve, Control), which is widely used in business. DMAIC leads project teams through the logical steps from problem definition to problem resolution; each phase has a specific set of activities aimed at reducing costs and improving productivity and customer satisfaction.

  10. Goldratt's Thinking Process Applied to the Problems Associated with Trained Employee Retention in a Highly Competitive Labor Market

    ERIC Educational Resources Information Center

    Taylor, Lloyd J., III; Poyner, Ilene

    2008-01-01

    Purpose: This study aims to investigate the problem of trained employee retention in a highly competitive labor market for a manufacturing facility in the oilfields of West Texas. Design/methodology/approach: This article examines how one manufacturing facility should be able to retain its trained employees by using the logic of Eliyahu M.…

  11. Toward methodological emancipation in applied health research.

    PubMed

    Thorne, Sally

    2011-04-01

    In this article, I trace the historical groundings of what have become methodological conventions in the use of qualitative approaches to answer questions arising from the applied health disciplines and advocate an alternative logic more strategically grounded in the epistemological orientations of the professional health disciplines. I argue for an increasing emphasis on the modification of conventional qualitative approaches to the particular knowledge demands of the applied practice domain, challenging the merits of what may have become unwarranted attachment to theorizing. Reorienting our methodological toolkits toward the questions arising within an evidence-dominated policy agenda, I encourage my applied health disciplinary colleagues to make themselves useful to that larger project by illuminating that which quantitative research renders invisible, problematizing the assumptions on which it generates conclusions, and filling in the gaps in knowledge needed to make decisions on behalf of people and populations.

  12. Using fuzzy logic analysis for siting decisions of infiltration trenches for highway runoff control.

    PubMed

    Ki, Seo Jin; Ray, Chittaranjan

    2014-09-15

    Determining optimal locations for best management practices (BMPs), including their field considerations and limitations, plays an important role in effective stormwater management. However, these issues have often been overlooked in modeling studies that focus on downstream water quality benefits. This study illustrates a methodology for locating infiltration trenches at suitable locations using spatial overlay analyses, which combine multiple layers addressing different aspects of field application into a composite map. Using seven thematic layers for each analysis, fuzzy logic was employed to develop a site suitability map for infiltration trenches, whereas the DRASTIC method was used to produce a groundwater vulnerability map of the island of Oahu, Hawaii, USA. In addition, the analytic hierarchy process (AHP), one of the most popular overlay analyses, was used for comparison with fuzzy logic. The results showed that the AHP and fuzzy logic methods produced significantly different index maps in terms of best locations and suitability scores. Specifically, the AHP method indicated a maximum level of site suitability due to its inherent approach of aggregating all input layers in a linear equation. The most eligible areas for locating infiltration trenches were determined from the superposition of the site suitability and groundwater vulnerability maps using the fuzzy AND operator. The resulting map successfully balanced the qualification criteria of a low risk of groundwater contamination and the best BMP site selection. The results of the sensitivity analysis showed that the suitability scores were strongly affected by the algorithms embedded in fuzzy logic; therefore, caution is recommended in their use in overlay analysis. Accordingly, this study demonstrates that fuzzy logic analysis can not only be used alongside other overlay approaches to improve spatial decision quality, but can also be combined with general water quality models for initial and refined searches for the best BMP locations at the sub-basin level. Copyright © 2014. Published by Elsevier B.V.
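
    The contrast between the two overlays is easy to reproduce. In the sketch below, three invented 2x2 suitability layers scaled to [0,1] are combined first with the fuzzy AND operator (per-cell minimum) and then with an AHP-style weighted linear sum; the layer names and weights are hypothetical.

      import numpy as np

      slope     = np.array([[0.9, 0.2], [0.8, 0.6]])
      soil_perm = np.array([[0.7, 0.9], [0.3, 0.8]])
      depth_gw  = np.array([[0.8, 0.6], [0.9, 0.1]])
      layers = np.stack([slope, soil_perm, depth_gw])

      fuzzy_and = layers.min(axis=0)                   # pessimistic: worst score per cell
      ahp_weights = np.array([0.5, 0.3, 0.2])          # hypothetical priorities
      ahp = np.tensordot(ahp_weights, layers, axes=1)  # compensatory weighted sum

      print("fuzzy AND:\n", fuzzy_and)
      print("AHP:\n", ahp)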

  13. Students who developed logical reasoning skills reported improved confidence in drug dose calculation: Feedback from remedial maths classes.

    PubMed

    Shelton, Chris

    2016-06-01

    The safe administration of drugs is a focus of attention in healthcare. It is regarded as acceptable that a formula card or mnemonic can be used to find the correct dose and fill a prescription, even though this removes any requirement for performing the underlying computation. Feedback and discussion in class reveal that confidence in arithmetic skills can be low even when students are able to pass the end-of-semester drug calculation exam. The aim was to see whether confidence in the understanding and performance of arithmetic for drug calculations can be increased by emphasising students' innate powers of logical reasoning after reflection. Remedial classes, offered to students who had declared a dislike of or lack of confidence in arithmetic, were developed from student feedback adopting a stepwise logical-reasoning methodology. Students who gave up two hours of their free learning time were observed to engage seriously with the learning methods, focussing on the innate ability to perform the logical reasoning necessary for drug calculation problems. Working in small groups allowed some discussion of the route to the answer, and this was followed by class discussion and reflection. The results were recorded as weekly self-assessment scores for confidence in calculation. A self-selecting group who successfully completed the end-of-semester drug calculation exam reported low to moderate confidence in arithmetic. After four weeks of focussing on logical skills, a significant increase in self-belief was measured, and it continued to rise in students who remained in the classes. Many students hold a negative belief regarding their own mathematical abilities. This restricts the learning of arithmetic skills, making alternative routes using mnemonics and memorised steps attractive. Practising stepwise logical reasoning skills, consolidated by personal reflection, has been effective in developing students' confidence in and awareness of their innate powers of deduction, supporting an increase in competence in drug administration. Copyright © 2016 Elsevier Ltd. All rights reserved.
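
    The computation the classes build toward is the standard dose-volume relation, volume to draw up = (desired dose / stock strength) x stock volume. A worked example with illustrative numbers:

      def dose_volume(desired_mg, stock_mg, stock_ml):
          # Volume to draw up so the patient receives desired_mg.
          return desired_mg / stock_mg * stock_ml

      # Reasoned stepwise: 150 mg is needed, the ampoule holds 250 mg in 5 mL,
      # so each mL carries 50 mg, and 150 mg therefore needs 3 mL.
      print(dose_volume(desired_mg=150, stock_mg=250, stock_ml=5))  # 3.0 mL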

  14. Shall We Climb on the Shoulders of the Giants to Extend the Reality Horizon of Physics?

    NASA Astrophysics Data System (ADS)

    Roychoudhuri, Chandrasekhar

    2007-12-01

    After a very successful flurry of activities over a few decades to maximize the benefits of the formalism of Quantum Mechanics in connecting the micro and macro universe, the applied physics community has successfully engineered sustained technological innovations for human social advancement. However, a significant segment of the theoretical physics community has put its endeavors essentially into inventing realities that are esthetically pleasing to our human logics (epistemology), rather than staying focused on discovering the actual physical realities in nature driven by cosmic logics (ontology). The purpose of this paper is to attempt to formulate a Reality Epistemology that can leverage our enormous successes in science to re-focus our attention on discovering nature's realities by understanding the physical processes behind all natural interactions, which collectively drive cosmic evolution forward. We underscore this deviation from seeking reality to justify the key premise of the paper. We can "see" (measure) the universe only through the "eyes" of various sensors (detectors). None of these sensors is completely known to us as yet. All sensors also have inherently limited capabilities to respond to input signals, and limited capabilities to "report" only a part of all that they experience. We are thus forced to develop our mathematical theories by mixing our human logics with incomplete information; hence they are all provisional and incomplete, since they predict only the correctly measured but limited reports of the detectors. Thus, we should be careful not to jump to the conclusion that we have captured all the necessary cosmic logics behind the interactions involved. We dissect the measurement process in a generic way, along with well-defined steps for applying Reality Epistemology, which will jointly allow us to develop a scientific methodology of iteratively refining our "successful" human logics so that they can evolve towards our goal of capturing the cosmic logics. The core content of this paper was first presented at the 2007 QTRF-4 conference at Vaxjo University [1].

  15. Selection Shapes Transcriptional Logic and Regulatory Specialization in Genetic Networks

    PubMed Central

    Fogelmark, Karl; Peterson, Carsten; Troein, Carl

    2016-01-01

    Background Living organisms need to regulate their gene expression in response to environmental signals and internal cues. This is a computational task where genes act as logic gates that connect to form transcriptional networks, which are shaped at all scales by evolution. Large-scale mutations such as gene duplications and deletions add and remove network components, whereas smaller mutations alter the connections between them. Selection determines what mutations are accepted, but its importance for shaping the resulting networks has been debated. Methodology To investigate the effects of selection in the shaping of transcriptional networks, we derive transcriptional logic from a combinatorially powerful yet tractable model of the binding between DNA and transcription factors. By evolving the resulting networks based on their ability to function as either a simple decision system or a circadian clock, we obtain information on the regulation and logic rules encoded in functional transcriptional networks. Comparisons are made between networks evolved for different functions, as well as with structurally equivalent but non-functional (neutrally evolved) networks, and predictions are validated against the transcriptional network of E. coli. Principal Findings We find that the logic rules governing gene expression depend on the function performed by the network. Unlike the decision systems, the circadian clocks show strong cooperative binding and negative regulation, which achieves tight temporal control of gene expression. Furthermore, we find that transcription factors act preferentially as either activators or repressors, both when binding multiple sites for a single target gene and globally in the transcriptional networks. This separation into positive and negative regulators requires gene duplications, which highlights the interplay between mutation and selection in shaping the transcriptional networks. PMID:26927540

  16. Introducing a methodology for estimating duration of surgery in health services research.

    PubMed

    Redelmeier, Donald A; Thiruchelvam, Deva; Daneman, Nick

    2008-09-01

    The duration of surgery is an indicator of the quality, risks, and efficiency of surgical procedures. We introduce a new methodology for assessing the duration of surgery based on anesthesiology billing records, along with a review of its fundamental logic and limitations. The validity of the methodology was assessed through a population-based cohort of patients (n=480,986) undergoing elective operations in 246 Ontario hospitals with 1,084 anesthesiologists between April 1, 1992 and March 31, 2002 (10 years). The weaknesses of the methodology relate to missing data, self-serving exaggerations by providers, imprecision from clinical diversity, upper limits due to accounting regulations, fluctuations from updates over the years, national differences in reimbursement schedules, and the general failings of claims-based analyses. The strengths of the methodology are in providing data that match clinical experience, correspond to chart review, are consistent over time, can detect differences where differences would be anticipated, and might have implications for examining patient outcomes after long surgical times. We suggest that an understanding and application of large studies of surgical duration may help scientists explore selected questions concerning postoperative complications.

  17. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  18. A goal programming approach for a joint design of macroeconomic and environmental policies: a methodological proposal and an application to the Spanish economy.

    PubMed

    André, Francisco J; Cardenete, M Alejandro; Romero, Carlos

    2009-05-01

    Economic policy needs to pay increasingly more attention to environmental issues, which requires the development of methodologies able to incorporate environmental, as well as macroeconomic, goals in the design of public policies. Starting from this observation, this article proposes a methodology, based upon a Simonian satisficing logic made operational with the help of goal programming (GP) models, to address the joint design of macroeconomic and environmental policies. The methodology is applied to the Spanish economy, where a joint policy is elicited taking into consideration macroeconomic goals (economic growth, inflation, unemployment, public deficit) and environmental goals (CO2, NOx and SOx emissions) within the context of a computable general equilibrium model. The results show how the government can "fine-tune" its policy according to different criteria using GP models. The resulting policies aggregate the environmental and the economic goals in different ways: maximum aggregate performance, maximum balance, and a lexicographic hierarchy of the goals.
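
    The goal-programming core can be sketched in a few lines. The coefficients, targets, bounds, and weights below are invented (the paper embeds its goals in a computable general equilibrium model of Spain); deviational variables n_i and p_i absorb under- and over-achievement of each goal, and only the unwanted deviations are penalised.

      from scipy.optimize import linprog

      # Variable order: [x1, x2, n1, p1, n2, p2]; all >= 0.
      # Two hypothetical policy instruments x1, x2; a growth goal where
      # under-achievement n1 is penalised, and an emissions goal where
      # over-achievement p2 is penalised.
      growth_row    = [0.8, 0.5, 1, -1, 0,  0]  # 0.8*x1 + 0.5*x2 + n1 - p1 = 3.0
      emissions_row = [2.0, 1.0, 0,  0, 1, -1]  # 2.0*x1 + 1.0*x2 + n2 - p2 = 8.0
      A_eq = [growth_row, emissions_row]
      b_eq = [3.0, 8.0]

      weights = [0, 0, 5, 0, 0, 1]  # minimise 5*n1 + 1*p2
      res = linprog(c=weights, A_eq=A_eq, b_eq=b_eq,
                    bounds=[(0, 4)] * 2 + [(0, None)] * 4)
      x1, x2, n1, p1, n2, p2 = res.x
      print(f"x=({x1:.2f},{x2:.2f}) growth shortfall={n1:.2f} excess emissions={p2:.2f}")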

  19. Control of autonomous ground vehicles: a brief technical review

    NASA Astrophysics Data System (ADS)

    Babak, Shahian-Jahromi; Hussain, Syed A.; Karakas, Burak; Cetin, Sabri

    2017-07-01

    This paper presents a brief review of the developments achieved in autonomous vehicle systems technology. A concise history of autonomous driver-assistance systems is presented, followed by a review of the current state-of-the-art sensor technology used in autonomous vehicles. Standard sensor fusion methods that have recently been explored are discussed. Finally, advances in embedded software methodologies that define the logic between sensory information and actuation decisions are reviewed.

  20. The futility study—progress over the last decade

    PubMed Central

    Levin, Bruce

    2015-01-01

    We review the futility clinical trial design (also known as the non-superiority design) with respect to its emergence and methodologic developments over the last decade, especially in regard to its application to clinical trials for neurological disorders. We discuss the design’s strengths as a programmatic screening device to weed out unpromising new treatments, its limitations and pitfalls, and a recent critique of the logic of the method. PMID:26123873

  1. Radiation Status of Sub-65 nm Electronics

    NASA Technical Reports Server (NTRS)

    Pellish, Jonathan A.

    2011-01-01

    Ultra-scaled complementary metal oxide semiconductor (CMOS) technology includes commercial foundry capabilities at and below the 65 nm technology node. Radiation evaluations take place using standard products and test characterization vehicles (memories, logic/latch chains, etc.). The NEPP focus is two-fold: (1) conduct early radiation evaluations to ascertain viability for future NASA missions (i.e., leverage commercial technology development), and (2) uncover gaps in current testing methodologies and mechanism comprehension for early risk mitigation.

  2. A Design Methodology for Optoelectronic VLSI

    DTIC Science & Technology

    2007-01-01

    current gets converted to a CMOS voltage level through a transimpedance amplifier circuit called a receiver. The output of the receiver is then... change the current flowing from the diode to a voltage that the logic inputs can use. That circuit is called a receiver. It is a transimpedance amplifier... incorporate random access memory circuits, SRAM or dynamic RAM (DRAM). These circuits use weak internal analog signals that are amplified by sense

  3. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  4. Exact Synthesis of Reversible Circuits Using A* Algorithm

    NASA Astrophysics Data System (ADS)

    Datta, K.; Rathi, G. K.; Sengupta, I.; Rahaman, H.

    2015-06-01

    With the growing emphasis on low-power design methodologies, and the observation that theoretically zero power dissipation is possible only if computation is information-lossless, the design and synthesis of reversible logic circuits have become very important in recent years. Reversible logic circuits are also important in the context of quantum computing, where the basic operations are reversible in nature. Several synthesis methodologies for reversible circuits have been reported. Some of these methods are termed exact, where the motivation is to obtain the minimum-gate realization of a given reversible function. These methods are computationally very intensive and are able to synthesize only very small functions. There are other methods, based on function transformations or higher-level representations of functions such as binary decision diagrams or exclusive-or sum-of-products, that are able to handle much larger circuits without any guarantee of optimality or near-optimality. The design of exact synthesis algorithms is interesting in this context because they set benchmarks against which other methods can be compared. This paper proposes an exact synthesis approach based on an iterative deepening version of the A* algorithm using the multiple-control Toffoli gate library. Experimental results are presented with comparisons against other exact and some heuristic-based synthesis approaches.
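
    For illustration, a minimal version of such an exact search can be written compactly. The sketch below performs iterative-deepening depth-first search over multiple-control Toffoli (MCT) gates with a zero heuristic; the paper's A* variant would additionally prune with an admissible lower bound. The gate encoding and the 2-bit example are illustrative choices, not the paper's implementation.

      from itertools import combinations

      def apply_mct(perm, controls, target):
          """Compose a multiple-control Toffoli gate after the permutation `perm`.

          `perm` encodes an n-bit reversible function as a tuple: input x maps to
          perm[x]. The gate flips bit `target` of the output when all `controls`
          bits of the output are 1 (an empty control set is a NOT gate).
          """
          mask = sum(1 << c for c in controls)
          return tuple(v ^ (1 << target) if (v & mask) == mask else v for v in perm)

      def synthesize(perm, n, max_gates=6):
          """Iterative-deepening search for a minimal MCT gate sequence.

          Returns a list of (controls, target) gates whose composition maps `perm`
          to the identity; read in reverse (MCT gates are self-inverse) it
          realizes `perm`. This is iterative deepening with h = 0; an A* variant
          would use an admissible heuristic instead.
          """
          identity = tuple(range(2 ** n))

          def dfs(p, depth, seq):
              if p == identity:
                  return seq
              if depth == 0:
                  return None
              for target in range(n):
                  others = [q for q in range(n) if q != target]
                  for r in range(n):
                      for controls in combinations(others, r):
                          found = dfs(apply_mct(p, controls, target), depth - 1,
                                      seq + [(controls, target)])
                          if found is not None:
                              return found
              return None

          for limit in range(max_gates + 1):
              result = dfs(perm, limit, [])
              if result is not None:
                  return result
          return None

      # 2-bit example: the SWAP function (0,2,1,3) needs three CNOT-like gates.
      print(synthesize((0, 2, 1, 3), n=2))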

  5. Assessing the reliability and validity of anti-tobacco attitudes/beliefs in the context of a campaign strategy.

    PubMed

    Arheart, Kristopher L; Sly, David F; Trapido, Edward J; Rodriguez, Richard D; Ellestad, Amy J

    2004-11-01

    To identify multi-item attitude/belief scales associated with the theoretical foundations of an anti-tobacco counter-marketing campaign and assess their reliability and validity. The data analyzed are from two state-wide, random, cross-sectional telephone surveys [n(S1)=1,079, n(S2)=1,150]. Items forming attitude/belief scales are identified using factor analysis. Reliability is assessed with Cronbach's alpha. Relationships among scales are explored using Pearson correlation. Validity is assessed by testing associations derived from the Centers for Disease Control and Prevention's (CDC) logic model for tobacco control program development and evaluation linking media exposure to attitudes/beliefs, and attitudes/beliefs to smoking-related behaviors. Adjusted odds ratios are employed for these analyses. Three factors emerged: traditional attitudes/beliefs about tobacco and tobacco use, tobacco industry manipulation, and anti-tobacco empowerment. Reliability coefficients are in the range of 0.70 and vary little between age groups. The factors are correlated with one another as hypothesized. Associations between media exposure and the attitude/belief scales, and between these scales and behaviors, are consistent with the CDC logic model. Using reliable, valid multi-item scales is theoretically and methodologically more sound than employing single-item measures of attitudes/beliefs. Methodological, theoretical and practical implications are discussed.
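
    As a reference for the reliability step, Cronbach's alpha for a k-item scale is alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal computation (with made-up Likert responses, not the survey data) might look like:

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents x k_items) response matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)       # per-item variances
          total_var = items.sum(axis=1).var(ddof=1)   # variance of scale totals
          return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

      # Toy 5-point Likert responses (illustrative only, not the survey data).
      responses = np.array([[4, 5, 4], [2, 2, 3], [5, 4, 5], [3, 3, 2], [4, 4, 4]])
      print(round(cronbach_alpha(responses), 2))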

  6. Structural knowledge learning from maps for supervised land cover/use classification: Application to the monitoring of land cover/use maps in French Guiana

    NASA Astrophysics Data System (ADS)

    Bayoudh, Meriam; Roux, Emmanuel; Richard, Gilles; Nock, Richard

    2015-03-01

    The number of satellites and sensors devoted to Earth observation has increased considerably, delivering extensive data, especially images. At the same time, access to such data and to the tools needed to process them has improved considerably. In the presence of such a data flow, automatic image interpretation methods are needed, especially for the monitoring and prediction of environmental and societal changes in highly dynamic socio-environmental contexts. This could be accomplished via artificial intelligence. The concept described here relies on the induction of classification rules that explicitly take into account structural knowledge, using Aleph, an Inductive Logic Programming (ILP) system, combined with a multi-class classification procedure. This methodology was used to monitor changes in land cover/use of the French Guiana coastline. One hundred and fifty-eight classification rules were induced from 3 diachronic land cover/use maps including 38 classes. These rules were expressed in first-order logic, which makes them easily understandable by non-experts. A 10-fold cross-validation gave significant average values of 84.62%, 99.57% and 77.22% for classification accuracy, specificity and sensitivity, respectively. Our methodology could be beneficial for automatically classifying new objects and for facilitating object-based classification procedures.

  7. Fractal dimension and fuzzy logic systems for broken rotor bar detection in induction motors at start-up and steady-state regimes

    NASA Astrophysics Data System (ADS)

    Amezquita-Sanchez, Juan P.; Valtierra-Rodriguez, Martin; Perez-Ramirez, Carlos A.; Camarena-Martinez, David; Garcia-Perez, Arturo; Romero-Troncoso, Rene J.

    2017-07-01

    Squirrel-cage induction motors (SCIMs) are key machines in many industrial applications. Monitoring their operating condition in order to avoid damage and reduce economic losses has therefore become a demanding task for industry. In the literature, several techniques and methodologies have been proposed to detect faults that affect the integrity and performance of SCIMs. However, they have focused on analyzing either the start-up transient or the steady-state operation regime, the two common operating scenarios in real practice. In this work, a novel methodology for broken rotor bar (BRB) detection in SCIMs during both start-up and steady-state operation regimes is proposed. It consists of two main steps. In the first, three-axis vibration signals are analyzed using fractal dimension (FD) theory. Since different FD-based algorithms can give different results, three algorithms are tested: Katz's FD, Higuchi's FD, and the box dimension. In the second step, a fuzzy logic system for each regime is presented for automatic diagnosis. To validate the proposal, motors with different damage levels were tested: one with a partially broken rotor bar, a second with one fully broken bar, and a third with two broken bars. The obtained results demonstrate the effectiveness of the proposed methodology.
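
    The first step can be illustrated with Katz's fractal dimension, one of the three FD estimators tested. A minimal sketch (with illustrative signals, not the motor data):

      import numpy as np

      def katz_fd(signal):
          """Katz fractal dimension of a 1-D signal (e.g., one vibration axis).

          FD = log10(n) / (log10(n) + log10(d / L)), where L is the total curve
          length, d the maximum distance from the first sample, and n = N - 1
          the number of steps. Values near 1 indicate a smooth signal; higher
          values indicate a more irregular (fault-affected) waveform.
          """
          y = np.asarray(signal, dtype=float)
          x = np.arange(len(y))
          L = np.hypot(np.diff(x), np.diff(y)).sum()   # total curve length
          d = np.hypot(x - x[0], y - y[0]).max()       # max span from first point
          n = len(y) - 1
          return np.log10(n) / (np.log10(n) + np.log10(d / L))

      # Illustrative: a clean sine versus a noisier, 'faulty-looking' trace.
      t = np.linspace(0, 1, 1000)
      print(katz_fd(np.sin(2 * np.pi * 50 * t)),
            katz_fd(np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)))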

  8. Toward a comprehensive areal model of earthquake-induced landslides

    USGS Publications Warehouse

    Miles, S.B.; Keefer, D.K.

    2009-01-01

    This paper provides a review of regional-scale modeling of earthquake-induced landslide hazard with respect to the needs for disaster risk reduction and sustainable development. Based on this review, it sets out important research themes and suggests computing with words (CW), a methodology that includes fuzzy logic systems, as a fruitful modeling methodology for addressing many of these research themes. A range of research, reviewed here, has been conducted applying CW to various aspects of earthquake-induced landslide hazard zonation, but none facilitate comprehensive modeling of all types of earthquake-induced landslides. A new comprehensive areal model of earthquake-induced landslides (CAMEL) is introduced here that was developed using fuzzy logic systems. CAMEL provides an integrated framework for modeling all types of earthquake-induced landslides using geographic information systems. CAMEL is designed to facilitate quantitative and qualitative representation of terrain conditions and knowledge about these conditions on the likely areal concentration of each landslide type. CAMEL is highly modifiable and adaptable; new knowledge can be easily added, while existing knowledge can be changed to better match local knowledge and conditions. As such, CAMEL should not be viewed as a complete alternative to other earthquake-induced landslide models. CAMEL provides an open framework for incorporating other models, such as Newmark's displacement method, together with previously incompatible empirical and local knowledge. © 2009 ASCE.

  9. Medical privacy protection based on granular computing.

    PubMed

    Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng

    2004-10-01

    Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains upon receiving the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000, and Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and for the previous bin-size and logical approaches. The two new criteria are implemented in a prototype system, Cellsecu 2.0. A preliminary system performance evaluation is conducted and reviewed.

  10. Utilization of accident databases and fuzzy sets to estimate frequency of HazMat transport accidents.

    PubMed

    Qiao, Yuanhua; Keren, Nir; Mannan, M Sam

    2009-08-15

    Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
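
    A minimal sketch of the two-part estimate follows: a negative binomial GLM supplies the route-dependent base frequency, and a crude fuzzy membership acts as a route-independent multiplier. The data, covariates, and membership breakpoints are all synthetic stand-ins, not the DPS database or the paper's expert rules.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)

      # Synthetic route-segment data: accident counts vs. route-dependent
      # covariates (traffic volume index, curvature index).
      n = 500
      X = np.column_stack([rng.uniform(1, 10, n), rng.uniform(0, 1, n)])
      mu = np.exp(-2.0 + 0.25 * X[:, 0] + 0.8 * X[:, 1])
      y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

      model = sm.GLM(y, sm.add_constant(X),
                     family=sm.families.NegativeBinomial()).fit()
      base_rate = model.predict(sm.add_constant(X))   # route-dependent frequency

      def weather_modifier(severity):
          """Route-independent effect as a crude fuzzy multiplier in [1, 2]:
          membership of 'bad weather' rises linearly from severity 0.3 to 0.8."""
          membership = np.clip((severity - 0.3) / 0.5, 0.0, 1.0)
          return 1.0 + membership   # full membership doubles the base frequency

      adjusted = base_rate * weather_modifier(severity=0.6)
      print(model.params.round(3), adjusted[:3].round(4))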

  11. Initiator-catalyzed self-assembly of duplex-looped DNA hairpin motif based on strand displacement reaction for logic operations and amplified biosensing.

    PubMed

    Bi, Sai; Yue, Shuzhen; Wu, Qiang; Ye, Jiayan

    2016-09-15

    Here we program an initiator-catalyzed self-assembly of a duplex-looped DNA hairpin motif based on a strand displacement reaction. Owing to the recycling of the initiator and performance in a cascade manner, this system is versatilely extended to logic operations, including the construction of concatenated logic circuits with a feedback function and a biocomputing keypad-lock security system. Compared with previously reported molecular security systems, the prominent feature of our keypad lock is that it can be spontaneously reset and recycled without any external stimulus or human intervention. Moreover, by integrating an isothermal amplification technique, rolling circle amplification (RCA), this programmed catalytic DNA self-assembly strategy readily achieves sensitive and selective biosensing of the initiator. Importantly, a magnetic graphene oxide (MGO) is introduced to remarkably reduce the background, which plays an important role in enhancing the signal-to-noise ratio and improving the detection sensitivity. Therefore, the proposed DNA strand displacement-based methodology with engineered dynamic functions may find broad applications in the construction of programmable DNA nanostructures, amplified biosensing platforms, and large-scale DNA circuits. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Contemporary leadership in healthcare organizations: fragmented or concurrent leadership.

    PubMed

    Wikstrøm, Ewa; Dellve, Lotta

    2009-01-01

    The purpose of this paper is to gain a deeper understanding of the main contemporary challenges for healthcare leaders in their everyday work practice, and the support they need to master their experienced dilemmas. Qualitative in-depth interviews (n=52), and focus-group interviews (n=6) with 31 first-line and 45 second-line healthcare leaders are analysed in line with constructivist grounded theory. In this paper, two leadership models are proposed for defining and differentiating ways of meeting different logics and demands made on leaders in the healthcare sector. The first model is leadership by separating different logics and fragmentation of time. Here, leaders express a desire for support in defining, structuring, dividing, and allocating tasks. The second model is leadership by integrating different logics and currentness of solutions. In this case, leaders want support in strengthening proactive leadership and shaping the basis for participative employeeship. This research is designed to describe what people experience rather than to assess the frequency of that experience in the studied settings. However, it would be interesting to elaborate on the findings of this study using other research methodologies. The findings contribute to contextual knowledge that is of relevance in supporting healthcare leaders. This is helpful in identifying important conditions that support the establishment of leadership and employeeship, leading to improvements in healthcare practice. The paper describes how contemporary leadership in the healthcare sector is constituted through different strategies for meeting multiple logics.

  13. Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence

    PubMed Central

    Han, Paul K. J.

    2014-01-01

    The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891

  14. Could Bertrand Russell's barber have bitten his own teeth? A problem of logic and definitions.

    PubMed

    Aitken, Kenneth John

    2014-08-01

    Guiding the positive evolution of behavior is an admirable goal. Wilson et al.'s arguments are based largely on studies of problem correction. The methodology is sound, but not the post hoc ergo propter hoc extrapolation. What is required is evidence that it can proactively generate positive change. The evolution of human behavior to date has been affected by many factors, including unmalleable and unpredicted environmental changes.

  15. Biomedical implications of information processing in chemical systems: non-classical approach to photochemistry of coordination compounds.

    PubMed

    Szaciłowski, Konrad

    2007-01-01

    Analogies between photoactive nitric oxide generators and various electronic devices (logic gates and operational amplifiers) are presented. These analogies have important biological consequences: the application of control parameters allows for better targeting and control of nitric oxide drugs. The same methodology may be applied in the future to other therapeutic strategies, and at the same time it helps in understanding natural regulatory and signaling processes in biological systems.

  16. Northeast Artificial Intelligence Consortium Annual Report. 1988 Inference Techniques for Knowledge Base Maintenance Using Logic Programming Methodologies. Volume 11

    DTIC Science & Technology

    1989-10-01

    OCR-garbled front-matter excerpt; recoverable details: RADC-TR-89-259, Vol XI (of twelve), Interim Report, October 1989, Northeast Artificial Intelligence Consortium (NAIC) Annual Report; monitoring organization: Rome Air Development Center.

  17. A Methodology for Formal Hardware Verification, with Application to Microprocessors.

    DTIC Science & Technology

    1993-08-29

    concurrent programming languages. Proceedings of the NATO Advanced Study Institute on Logics and Models of Concurrent Systems (Colle-sur-Loup, France, 8-19... restricted class of formulas. Bose and Fisher [26] developed a symbolic model checker based on a Cosmos switch-level model. Their modeling approach... verification using SDVS--the method and a case study. 17th Annual Microprogramming Workshop (New Orleans, LA, 30 October-2 November 1984). Published as

  18. Digital electronic engine control fault detection and accommodation flight evaluation

    NASA Technical Reports Server (NTRS)

    Baer-Ruedhart, J. L.

    1984-01-01

    The capabilities and performance of various fault detection and accommodation (FDA) schemes in existing and projected engine control systems were investigated. Flight tests of the digital electronic engine control (DEEC) in an F-15 aircraft show discrepancies between flight results and predictions based on simulation and altitude testing. The FDA methodology and logic in the DEEC system are described, along with the results of the flight failures that have occurred to date.

  19. User Expectations: Nurses' Perspective.

    PubMed

    Gürsel, Güney

    2016-01-01

    Healthcare is a technology-intensive industry. Although all healthcare staff need qualified computer support, physicians and nurses need it most. As nursing practice is information intensive, understanding nurses' expectations of healthcare information systems (HCIS) is essential to meeting their needs and helping them in a better way. In this study, the perceived importance of nurses' expectations of HCIS is investigated, and two HCIS are evaluated for how well they meet nurses' expectations, using fuzzy logic methodologies.

  20. Laudan's normative naturalism: a useful philosophy of science for psychology.

    PubMed

    Capaldi, E J; Proctor, R W

    2000-01-01

    Logical positivism, widely regarded as the received epistemology of psychology in the first half of the 20th century, was supplanted in the 1960s by various postpositivistic, relativistic philosophies of science, most notably that of Kuhn. Recently, Laudan, a major figure in the philosophy of science, developed a novel approach called normative naturalism that provides an alternative to positivism and relativism. His central thesis is that the two are not always on opposite ends of a continuum but rather have many assumptions in common. This article brings Laudan's important views to the attention of psychologists and describes some of the unique implications of these views for the conduct of research and theory in psychology. These implications, which follow from a number of closely reasoned pragmatic arguments, include more realistic and appropriate evaluation of theory and methodology than has been suggested by logical positivism or relativism.

  1. Evaluation of Fuzzy-Logic Framework for Spatial Statistics Preserving Methods for Estimation of Missing Precipitation Data

    NASA Astrophysics Data System (ADS)

    El Sharif, H.; Teegavarapu, R. S.

    2012-12-01

    Spatial interpolation methods used for estimation of missing precipitation data at a site are seldom checked for their ability to preserve site and regional statistics. Such statistics are primarily defined by spatial correlations and other site-to-site statistics in a region. Preservation of site and regional statistics represents a means of assessing the validity of missing precipitation estimates at a site. This study evaluates the efficacy of a fuzzy-logic methodology for infilling missing historical daily precipitation data in preserving site and regional statistics. Rain gauge sites in the state of Kentucky, USA, are used as a case study for evaluation of this newly proposed method in comparison to traditional data-infilling techniques. Several error and performance measures are used to evaluate the methods and the trade-offs between accuracy of estimation and preservation of site and regional statistics.
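
    The evaluation idea can be sketched directly: infill the target gauge from weighted neighbors, then compare site-to-site correlations before and after infilling. The weights and synthetic rainfall below are illustrative stand-ins for the fuzzy-logic estimator and the Kentucky gauge records.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic daily rainfall at one target gauge and three neighbors.
      days, neighbors = 1000, 3
      base = rng.gamma(0.4, 5.0, size=days)
      obs = np.column_stack([base * rng.uniform(0.7, 1.3, size=days)
                             for _ in range(neighbors + 1)])
      target, nbrs = obs[:, 0], obs[:, 1:]

      missing = rng.random(days) < 0.1    # 10% of target days missing
      w = np.array([0.5, 0.3, 0.2])       # weights (fuzzy or IDW in practice)
      estimate = nbrs @ w                 # weighted-neighbor infill

      filled = np.where(missing, estimate, target)

      # Preservation check: site-to-site correlation before vs. after infilling.
      for j in range(neighbors):
          r_true = np.corrcoef(target, nbrs[:, j])[0, 1]
          r_fill = np.corrcoef(filled, nbrs[:, j])[0, 1]
          print(f"neighbor {j}: r_true={r_true:.3f}  r_filled={r_fill:.3f}")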

  2. Methodology for the systems engineering process. Volume 2: Technical parameters

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A scheme based on starting the logic networks from the development and mission factors that are of primary concern in an aerospace system is described. This approach required identifying the primary states (design, design verification, premission, mission, postmission), identifying the attributes within each state (performance capability, survival, evaluation, operation, etc), and then developing the generic relationships of variables for each branch. To illustrate this concept, a system was used that involved a launch vehicle and payload for an earth orbit mission. Examination showed that this example was sufficient to illustrate the concept. A more complicated mission would follow the same basic approach, but would have more extensive sets of generic trees and more correlation points between branches. It has been shown that in each system state (production, test, and use), a logic could be developed to order and classify the parameters involved in the translation from general requirements to specific requirements for system elements.

  3. Modeling a description logic vocabulary for cancer research.

    PubMed

    Hartel, Frank W; de Coronado, Sherri; Dionne, Robert; Fragoso, Gilberto; Golbeck, Jennifer

    2005-04-01

    The National Cancer Institute has developed the NCI Thesaurus, a biomedical vocabulary for cancer research, covering terminology across a wide range of cancer research domains. A major design goal of the NCI Thesaurus is to facilitate translational research. We describe: the features of Ontylog, a description logic used to build NCI Thesaurus; our methodology for enhancing the terminology through collaboration between ontologists and domain experts, and for addressing certain real world challenges arising in modeling the Thesaurus; and finally, we describe the conversion of NCI Thesaurus from Ontylog into Web Ontology Language Lite. Ontylog has proven well suited for constructing big biomedical vocabularies. We have capitalized on the Ontylog constructs Kind and Role in the collaboration process described in this paper to facilitate communication between ontologists and domain experts. The artifacts and processes developed by NCI for collaboration may be useful in other biomedical terminology development efforts.

  4. Implementation of Complex Biological Logic Circuits Using Spatially Distributed Multicellular Consortia

    PubMed Central

    Urrios, Arturo; de Nadal, Eulàlia; Solé, Ricard; Posas, Francesc

    2016-01-01

    Engineered synthetic biological devices have been designed to perform a variety of functions, from sensing molecules and bioremediation to energy production and biomedicine. Nevertheless, a major limitation of in vivo circuit implementation is the constraint associated with the use of standard methodologies for circuit design. The future success of these devices thus depends on obtaining circuits with scalable complexity and reusable parts. Here we show how to build complex computational devices using multicellular consortia and space as key computational elements. This spatial modular design grants scalability, since its general architecture is independent of the circuit's complexity, minimizes wiring requirements, and allows component reusability with minimal genetic engineering. The potential of this approach is demonstrated by the implementation of complex logical functions with up to six inputs, demonstrating the scalability and flexibility of this method. The potential implications of our results are outlined. PMID:26829588

  5. Designing Class Methods from Dataflow Diagrams

    NASA Astrophysics Data System (ADS)

    Shoval, Peretz; Kabeli-Shani, Judith

    A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram - and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of method design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that support user tasks. The components and the process logic of each transaction are described in detail using pseudocode. Then each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods, and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.

  6. Data cleaning methodology for monthly water-to-oil and water-to-gas production ratios in continuous resource assessments

    USGS Publications Warehouse

    Varela, Brian A.; Haines, Seth S.; Gianoutsos, Nicholas J.

    2017-01-19

    Petroleum production data are usually stored in a format that makes it easy to determine the year and month production started, whether there are any breaks, and when production ends. However, in some cases one may want to compare production runs in which production for all wells starts at month one, regardless of the year the wells started producing. This report describes the Java program that the U.S. Geological Survey developed to examine water-to-oil and water-to-gas ratios in the form of month 1, month 2, and so on, with the objective of estimating the quantities of water and proppant used in low-permeability petroleum production. The text covers the data used by the program, the challenges with production data, the program logic for checking the quality of the production data, and the program logic for checking the completeness of the data.
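
    The month-realignment step is straightforward to illustrate. The USGS program is written in Java; the Python sketch below (with hypothetical well records) shows only the core idea: number each well's producing months from 1 and guard the ratio against zero-production months.

      import pandas as pd

      # Illustrative monthly production records (hypothetical wells, not USGS data).
      df = pd.DataFrame({
          "well":  ["A", "A", "A", "B", "B"],
          "date":  pd.to_datetime(["2001-03-01", "2001-04-01", "2001-05-01",
                                   "2003-07-01", "2003-08-01"]),
          "oil":   [500.0, 420.0, 0.0, 300.0, 250.0],   # barrels
          "water": [100.0, 120.0, 90.0, 60.0, 80.0],    # barrels
      })

      df = df.sort_values(["well", "date"])
      # Realign so every well's first producing month becomes month 1,
      # regardless of calendar year.
      df["month"] = df.groupby("well").cumcount() + 1

      # Water-to-oil ratio; a zero-oil month would divide by zero, so it is
      # masked to NaN (one of the data-quality checks the program logic needs).
      df["wor"] = df["water"] / df["oil"].where(df["oil"] > 0)

      print(df[["well", "month", "wor"]])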

  7. Evolutionary fuzzy modeling human diagnostic decisions.

    PubMed

    Peña-Reyes, Carlos Andrés

    2004-05-01

    Fuzzy CoCo is a methodology, combining fuzzy logic and evolutionary computation, for constructing systems able to accurately predict the outcome of a human decision-making process while providing an understandable explanation of the underlying reasoning. Fuzzy logic provides a formal framework for constructing systems exhibiting both good numeric performance (accuracy) and linguistic representation (interpretability). However, fuzzy modeling--meaning the construction of fuzzy systems--is an arduous task, demanding the identification of many parameters. To solve it, we use evolutionary computation techniques (specifically cooperative coevolution), which are widely used to search for adequate solutions in complex spaces. We have successfully applied the algorithm to model the decision processes involved in two breast cancer diagnostic problems, the WBCD problem and the Catalonia mammography interpretation problem, obtaining systems of both high performance and high interpretability. For the Catalonia problem, an evolved system was embedded within a Web-based tool called COBRA for aiding radiologists in mammography interpretation.
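
    The kind of system Fuzzy CoCo evolves can be pictured with a two-rule fragment. In the sketch below, the triangular membership breakpoints and rule structure are invented placeholders; in Fuzzy CoCo these parameters are exactly what cooperative coevolution searches for.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with feet a, c and peak b."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      def predict(cell_size, cell_shape):
          """Tiny two-rule fuzzy classifier (made-up parameters, for illustration)."""
          # Rule 1: IF size is LARGE AND shape is IRREGULAR THEN malignant
          large     = tri(cell_size, 4.0, 8.0, 12.0)
          irregular = tri(cell_shape, 3.0, 7.0, 11.0)
          malignant = min(large, irregular)   # fuzzy AND = min
          # Rule 2: IF size is SMALL THEN benign
          benign = tri(cell_size, -4.0, 0.0, 4.0)
          return "malignant" if malignant > benign else "benign"

      print(predict(cell_size=9.0, cell_shape=8.0), predict(2.0, 1.0))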

  8. Segmentation of medical images using explicit anatomical knowledge

    NASA Astrophysics Data System (ADS)

    Wilson, Laurie S.; Brown, Stephen; Brown, Matthew S.; Young, Jeanne; Li, Rongxin; Luo, Suhuai; Brandt, Lee

    1999-07-01

    Knowledge-based image segmentation is defined in terms of the separation of image analysis procedures and representation of knowledge. Such architecture is particularly suitable for medical image segmentation, because of the large amount of structured domain knowledge. A general methodology for the application of knowledge-based methods to medical image segmentation is described. This includes frames for knowledge representation, fuzzy logic for anatomical variations, and a strategy for determining the order of segmentation from the modal specification. This method has been applied to three separate problems, 3D thoracic CT, chest X-rays and CT angiography. The application of the same methodology to such a range of applications suggests a major role in medical imaging for segmentation methods incorporating representation of anatomical knowledge.

  9. Composite Dry Structure Cost Improvement Approach

    NASA Technical Reports Server (NTRS)

    Nettles, Alan; Nettles, Mindy

    2015-01-01

    This effort demonstrates that, by focusing only on properties of relevance, composite interstage and shroud structures can be placed on the Space Launch System vehicle in a way that simultaneously reduces cost, improves reliability, and maximizes performance, thus providing the Advanced Development Group with a new methodology for utilizing composites to reduce the weight of composite structures on launch vehicles. Interstage and shroud structures were chosen since both are simple in configuration and do not experience extreme environments (such as cryogenic or hot-gas temperatures), and so should represent a good starting point for flying composites on a 'man-rated' vehicle. They are used as an example only. The project involves using polymer matrix composites for launch vehicle structures and presents the logic and rationale behind the proposed new methodology.

  10. Role of scientific data in health decisions.

    PubMed Central

    Samuels, S W

    1979-01-01

    The distinction between reality and models or methodological assumptions is necessary for an understanding of the use of data--economic, technical or biological--in decision-making. The traditional modes of analysis used in decisions are discussed historically and analytically. Utilitarian-based concepts such as cost-benefit analysis and cannibalistic concepts such as "acceptable risk" are rejected on logical and moral grounds. Historical reality suggests the concept of socially necessary risk determined through the dialectic process in democracy. PMID:120251

  11. Prediction of Compressional, Shear, and Stoneley Wave Velocities from Conventional Well Log Data Using a Committee Machine with Intelligent Systems

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa

    2012-01-01

    Measurement of compressional, shear, and Stoneley wave velocities, carried out by dipole sonic imager (DSI) logs, provides invaluable data for geophysical interpretation, geomechanical studies, and hydrocarbon reservoir characterization. The present study proposes an improved methodology for making a quantitative formulation between conventional well logs and sonic wave velocities. First, sonic wave velocities were predicted from conventional well logs using artificial neural network, fuzzy logic, and neuro-fuzzy algorithms. Subsequently, a committee machine with intelligent systems was constructed by virtue of a hybrid genetic algorithm-pattern search technique, while the outputs of the artificial neural network, fuzzy logic, and neuro-fuzzy models were used as inputs of the committee machine. The committee machine is capable of improving the accuracy of the final prediction by integrating the outputs of the aforementioned intelligent systems. The hybrid genetic algorithm-pattern search tool, embodied in the structure of the committee machine, assigns a weight factor to each individual intelligent system, indicating its involvement in the overall prediction of the DSI parameters. This methodology was implemented in the Asmari formation, the major carbonate reservoir rock of an Iranian oil field. A group of 1,640 data points was used to construct the intelligent model, and a group of 800 data points was employed to assess the reliability of the proposed model. The results showed that the committee machine with intelligent systems performed more effectively than the individual intelligent systems performing alone.
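
    The committee step reduces to finding a weight per expert. As a stand-in for the hybrid genetic algorithm-pattern search used in the paper, the sketch below fits non-negative, sum-to-one weights by constrained least squares on synthetic expert outputs.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(42)

      # Stand-ins for the three experts' predictions on 800 validation points
      # (synthetic velocities, illustrative only; not the Asmari data).
      truth = rng.normal(2500, 300, size=800)
      preds = np.column_stack([truth + rng.normal(0, 120, 800),   # "ANN"
                               truth + rng.normal(0, 180, 800),   # "fuzzy logic"
                               truth + rng.normal(0, 150, 800)])  # "neuro-fuzzy"

      def mse(w):
          return np.mean((preds @ w - truth) ** 2)

      # The paper uses a hybrid GA-pattern search; SLSQP with a sum-to-one
      # constraint serves here only as a simple stand-in optimizer.
      res = minimize(mse, x0=np.full(3, 1 / 3), method="SLSQP",
                     bounds=[(0, 1)] * 3,
                     constraints={"type": "eq", "fun": lambda w: w.sum() - 1})
      print("committee weights:", res.x.round(3), "MSE:", round(res.fun, 1))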

  12. The origin of life and its methodological challenge.

    PubMed

    Wächtershäuser, G

    1997-08-21

    The problem of the origin of life is discussed from a methodological point of view as an encounter between the teleological thinking of the historian and the mechanistic thinking of the chemist; and as the Kantian task of replacing teleology by mechanism. It is shown how the Popperian situational logic of historic understanding and the Popperian principle of explanatory power of scientific theories, when jointly applied to biochemistry, lead to a methodology of biochemical retrodiction, whereby common precursor functions are constructed for disparate successor functions. This methodology is exemplified by central tenets of the theory of the chemo-autotrophic origin of life: the proposal of a surface metabolism with a two-dimensional order; the basic polarity of life with negatively charged constituents on positively charged mineral surfaces; the surface-metabolic origin of phosphorylated sugar metabolism and nucleic acids; the origin of membrane lipids and of chemi-osmosis on pyrite surfaces; and the principles of the origin of the genetic machinery. The theory presents the early evolution of life as a process that begins with chemical necessity and winds up in genetic chance.

  13. A Sizing Methodology for the Conceptual Design of Blended-Wing-Body Transports. Degree awarded by George Washington Univ.

    NASA Technical Reports Server (NTRS)

    Kimmel, William M. (Technical Monitor); Bradley, Kevin R.

    2004-01-01

    This paper describes the development of a methodology for sizing Blended-Wing-Body (BWB) transports and how the capabilities of the Flight Optimization System (FLOPS) have been expanded using that methodology. In this approach, BWB transports are sized based on the number of passengers in each class that must fit inside the centerbody or pressurized vessel. Weight estimation equations for this centerbody structure were developed using Finite Element Analysis (FEA). This paper shows how the sizing methodology has been incorporated into FLOPS to enable the design and analysis of BWB transports. Previous versions of FLOPS did not have the ability to accurately represent or analyze BWB configurations in any reliable, logical way. The expanded capabilities allow the design and analysis of a 200 to 450-passenger BWB transport or the analysis of a BWB transport for which the geometry is already known. The modifications to FLOPS resulted in differences of less than 4 percent for the ramp weight of a BWB transport in this range when compared to previous studies performed by NASA and Boeing.

  14. The logic-bias effect: The role of effortful processing in the resolution of belief-logic conflict.

    PubMed

    Howarth, Stephanie; Handley, Simon J; Walsh, Clare

    2016-02-01

    According to the default interventionist dual-process account of reasoning, belief-based responses to reasoning tasks are based on Type 1 processes generated by default, which must be inhibited in order to produce an effortful, Type 2 output based on the validity of an argument. However, recent research has indicated that reasoning on the basis of beliefs may not be as fast and automatic as this account claims. In three experiments, we presented participants with a reasoning task that was to be completed while they were generating random numbers (RNG). We used the novel methodology introduced by Handley, Newstead & Trippas (Journal of Experimental Psychology: Learning, Memory, and Cognition, 37, 28-43, 2011), which required participants to make judgments based upon either the validity of a conditional argument or the believability of its conclusion. The results showed that belief-based judgments produced lower rates of accuracy overall and were influenced to a greater extent than validity judgments by the presence of a conflict between belief and logic for both simple and complex arguments. These findings were replicated in Experiment 3, in which we controlled for switching demands in a blocked design. Across all three experiments, we found a main effect of RNG, implying that both instructional sets require some effortful processing. However, in the blocked design RNG had its greatest impact on logic judgments, suggesting that distinct executive resources may be required for each type of judgment. We discuss the implications of our findings for the default interventionist account and offer a parallel competitive model as an alternative interpretation for our findings.

  15. Design and simulation of programmable relational optoelectronic time-pulse coded processors as base elements for sorting neural networks

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lazareva, Maria V.

    2010-05-01

    In this paper we show that the biologically motivated concept of time-pulse encoding offers a set of advantages (a single methodological basis, universality, simplicity of tuning, learning, and programming, among others) in the creation and design of sensor systems with parallel input-output and processing for 2D-structure hybrid and next-generation neuro-fuzzy neurocomputers. We show design principles of programmable relational optoelectronic time-pulse-coded processors based on continuous logic, order logic, and temporal wave processes. We consider a structure that executes analog signal extraction and the sorting of analog and time-pulse-coded variables. We offer an optoelectronic realization of such a base relational order-logic element, consisting of time-pulse-coded photoconverters (pulse-width and pulse-phase modulators) with direct and complementary outputs, a sorting network built from logical elements, and programmable commutation blocks. We estimate the technical parameters of devices and processors based on such elements through simulation and experimental research: optical input signal power 0.2-20 uW, processing time 1-10 us, supply voltage 1-3 V, power consumption 10-100 uW, extended functional possibilities, and learning possibilities. We discuss some aspects of possible rules and principles of learning and of programmable tuning to a required function or relational operation, and the realization of hardware blocks for modifications of such processors. We show that it is possible to create sorting machines, neural networks, and hybrid data-processing systems with untraditional numerical systems and picture operands on the basis of such quasi-universal, simple hardware blocks with flexible programmable tuning.

  16. Where Are the Logical Errors in the Theory of Big Bang?

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2015-04-01

    A critical analysis of the foundations of the theory of the Big Bang is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. It is argued that the starting point of the theory of the Big Bang contains three fundamental logical errors. The first error is the assumption that a macroscopic object (having qualitative determinacy) can have an arbitrarily small size and can be in the singular state (i.e., in a state that has no qualitative determinacy). This assumption implies that the transition (macroscopic object having qualitative determinacy) --> (singular state of matter that has no qualitative determinacy) leads to loss of the information contained in the macroscopic object. The second error is the assumption that there exist the void and a boundary between matter and void. But if such a boundary existed, then it would mean that the void has dimensions and can be measured. The third error is the assumption that the singular state of matter can make a transition into the normal state without the existence of a program of qualitative and quantitative development of the matter, and without the controlling influence of another (independent) object. However, these assumptions conflict with practice and, consequently, with formal logic, rational dialectics, and cybernetics. Indeed, from the point of view of cybernetics, the transition (singular state of the Universe) --> (normal state of the Universe) would be possible only if there were a Managing Object that is outside the Universe and has full, complete, and detailed information about the Universe. Thus, the theory of the Big Bang is a scientific fiction.

  17. Training Signaling Pathway Maps to Biochemical Data with Constrained Fuzzy Logic: Quantitative Analysis of Liver Cell Responses to Inflammatory Stimuli

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.

    2011-01-01

    Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212

  18. Knowledge elicitation for an operator assistant system in process control tasks

    NASA Technical Reports Server (NTRS)

    Boy, Guy A.

    1988-01-01

    A knowledge-based system (KBS) methodology designed to study human-machine interactions and levels of autonomy in the allocation of process control tasks is presented. Users are provided with operation manuals to assist them in normal and abnormal situations. Unfortunately, operation manuals usually represent only the functioning logic of the system to be controlled. The user logic is often totally different. The focus is on a method that elicits user logic in order to refine a KBS shell called an Operator Assistant (OA). If the OA is to help the user, it is necessary to know what level of autonomy gives the optimal performance of the overall man-machine system. For example, for diagnoses that must be carried out carefully by both the user and the OA, interactions are frequent and processing is mostly sequential. Other diagnoses can be automated, in which case the OA must be able to explain its reasoning at an appropriate level of detail. The OA structure was used to design a working KBS called HORSES (Human Orbital Refueling System Expert System). Protocol analysis of pilots interacting with this system reveals that, with training, the a-priori analytical knowledge becomes more structured and the situation patterns become more complex and dynamic. This approach can improve the a-priori understanding of human and automatic reasoning.

  19. Fuzzy logic-based assessment for mapping potential infiltration areas in low-gradient watersheds.

    PubMed

    Quiroz Londoño, Orlando Mauricio; Romanelli, Asunción; Lima, María Lourdes; Massone, Héctor Enrique; Martínez, Daniel Emilio

    2016-07-01

    This paper gives an account of the design of a logic-based approach for identifying potential infiltration areas in low-gradient watersheds based on remote sensing data. This methodological framework is applied in a sector of the Pampa Plain, Argentina, which has a high level of agricultural activity and large demands for groundwater supplies. Potential infiltration sites are assessed as a function of two primary topics: hydrologic and soil conditions. The model shows the state of each evaluated subwatershed with respect to its potential contribution to infiltration, based mainly on easily measurable and commonly used parameters: drainage density, geomorphologic units, soil media, land cover, slope, and aspect (slope orientation). Mapped outputs from the logic model displayed 42% very low to low, 16% moderate, and 41% high to very high contribution to potential infiltration across the whole watershed. Subwatersheds in the upper and lower sections were identified as areas with high to very high potential infiltration according to the following features: low drainage density (<1.5 km/km(2)), arable land and pastures as the main land-cover categories, sandy clay loam to loam-clay loam soils, and the geomorphological units named poorly drained plain, channelized drainage plain, and dunes and beaches. Copyright © 2016 Elsevier Ltd. All rights reserved.
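
    The logic-model scoring can be illustrated with one hydrologic branch. The sketch below (membership breakpoints chosen loosely around the paper's <1.5 km/km2 figure; the land-cover score is a placeholder for the soil/land-cover branch) combines fuzzy memberships with a min() AND.

      import numpy as np

      def low_drainage_density(dd_km_per_km2):
          """Membership in 'low drainage density': full below 1.0 km/km2,
          zero above 2.0, linear in between (breakpoints are illustrative)."""
          return float(np.clip((2.0 - dd_km_per_km2) / 1.0, 0.0, 1.0))

      def gentle_slope(slope_pct):
          """Membership in 'gentle slope' for a low-gradient watershed."""
          return float(np.clip((3.0 - slope_pct) / 2.0, 0.0, 1.0))

      def infiltration_potential(dd, slope, landcover_score):
          """AND the hydrologic and soil/land-cover evidence with min();
          landcover_score in [0, 1] stands in for the soil/land-cover branch."""
          return min(low_drainage_density(dd), gentle_slope(slope), landcover_score)

      # A subwatershed with sparse drainage, flat terrain, and pasture cover:
      print(infiltration_potential(dd=0.9, slope=1.0, landcover_score=0.8))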

  20. Critical Analysis of the Mathematical Formalism of Theoretical Physics. V. Foundations of the Theory of Negative Numbers

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2015-04-01

    An analysis of the foundations of the theory of negative numbers is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. The statement of the problem is as follows. As is known, point O in the Cartesian coordinate system XOY determines the position of zero on the scale. The number "zero" belongs to both the scale of positive numbers and the scale of negative numbers. In this case, the following formal-logical contradiction arises: the number 0 is both a positive number and a negative number; or, equivalently, the number 0 is neither a positive number nor a negative number, i.e., the number 0 has no sign. Then the following question arises: do negative numbers exist in science and practice? A detailed analysis of the problem shows that negative numbers do not exist, because the foundations of the theory of negative numbers contradict the formal-logical laws. It is proved that: (a) all numbers have no signs; (b) the concepts "negative number" and "negative sign of a number" represent a formal-logical error; (c) the signs "plus" and "minus" are only symbols of mathematical operations. These logical errors determine the essence of the theory of negative numbers: the theory of negative numbers is a false theory.

  1. Healthcare quality improvement work: a professional employee perspective.

    PubMed

    Gadolin, Christian; Andersson, Thomas

    2017-06-12

    Purpose - The purpose of this paper is to describe and analyze conditions that influence how employees engage in healthcare quality improvement (QI) work. Design/methodology/approach - Qualitative case study based on interviews (n=27) and observations (n=10). Findings - The main conditions that influence how employees engage in healthcare QI work are professions, work structures, and working relationships. These conditions can both prevent and facilitate healthcare QI. Professions and work structures may cement existing institutional logics and thus prevent employees from engaging in healthcare QI work. However, attempts to align QI with professional logics, together with work structures that empower employees, can increase employee engagement. This can be accomplished through positive working relationships that foster institutional work, bridging competing institutional logics and making it possible to overcome the barriers that professions and work structures may constitute. Practical implications - Understanding the conditions that influence how employees engage in healthcare QI work will make initiatives more likely to succeed. Originality/value - Healthcare QI has mainly been studied from an implementer perspective, and employees have either been neglected or seen as passive resisters. Weak employee perspectives make healthcare QI research incomplete. In our research, healthcare QI work is studied closely at the actor level to understand healthcare QI from an employee perspective.

  2. Incorporating Virtual Reactions into a Logic-based Ligand-based Virtual Screening Method to Discover New Leads

    PubMed Central

    Reynolds, Christopher R; Muggleton, Stephen H; Sternberg, Michael J E

    2015-01-01

    The use of virtual screening has become increasingly central to the drug development pipeline, with ligand-based virtual screening used to screen databases of compounds to predict their bioactivity against a target. These databases can represent only a small fraction of chemical space, and this paper describes a method of exploring synthetic space by applying virtual reactions to promising compounds within a database and generating focussed libraries of predicted derivatives. A ligand-based virtual screening tool, Investigational Novel Drug Discovery by Example (INDDEx), is used as the basis for a system of virtual reactions. The use of virtual reactions is estimated to open up a potential space of 1.21×10^12 molecules. A de novo design algorithm known as Partial Logical-Rule Reactant Selection (PLoRRS) is introduced and incorporated into the INDDEx methodology. PLoRRS uses logical rules from the INDDEx model to select reactants for the de novo generation of potentially active products. The PLoRRS method is found to significantly increase the likelihood of retrieving molecules similar to known actives, with a p-value of 0.016. Case studies demonstrate that the virtual reactions produce molecules highly similar to known actives, including known blockbuster drugs. PMID:26583052

  3. Multi-Criteria Decision-Making Methods and Their Applications for Human Resources

    NASA Astrophysics Data System (ADS)

    D'Urso, M. G.; Masi, D.

    2015-05-01

    Both within the education and training field and within the labor market, Multi-Criteria Decision Methods (MCDM) provide significant support to the management of human resources, where the best choice among several alternatives can be very complex. This contribution addresses fuzzy logic in multi-criteria decision techniques, since these have several applications in the management of human resources, with the advantage of ruling out mistakes due to the subjectivity of the person in charge of making a choice. Evaluating educational achievements, as well as identifying the professional profile of the technician most suitable for a job in a firm, industry, or professional office, are valuable examples. For all of the previous issues, subjectivity is a fundamental aspect, so that fuzzy logic, owing to the very meaning of the word fuzzy, should be the preferred choice. However, this alone is not sufficient to justify its use; the fuzzy technique must also make the system of evaluation and choice more effective and objective. The methodological structure of the multi-criteria fuzzy criterion is hierarchic and allows one to select the best alternatives in all those cases in which several alternatives are possible; thus, the optimal choice can be achieved by analyzing the different scopes of each criterion and sub-criterion as well as the relevant weights.
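
    A minimal fuzzy weighted-sum ranking, of the kind such hierarchic methods build on, can be sketched as follows. The criteria, weights, and triangular scores are invented for illustration; a full fuzzy-AHP treatment would also derive the weights from pairwise comparisons.

      import numpy as np

      # Candidate ranking with a weighted sum over fuzzy (triangular) scores.
      criteria_weights = np.array([0.5, 0.3, 0.2])  # education, experience, languages

      # Each criterion scored as a triangular fuzzy number (low, mode, high).
      candidates = {
          "candidate_1": np.array([[6, 7, 8], [4, 5, 7], [7, 8, 9]], dtype=float),
          "candidate_2": np.array([[7, 8, 9], [3, 4, 5], [5, 6, 8]], dtype=float),
      }

      def defuzzify(tfn):
          """Centroid of a triangular fuzzy number: (low + mode + high) / 3."""
          return tfn.sum(-1) / 3.0

      for name, scores in candidates.items():
          crisp = defuzzify(scores)   # one crisp score per criterion
          print(name, round(float(criteria_weights @ crisp), 2))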

  4. On the Formal-Logical Analysis of the Foundations of Mathematics Applied to Problems in Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2016-03-01

    An analysis of the foundations of mathematics as applied to problems in physics is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. It is shown that a critical analysis of the concept of mathematical quantity - the central concept of mathematics - leads to the following conclusions: (1) The concept of "mathematical quantity" is the result of the following mental operations: (a) abstraction of the "quantitative determinacy of a physical quantity" from the "physical quantity", whereby the "quantitative determinacy of the physical quantity" becomes an independent object of thought; (b) abstraction of the "amount (i.e., abstract number)" from the "quantitative determinacy of the physical quantity", whereby the "amount (i.e., abstract number)" becomes an independent object of thought. In this case, unnamed, abstract numbers are the only sign of the "mathematical quantity". This sign is not an essential sign of material objects. (2) The concept of mathematical quantity is a meaningless, erroneous, and inadmissible concept in science, because it represents the following formal-logical and dialectical-materialistic error: negation of the existence of the essential sign of the concept (i.e., negation of the existence of the essence of the concept) and negation of the existence of a measure of the material object.

  5. Spanish methodological approach for biosphere assessment of radioactive waste disposal.

    PubMed

    Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C

    2007-10-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.

  6. Characterization of the faulted behavior of digital computers and fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Miner, Paul S.

    1989-01-01

    A development status evaluation is presented for efforts conducted at NASA-Langley since 1977, toward the characterization of the latent fault in digital fault-tolerant systems. Attention is given to the practical, high speed, generalized gate-level logic system simulator developed, as well as to the validation methodology used for the simulator, on the basis of faultable software and hardware simulations employing a prototype MIL-STD-1750A processor. After validation, latency tests will be performed.
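
    The flavor of gate-level fault simulation is easy to sketch; the toy netlist, the injected stuck-at-1 fault and the latency measure below are invented for illustration and are far simpler than the Langley simulator:

      # Toy gate-level simulator with stuck-at fault injection (illustrative
      # netlist and fault; the NASA simulator is far more elaborate).
      from itertools import product

      NETLIST = {                      # net -> (gate, input nets), topological
          'n1': ('AND', ('a', 'b')),
          'n2': ('OR',  ('n1', 'c')),
          'out': ('NOT', ('n2',)),
      }
      GATES = {'AND': lambda x, y: x & y,
               'OR':  lambda x, y: x | y,
               'NOT': lambda x: 1 - x}

      def simulate(inputs, stuck_at=None):
          """Evaluate the netlist; stuck_at=(net, value) forces a faulty net."""
          values = dict(inputs)
          for net, (gate, ins) in NETLIST.items():
              values[net] = GATES[gate](*(values[i] for i in ins))
              if stuck_at is not None and net == stuck_at[0]:
                  values[net] = stuck_at[1]
          return values['out']

      # latency: on how many input vectors does a stuck-at-1 on n1 stay hidden?
      latent = sum(simulate(dict(zip('abc', v))) ==
                   simulate(dict(zip('abc', v)), stuck_at=('n1', 1))
                   for v in product((0, 1), repeat=3))
      print(f'fault latent on {latent} of 8 input vectors')   # 5 of 8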

  7. Unmanned Surface Sea Vehicle Power System Design and Modeling

    DTIC Science & Technology

    2005-11-29

    [Reference-list fragments only; the recoverable citations are: Singh, P., C. J. Fennie, Jr., A. J. Salkind, and D. E. Reisner, "A Fuzzy Logic Methodology to Determine State-of-Charge (SOC) in Electric and Hybrid Vehicle... Systems"; and Pritpal Singh, Craig J. Fennie, Jr., Alvin J. ..., "Estimation of Battery Charge in Photovoltaic Systems," 16th IEEE Photovoltaic Specialists Conference, pp. 513-518, 1982.]

  8. ISITE: Automatic Circuit Synthesis for Double-Metal CMOS VLSI (Very Large Scale Integrated) Circuits

    DTIC Science & Technology

    1989-12-01

    [OCR fragments from the report body and a figure; recoverable content: "...rows and columns should be minimized. There are two methodologies for achieving this objective, namely, logic minimization to..."; "...-type and N-type polysilicon (Figure 2.5(b)) and interconnecting the gates with metal at a later processing step. The two layers of aluminum available..."; figure caption: "Figure 2.5. Controlling the Threshold Voltage in..."]

  9. DLA Systems Modernization Methodology: Logical Analysis and Design Procedures

    DTIC Science & Technology

    1990-07-01

    [Extraction fragments from the report; recoverable content: "...Information Requirement would have little meaning and thus would lose its value..."; input products include an Enterprise Model Objective List; "...at the same time, the attribute is said to be multi-valued. For example, an E-R model may contain information on the languages an employee speaks..."; "...Relationship model is examined in detail to ensure that each data group contains attributes whose values are absolutely determined by their respective..."]

  10. Methodological challenges in qualitative content analysis: A discussion paper.

    PubMed

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper aims to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic in how categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation.

  11. A methodology for double patterning compliant split and design

    NASA Astrophysics Data System (ADS)

    Wiaux, Vincent; Verhaegen, Staf; Iwamoto, Fumio; Maenhoudt, Mireille; Matsuda, Takashi; Postnikov, Sergei; Vandenberghe, Geert

    2008-11-01

    Double Patterning (DP) makes it possible to further extend the use of water-immersion lithography at its maximum numerical aperture, NA=1.35. Splitting design layers so that they recombine through DP enables an effective resolution enhancement. Single polygons may need to be split (cut) depending on the pattern density and its 2D content. The split polygons recombine at so-called 'stitching points', which may affect yield because of their sensitivity to process variations. We describe a methodology to ensure robust double patterning by identifying proper split and design guidelines. Using simulations and experimental data, we discuss in particular metal1 first interconnect layers of random LOGIC and DRAM applications at 45nm half-pitch (hp) and 32nm hp, where DP may become the only timely patterning solution.
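
    Layer splitting for DP is commonly cast as 2-coloring of a conflict graph; the sketch below illustrates that general technique (an assumption about the standard formulation, not the authors' specific flow) and flags an odd cycle, which is exactly where a polygon must be cut and a stitching point introduced:

      # Split assignment as 2-coloring of a polygon conflict graph; an odd
      # cycle has no 2-coloring and marks where a polygon must be cut
      # (graph is invented for illustration).
      from collections import deque

      conflicts = {'p1': ['p2', 'p3'],
                   'p2': ['p1', 'p3'],   # p1-p2-p3: odd cycle
                   'p3': ['p1', 'p2'],
                   'p4': ['p1']}

      def two_color(graph):
          color = {}
          for start in graph:
              if start in color:
                  continue
              color[start] = 0
              queue = deque([start])
              while queue:
                  u = queue.popleft()
                  for v in graph[u]:
                      if v not in color:
                          color[v] = 1 - color[u]
                          queue.append(v)
                      elif color[v] == color[u]:
                          return None, (u, v)    # native conflict
          return color, None

      masks, clash = two_color(conflicts)
      if masks is None:
          print('no 2-coloring; introduce a split/stitch at', clash)
      else:
          print('mask assignment:', masks)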

  12. Philosophical Approaches towards Sciences of Life in Early Cybernetics

    NASA Astrophysics Data System (ADS)

    Montagnini, Leone

    2008-07-01

    The article focuses on the different conceptual and philosophical approaches towards the sciences of life operating in the backstage of early cybernetics. After a short reconstruction of the main steps characterizing the origins of cybernetics, from 1940 until 1948, the paper examines the complementary conceptual views of Norbert Wiener and John von Neumann, as a "fuzzy thinking" versus a "logical thinking", and the marked difference between the "methodological individualism" shared by both of them and the "methodological collectivism" of most of the numerous scientists of life and society attending the Macy Conferences on Cybernetics. The main thesis is that these approaches, largely invisible to the participants, were distinct and perhaps even opposed, yet they could provoke clashes as well as cooperate in a synergic way.

  13. Systemic characterization and evaluation of particle packings as initial sets for discrete element simulations

    NASA Astrophysics Data System (ADS)

    Morfa, Carlos Recarey; Cortés, Lucía Argüelles; Farias, Márcio Muniz de; Morales, Irvin Pablo Pérez; Valera, Roberto Roselló; Oñate, Eugenio

    2018-07-01

    A methodology that comprises several characterization properties for particle packings is proposed in this paper. The methodology takes into account factors such as the dimension and shape of particles, space occupation, homogeneity, connectivity and isotropy, among others. This classification and integration of several properties makes it possible to carry out a characterization process that systemically evaluates particle packings, in order to guarantee the quality of the initial meshes in discrete element simulations, at both the micro- and the macroscale. Several new properties were created, and improvements to existing ones are presented. Properties from other disciplines were adapted for use in the evaluation of particle systems. The methodology allows media to be characterized easily at the microscale (continuous geometries such as steels and rock microstructures, and discrete geometries) and at the macroscale. A global, systemic and integral system for characterizing and evaluating particle sets, based on fuzzy logic, is presented. Such a system gives researchers a unique evaluation criterion based on the aim of their research. Examples of applications are shown.

  14. Methodology for Prototyping Increased Levels of Automation for Spacecraft Rendezvous Functions

    NASA Technical Reports Server (NTRS)

    Hart, Jeremy J.; Valasek, John

    2007-01-01

    The Crew Exploration Vehicle necessitates higher levels of automation than previous NASA vehicles, due to program requirements for automation, including Automated Rendezvous and Docking. Studies of spacecraft development often point to the locus of decision-making authority between humans and computers (i.e. automation) as a prime driver for cost, safety, and mission success. Therefore, a critical component in the Crew Exploration Vehicle development is the determination of the correct level of automation. To identify the appropriate levels of automation and autonomy to design into a human space flight vehicle, NASA has created the Function-specific Level of Autonomy and Automation Tool. This paper develops a methodology for prototyping increased levels of automation for spacecraft rendezvous functions. This methodology is used to evaluate the accuracy of the Function-specific Level of Autonomy and Automation Tool specified levels of automation, via prototyping. Spacecraft rendezvous planning tasks are selected and then prototyped in Matlab using Fuzzy Logic techniques and existing Space Shuttle rendezvous trajectory algorithms.

  15. Legitimation dynamics: How SROI could mobilize resources for new constituencies.

    PubMed

    Cooney, Kate

    2017-10-01

    The following critical essay on the social return on investment (SROI) methodology is broken into two parts. In the first section, focusing on the categorization dynamics of the SROI, I review a set of methodological and ethical tensions surrounding the SROI, using examples from my own work and other published works using SROI. These tensions include the fact that the project requires standardization to achieve comparability while concurrently offering flexibility in constructing a narrative of impact that is attractive to users. In the second section, focusing on the legitimation dynamics, I define a narrow scope within which, despite the aforementioned pitfalls, the SROI can be quite effective in building a rhetorical argument for directing material resources. The essay argues that despite ongoing methodological challenges, the investor lens and market logic undergirding the metric provide a powerful frame for persuasion that can be used to construct worthiness and value creation for constituents not already constructed as such.

  16. Commentary: “An Evaluation of Universal Grammar and the Phonological Mind”—UG Is Still a Viable Hypothesis

    PubMed Central

    Berent, Iris

    2016-01-01

    Everett (2016b) criticizes The Phonological Mind thesis (Berent, 2013a,b) on logical, methodological and empirical grounds. Most of Everett’s concerns are directed toward the hypothesis that the phonological grammar is constrained by universal grammatical (UG) principles. Contrary to Everett’s logical challenges, here I show that the UG hypothesis is readily falsifiable, that universality is not inconsistent with innateness (Everett’s arguments to the contrary are rooted in a basic confusion of the UG phenotype and the genotype), and that its empirical evaluation does not require a full evolutionary account of language. A detailed analysis of one case study, the syllable hierarchy, presents a specific demonstration that people have knowledge of putatively universal principles that are unattested in their language and these principles are most likely linguistic in nature. Whether Universal Grammar exists remains unknown, but Everett’s arguments hardly undermine the viability of this hypothesis. PMID:27471480

  17. On the Correct Formulation of the Law of the External Photoelectric Effect

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2017-01-01

    A critical and correct scientific analysis of the generally accepted theory of the external photoelectric effect is proposed. The methodological basis for the analysis is the unity of formal logic and of rational dialectics. It is shown that Einstein's formulation of the law of the photoelectric effect is not free from the following objection: the terms of Einstein's formula characterize the quantitative determinacy (i.e., energy) that belongs and relates to different material objects, the "photon", the "electron in the metal", and the "electron outside the metal". This signifies that Einstein's formula represents a violation of the formal-logical laws of identity and of the absence (lack) of contradiction. A correct mathematical formulation of the law of the external photoelectric effect within the framework of the system approach is proposed. This formulation represents the proportion between the relative increments of the energy of the incident photon and the energy of the emitted electron. The proportion describes the linear relationship between the energy of the incident photon and the energy of the emitted electron.
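
    One plausible rendering of the closing claim, in notation of my own choosing (the abstract does not give the paper's actual formula): equating the relative increments and integrating yields a linear, proportional law,

      \frac{dE_{e}}{E_{e}} = \frac{dE_{ph}}{E_{ph}}
      \quad\Longrightarrow\quad
      \ln E_{e} = \ln E_{ph} + C
      \quad\Longrightarrow\quad
      E_{e} = k\, E_{ph},

    where E_ph is the energy of the incident photon, E_e the energy of the emitted electron, and k a constant.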

  18. Expanding a First-Order Logic Mitigation Framework to Handle Multimorbid Patient Preferences

    PubMed Central

    Michalowski, Martin; Wilk, Szymon; Rosu, Daniela; Kezadri, Mounira; Michalowski, Wojtek; Carrier, Marc

    2015-01-01

    The increasing prevalence of multimorbidity is a challenge for physicians who have to manage a constantly growing number of patients with simultaneous diseases. Adding to this challenge is the need to incorporate patient preferences as key components of the care process, thanks in part to the emergence of personalized and participatory medicine. In our previous work we proposed a framework employing first order logic to represent clinical practice guidelines (CPGs) and to mitigate possible adverse interactions when concurrently applying multiple CPGs to a multimorbid patient. In this paper, we describe extensions to our methodological framework that (1) broaden our definition of revision operators to support required and desired types of revisions defined in secondary knowledge sources, and (2) expand the mitigation algorithm to apply revisions based on their type. We illustrate the capabilities of the expanded framework using a clinical case study of a multimorbid patient with stable cardiac artery disease who suffers a sudden onset of deep vein thrombosis. PMID:26958226

  19. Fault detection and accommodation testing on an F100 engine in an F-15 airplane

    NASA Technical Reports Server (NTRS)

    Myers, L. P.; Baer-Riedhart, J. L.; Maxwell, M. D.

    1985-01-01

    The fault detection and accommodation (FDA) methodology for digital engine-control systems may range from simple comparisons of redundant parameters to the more complex and sophisticated observer models of the entire engine system. Evaluations of the various FDA schemes are done using analytical methods, simulation, and limited-altitude-facility testing. Flight testing of the FDA logic has been minimal because of the difficulty of inducing realistic faults in flight. A flight program was conducted to evaluate the fault detection and accommodation capability of a digital electronic engine control in an F-15 aircraft. The objective of the flight program was to induce selected faults and evaluate the resulting actions of the digital engine controller. Comparisons were made between the flight results and predictions. Several anomalies were found in flight and during the ground test. Simulation results showed that the inducement of dual pressure failures was not feasible since the FDA logic was not designed to accommodate these types of failures.
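
    The simplest scheme mentioned, comparison of redundant parameters, can be sketched as a persistence-filtered miscompare monitor (the threshold, persistence count and traces below are illustrative, not the F100 values):

      # Persistence-filtered miscompare of two redundant channels
      # (threshold, persistence count and data are illustrative).
      def fda_monitor(channel_a, channel_b, threshold=5.0, persistence=3):
          miscompares = 0
          for a, b in zip(channel_a, channel_b):
              miscompares = miscompares + 1 if abs(a - b) > threshold else 0
              yield miscompares >= persistence   # fault declared when True

      p_a = [100.0, 101.0, 102.0, 110.0, 111.0, 112.0]   # drifting channel
      p_b = [100.5, 100.8, 101.5, 101.0, 101.2, 101.1]
      print(list(fda_monitor(p_a, p_b)))   # fault flagged on the last sample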

  1. Community science, philosophy of science, and the practice of research.

    PubMed

    Tebes, Jacob Kraemer

    2005-06-01

    Embedded in community science are implicit theories on the nature of reality (ontology), the justification of knowledge claims (epistemology), and how knowledge is constructed (methodology). These implicit theories influence the conceptualization and practice of research, and open up or constrain its possibilities. The purpose of this paper is to make some of these theories explicit, trace their intellectual history, and propose a shift in the way research in the social and behavioral sciences, and community science in particular, is conceptualized and practiced. After describing the influence and decline of logical empiricism, the underlying philosophical framework for science for the past century, I summarize contemporary views in the philosophy of science that are alternatives to logical empiricism. These include contextualism, normative naturalism, and scientific realism, and propose that a modified version of contextualism, known as perspectivism, affords the philosophical framework for an emerging community science. I then discuss the implications of perspectivism for community science in the form of four propositions to guide the practice of research.

  2. [Two traditions in the scientific learning of the world. A case study of creation and reception of quantum mechanics over the period 1925-1927, on the bases of discussion between Werner Heisenberg and Albert Einstein].

    PubMed

    Krajniak, Wiktor

    2014-01-01

    The purpose of this article is to analyze the discussion between Albert Einstein and Werner Heisenberg in the period 1925-1927. Their disputes, relating to the sources of scientific knowledge, its methods and the value of knowledge acquired in this way, are part of the discourse between rationalism and empiricism that is characteristic of European science. On the basis of sources and the literature on the subject, the epistemological positions of both scholars in this period are reconstructed. This episode, still poorly known, is a unique example of a scientific dispute whose range covers a broad spectrum of methodological problems associated with the historical development of science. The analysis sheds some light on the sources of the popularity of logical empiricism in the first half of the 20th century. Particular emphasis is placed on the impact of the neopositivist ideas reflected in Heisenberg's research program, which was the starting point for the Copenhagen interpretation of quantum mechanics. The main assumption of logical empiricism, that scientific knowledge is acquired only by means of empirical procedures and logical analysis of the language of science, bears, in view of the arguments voiced by Einstein, little relationship to actual research practice in the historical development of science. Einstein's criticism of Heisenberg's program provided arguments for the main critics of the neopositivist ideal and contributed to the bankruptcy of logical empiricism, thereby opening a period in which critical rationalism, arising from the criticism of neopositivism and alluding to Einstein's ideas, prospered.

  3. Programmed optoelectronic time-pulse coded relational processor as base element for sorting neural networks

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Bardachenko, Vitaliy F.; Nikolsky, Alexander I.; Lazarev, Alexander A.

    2007-04-01

    In this paper we show that the biologically motivated concept of time-pulse encoding offers a number of advantages (a single methodological basis, universality, simplicity of tuning, training and programming, among others) in the creation and design of sensor systems with parallel input-output and processing, and of 2D structures for hybrid and neuro-fuzzy neurocomputers of the next generations. We present the principles of construction of programmable relational optoelectronic time-pulse coded processors, based on continuous logic, order logic and temporal wave processes. We consider a structure that extracts the analog signal of a given grade (order) and sorts analog and time-pulse coded variables. We offer an optoelectronic realization of such basic relational elements of order logic, consisting of time-pulse coded phototransformers (pulse-width and pulse-phase modulators) with direct and complementary outputs, a sorting network built from logical elements, and programmable commutation blocks. By simulation and experimental research we estimate the basic technical parameters of such devices and of processors based on them: power of optical input signals of 0.200-20 μW, processing time of microseconds, supply voltage of 1.5-10 V, and consumption power of hundreds of microwatts per element, with extended functional and training possibilities. We discuss possible rules and principles for training and programmable tuning to a required function or relational operation, and the realization of hardware blocks for modifications of such processors. We show that, on the basis of such quasi-universal hardware blocks and simple, flexible programmable tuning, it is possible to create sorting machines, neural networks and hybrid data-processing systems with untraditional numeral systems and picture operands.
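
    The "sorting network built from logical elements" can be illustrated in software with an odd-even transposition network, where each phase is a parallel bank of 2-input min/max comparators of the kind the authors realize optically (a generic sketch, not the paper's circuit):

      # Odd-even transposition network: each phase is a parallel bank of
      # 2-input min/max comparator elements.
      def odd_even_transposition(values):
          v = list(values)
          n = len(v)
          for phase in range(n):                 # n phases suffice
              for i in range(phase % 2, n - 1, 2):
                  lo, hi = min(v[i], v[i + 1]), max(v[i], v[i + 1])
                  v[i], v[i + 1] = lo, hi        # one comparator element
          return v

      print(odd_even_transposition([7, 3, 9, 1, 4]))   # [1, 3, 4, 7, 9]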

  4. Intelligent Machines in the 21st Century: Automating the Processes of Inference and Inquiry

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2003-01-01

    The last century saw the application of Boolean algebra toward the construction of computing machines, which work by applying logical transformations to information contained in their memory. The development of information theory and the generalization of Boolean algebra to Bayesian inference have enabled these computing machines, in the last quarter of the twentieth century, to be endowed with the ability to learn by making inferences from data. This revolution is just beginning as new computational techniques continue to make difficult problems more accessible. However, modern intelligent machines work by inferring knowledge using only their pre-programmed prior knowledge and the data provided; they lack the ability to ask questions, or request data that would aid their inferences. Recent advances in understanding the foundations of probability theory have revealed implications for areas other than logic. Of relevance to intelligent machines, we identified the algebra of questions as the free distributive algebra, which now allows us to work with questions in a way analogous to the way in which Boolean algebra enables us to work with logical statements. In this paper we describe this logic of inference and inquiry using the mathematics of partially ordered sets and the scaffolding of lattice theory, discuss the far-reaching implications of the methodology, and demonstrate its application with current examples in machine learning. Automation of both inference and inquiry promises to allow robots to perform science in the far reaches of our solar system and in other star systems by enabling them not only to make inferences from data, but also to decide which question to ask, experiment to perform, or measurement to take given what they have learned and what they are designed to understand.
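
    The inference half is easy to sketch: a Bayesian update generalizes a Boolean truth value to a degree of belief (the priors and likelihoods below are invented); the inquiry half would then select the question whose answer is expected to be most informative:

      # Bayesian update: a degree of belief over hypotheses replaces a
      # Boolean truth value (priors and likelihoods are invented).
      def bayes_update(prior, likelihood):
          """prior: {h: p(h)}; likelihood: {h: p(data | h)}."""
          joint = {h: prior[h] * likelihood[h] for h in prior}
          z = sum(joint.values())               # evidence p(data)
          return {h: p / z for h, p in joint.items()}

      prior = {'h1': 0.7, 'h2': 0.3}
      posterior = bayes_update(prior, {'h1': 0.2, 'h2': 0.9})
      print(posterior)   # belief shifts toward h2: ~{'h1': 0.34, 'h2': 0.66}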

  5. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    PubMed

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been proposed for predicting software reliability, but they are restricted to particular methodologies and to limited numbers of parameters. A number of techniques and methodologies may be used for reliability prediction, and parameter selection deserves particular attention, since the reliability of a system may increase or decrease depending on the parameters used; there is thus a need to identify the factors that most heavily affect the reliability of the system. Reusability, now used widely across research areas, is the basis of Component-Based Systems (CBS); cost, time and human skill can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness, and many possibilities exist for applying soft computing techniques to problems in medicine: clinical medicine makes significant use of fuzzy-logic and neural-network methodologies, while basic medical science most frequently uses neural networks combined with genetic algorithms, and medical scientists have shown considerable interest in applying soft computing methodologies in genetics, physiology, radiology, cardiology and neurology. CBSE encourages users to reuse past and existing software when making new products, providing quality with savings of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques, namely Genetic Algorithms (GA), Neural Networks (NN), Fuzzy Logic, Support Vector Machines (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). It presents the working of these techniques, assesses them for reliability prediction, and discusses the parameters considered when estimating and predicting reliability. This study can be used in the estimation and prediction of the reliability of instruments used in medical systems, and in software, computer and mechanical engineering; the concepts can be applied to both software and hardware to predict reliability using CBSE.

  6. Risk assessment techniques with applicability in marine engineering

    NASA Astrophysics Data System (ADS)

    Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.

    2015-11-01

    Nowadays risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of a business; a passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since managing risk first requires analyzing and evaluating it. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and assessing them quantitatively; risk analysis methodology thus combines mutually complementary quantitative and qualitative approaches. Purpose of the work: in this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are to understand the purpose of FTA, to understand and apply the rules of Boolean algebra, to analyse a simple system using FTA, and to weigh the advantages and disadvantages of FTA. Research and methodology: the main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the top event. The steps of the analysis are: examination of the system from the top down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault-tree logic diagrams to identify the causes of the top event. Results: the study yields the critical areas, the fault-tree logic diagrams, and the probability of the top event. These results can be used for risk assessment analyses.
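
    A minimal sketch of the top-event computation described, assuming independent basic events (the tree and the probabilities below are invented for illustration):

      # Fault-tree evaluation: basic-event probabilities propagate through
      # AND/OR gates, assuming independent events (tree is illustrative).
      def p_top(node, basic):
          if isinstance(node, str):             # basic event leaf
              return basic[node]
          gate, children = node
          ps = [p_top(c, basic) for c in children]
          if gate == 'AND':                     # all children must fail
              out = 1.0
              for p in ps:
                  out *= p
              return out
          out = 1.0                             # OR: 1 - prod(1 - p_i)
          for p in ps:
              out *= 1.0 - p
          return 1.0 - out

      tree = ('OR', [('AND', ['pump_fails', 'backup_fails']), 'control_fault'])
      basic = {'pump_fails': 0.02, 'backup_fails': 0.05, 'control_fault': 0.001}
      print(f'P(top event) = {p_top(tree, basic):.6f}')   # 0.001999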

  7. The Use of a Predictive Habitat Model and a Fuzzy Logic Approach for Marine Management and Planning

    PubMed Central

    Hattab, Tarek; Ben Rais Lasram, Frida; Albouy, Camille; Sammari, Chérif; Romdhane, Mohamed Salah; Cury, Philippe; Leprieur, Fabien; Le Loc’h, François

    2013-01-01

    Bottom trawl survey data are commonly used as a sampling technique to assess the spatial distribution of commercial species. However, this sampling technique does not always correctly detect a species even when it is present, and this can create significant limitations when fitting species distribution models. In this study, we aim to test the relevance of a mixed methodological approach that combines presence-only and presence-absence distribution models. We illustrate this approach using bottom trawl survey data to model the spatial distributions of 27 commercially targeted marine species. We use an environmentally- and geographically-weighted method to simulate pseudo-absence data. The species distributions are modelled using regression kriging, a technique that explicitly incorporates spatial dependence into predictions. Model outputs are then used to identify areas that met the conservation targets for the deployment of artificial anti-trawling reefs. To achieve this, we propose the use of a fuzzy logic framework that accounts for the uncertainty associated with different model predictions. For each species, the predictive accuracy of the model is classified as ‘high’. A better result is observed when a large number of occurrences are used to develop the model. The map resulting from the fuzzy overlay shows that three main areas have a high level of agreement with the conservation criteria. These results align with expert opinion, confirming the relevance of the proposed methodology in this study. PMID:24146867

  8. Combining geographic information system, multicriteria evaluation techniques and fuzzy logic in siting MSW landfills

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Tsihrintzis, Vassilios A.; Voudrias, Evangelos; Petalas, Christos; Stravodimos, George

    2007-01-01

    This study presents a methodology for siting municipal solid waste landfills, coupling geographic information systems (GIS), fuzzy logic, and multicriteria evaluation techniques. Both exclusionary and non-exclusionary criteria are used. Factors, i.e., non-exclusionary criteria, are divided into two distinct groups which do not have the same level of trade-off. The first group comprises factors related to the physical environment, which cannot be expressed in terms of monetary cost and therefore do not trade off easily. The second group includes those factors related to human activities, i.e., socioeconomic factors, which can be expressed as financial cost and thus show a high level of trade-off. GIS is used for geographic data acquisition and processing. The analytical hierarchy process (AHP) is the multicriteria evaluation technique used, enhanced with fuzzy factor standardization. Besides assigning weights to factors through the AHP, control over the level of risk and trade-off in the siting process is achieved through a second set of weights, i.e., order weights, applied to factors in each factor group on a pixel-by-pixel basis, thus taking into account local site characteristics. The method has been applied to Evros prefecture (NE Greece), an area of approximately 4,000 km². The siting methodology results in two intermediate suitability maps, one related to environmental and the other to socioeconomic criteria. Combination of the two intermediate maps produces the final composite suitability map for landfill siting.
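
    The AHP step can be sketched as follows: factor weights are the normalized principal eigenvector of the pairwise-comparison matrix (the 3x3 matrix below is illustrative, not the study's):

      # AHP priority weights from a reciprocal pairwise-comparison matrix
      # (numpy assumed; matrix values are illustrative).
      import numpy as np

      A = np.array([[1.0, 3.0, 5.0],    # importance of factor i vs factor j
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = int(np.argmax(eigvals.real))          # principal eigenvalue index
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                              # normalized priority weights
      ci = (eigvals.real[k] - A.shape[0]) / (A.shape[0] - 1)   # consistency
      print('weights:', np.round(w, 3), ' CI:', round(float(ci), 4))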

  9. Using SysML for MBSE analysis of the LSST system

    NASA Astrophysics Data System (ADS)

    Claver, Charles F.; Dubois-Felsmann, Gregory; Delgado, Francisco; Hascall, Pat; Marshall, Stuart; Nordby, Martin; Schalk, Terry; Schumacher, German; Sebag, Jacques

    2010-07-01

    The Large Synoptic Survey Telescope is a complex hardware - software system of systems, making up a highly automated observatory in the form of an 8.4m wide-field telescope, a 3.2 billion pixel camera, and a peta-scale data processing and archiving system. As a project, the LSST is using model based systems engineering (MBSE) methodology for developing the overall system architecture coded with the Systems Modeling Language (SysML). With SysML we use a recursive process to establish three-fold relationships between requirements, logical & physical structural component definitions, and overall behavior (activities and sequences) at successively deeper levels of abstraction and detail. Using this process we have analyzed and refined the LSST system design, ensuring the consistency and completeness of the full set of requirements and their match to associated system structure and behavior. As the recursion process proceeds to deeper levels we derive more detailed requirements and specifications, and ensure their traceability. We also expose, define, and specify critical system interfaces, physical and information flows, and clarify the logic and control flows governing system behavior. The resulting integrated model database is used to generate documentation and specifications and will evolve to support activities from construction through final integration, test, and commissioning, serving as a living representation of the LSST as designed and built. We discuss the methodology and present several examples of its application to specific systems engineering challenges in the LSST design.

  10. Methodology for the specification of communication activities within the framework of a multi-layered architecture: Toward the definition of a knowledge base

    NASA Astrophysics Data System (ADS)

    Amyay, Omar

    A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue for the design of computer networks. The aim is to obtain an operational specification of the protocol service couple of a given layer. Planning synthesis and verification steps constitute a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and verification of the specification originating from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectories are proposed according to the style of the initial specification of the service protocol couple: operational type and service supplier viewpoint; knowledge property oriented type and service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic and epistemic logic. The originality of the second specification trajectory and the use of the epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge base system for implementing the method proposed. It is structured in three levels of representation of the knowledge relating to the domain, the reasoning characterizing synthesis and verification activities and the planning of the steps of a specification trajectory.

  11. European Healthy Cities evaluation: conceptual framework and methodology.

    PubMed

    de Leeuw, Evelyne; Green, Geoff; Dyakova, Mariana; Spanswick, Lucy; Palmer, Nicola

    2015-06-01

    This paper presents the methodology, programme logic and conceptual framework that drove the evaluation of the Fifth Phase (2009-14) of the WHO European Healthy Cities Network, during which 99 cities were progressively designated. The paper establishes the values, systems and aspirations that these cities sign up for, as foundations for the selection of methodology. We assert that a realist synthesis methodology, driven by a wide range of qualitative and quantitative methods, is the most appropriate perspective from which to address the wide geopolitical, demographic, population and health diversities of these cities. The paper outlines the rationale for a structured multiple case study approach, the deployment of a comprehensive questionnaire, data mining through existing databases including Eurostat, and analysis of the management information generation tools used throughout the period. Response rates were considered extremely high for this type of research. Non-response analyses are described, which show that the data are representative of cities across the spectrum of diversity. This paper provides a foundation for the further analyses of specific areas of interest presented in this supplement.

  12. [Evaluation of arguments in research reports].

    PubMed

    Botes, A

    1999-06-01

    Some authors on research methodology are of the opinion that research reports are based on the logic of reasoning and that such reports communicate with the reader by presenting logical, coherent arguments (Böhme, 1975:206; Mouton, 1996:69). This view implies that researchers draw specific conclusions and that such conclusions are justified by way of reasoning (Doppelt, 1998:105; Giere, 1984:26; Harre, 1965:11; Lehrer & Wagner, 1983; Pitt, 1988:7). The structure of a research report thus consists mainly of conclusions and the reasons for those conclusions (Booth, Colomb & Williams, 1995:97). From this it appears that justification by means of reasoning is a standard procedure in research and research reports. Despite the fact that the logic of research is based on reasoning, that the justification of research findings by way of reasoning appears to be standard procedure, and that the structure of a research report comprises arguments, the evaluation or assessment of research as described in most textbooks on research methodology (Burns & Grove, 1993:647; Creswell, 1994:193; LoBiondo-Wood & Haber, 1994:441/481) does not focus on the arguments of research. The evaluation criteria for research reports set in these textbooks relate to the way in which the research process is carried out and focus on the measures for internal, external, theoretical, measurement and inferential validity. This means that criteria for the evaluation of research are comprehensive and must be very specific for each type of research (for example quantitative or qualitative). If the evaluation of research reports focused instead on arguments and logic, there could probably be one set of universal standards against which all types of human science research reports could be assessed. Such a universal set of standards could simplify the evaluation of research reports in the human sciences, since it could be used to assess all the critical aspects of such reports. As arguments form the basic structure of research reports and are probably also important in their evaluation, the following questions, which I want to answer, are relevant to this paper: What are the standards which the reasoning in research reports in the human sciences should meet? How can research reports in the human sciences be assessed or evaluated according to these standards? In answering the first question, the logical demands made on reasoning in research are investigated. From these demands, the acceptability of the statements and the relevance and support of the premises for the conclusion are set as standards for reasoning in research. In answering the second question, a research article is used to demonstrate how the macro- and micro-arguments of research reports can be assessed or evaluated according to these standards. The evaluation indicates that the aspects of internal, external, theoretical, measurement and inferential validity can be evaluated according to these standards.

  13. Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.

    PubMed

    Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N

    2007-12-07

    A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km² study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
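
    A sketch of the DRASTIC index itself: each cell's seven hydrogeologic parameters are rated and combined with the standard DRASTIC weights (the ratings below are invented; the paper's modified version would adjust ratings and add the specific-vulnerability layer):

      # DRASTIC index for one grid cell: weighted sum of the seven rated
      # parameters, with the standard DRASTIC weights (ratings invented).
      WEIGHTS = {'D_depth': 5, 'R_recharge': 4, 'A_aquifer': 3, 'S_soil': 2,
                 'T_topography': 1, 'I_vadose': 5, 'C_conductivity': 3}
      ratings = {'D_depth': 9, 'R_recharge': 6, 'A_aquifer': 8, 'S_soil': 7,
                 'T_topography': 10, 'I_vadose': 8, 'C_conductivity': 6}

      index = sum(WEIGHTS[p] * ratings[p] for p in WEIGHTS)
      print('DRASTIC index:', index)   # higher value -> more vulnerable cell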

  14. Methodology of remote sensing data interpretation and geological applications. [Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Veneziani, P.; Dosanjos, C. E.

    1982-01-01

    Elements of photointerpretation discussed include the analysis of photographic texture and structure as well as film tonality. The method used is based on conventional techniques developed for interpreting aerial black and white photographs. By defining the properties which characterize the form and individuality of dual images, homologous zones can be identified. Guy's logic method (1966) was adapted and used on functions of resolution, scale, and spectral characteristics of remotely sensed products. Applications of LANDSAT imagery are discussed for regional geological mapping, mineral exploration, hydrogeology, and geotechnical engineering in Brazil.

  15. Looped back fiber mode for reduction of false alarm in leak detection using distributed optical fiber sensor.

    PubMed

    Chelliah, Pandian; Murgesan, Kasinathan; Samvel, Sosamma; Chelamchala, Babu Rao; Tammana, Jayakumar; Nagarajan, Murali; Raj, Baldev

    2010-07-10

    Optical-fiber-based sensors have inherent advantages over conventional sensors, such as immunity to electromagnetic interference. Distributed optical fiber sensor (DOFS) systems, such as Raman and Brillouin distributed temperature sensors, are used for leak detection, but the inherent noise of fiber-based systems leads to occasional false alarms. In this paper, a methodology is proposed to overcome this: the DOFS is operated in a looped-back fiber mode and voting logic is employed, which considerably reduces the false alarm rate.
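
    A sketch of the voting idea (threshold and traces are illustrative): in looped-back mode each physical point is interrogated from both directions along the same fiber, so two readings of the same point can vote before an alarm is raised:

      # 2-of-2 voting over a looped-back fiber: position i seen from the
      # forward direction corresponds to position n-1-i on the return path
      # (threshold and temperature traces are illustrative).
      def leak_alarm(forward_trace, backward_trace, threshold=45.0):
          n = len(forward_trace)
          alarms = []
          for i in range(n):
              vote_fwd = forward_trace[i] > threshold
              vote_bwd = backward_trace[n - 1 - i] > threshold
              alarms.append(vote_fwd and vote_bwd)   # both must agree
          return alarms

      fwd = [30.0, 31.0, 52.0, 30.5]   # deg C along the fiber, outbound
      bwd = [30.2, 50.8, 30.9, 30.1]   # same points, return direction
      print(leak_alarm(fwd, bwd))      # [False, False, True, False]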

  16. Research methodology in dentistry: Part II — The relevance of statistics in research

    PubMed Central

    Krithikadatta, Jogikalmat; Valarmathi, Srinivasan

    2012-01-01

    The lifeline of original research depends on adept statistical analysis. However, there have been reports of statistical misconduct in studies that could arise from an inadequate understanding of the fundamentals of statistics; several such reports have appeared across the medical and dental literature. This article aims at encouraging the reader to approach statistics from its logic rather than its theoretical perspective. The article also provides information on statistical misuse in the Journal of Conservative Dentistry between the years 2008 and 2011. PMID:22876003

  17. A tensor approach to modeling of nonhomogeneous nonlinear systems

    NASA Technical Reports Server (NTRS)

    Yurkovich, S.; Sain, M.

    1980-01-01

    Model following control methodology plays a key role in numerous application areas. Cases in point include flight control systems and gas turbine engine control systems. Typical uses of such a design strategy involve the determination of nonlinear models which generate requested control and response trajectories for various commands. Linear multivariable techniques provide trim about these motions; and protection logic is added to secure the hardware from excursions beyond the specification range. This paper reports upon experience in developing a general class of such nonlinear models based upon the idea of the algebraic tensor product.

  18. Waste certification program plan for Oak Ridge National Laboratory. Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1997-09-01

    This document defines the waste certification program (WCP) developed for implementation at Oak Ridge National Laboratory (ORNL). The document describes the program structure, logic, and methodology for certification of ORNL wastes. The purpose of the WCP is to provide assurance that wastes are properly characterized and that the Waste Acceptance Criteria (WAC) for receiving facilities are met. The program meets the waste certification requirements for mixed (both radioactive and hazardous) and hazardous [including polychlorinated biphenyls (PCB)] waste. Program activities will be conducted according to ORNL Level 1 document requirements.

  19. Advanced reliability modeling of fault-tolerant computer-based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1982-01-01

    Two methodologies for the reliability assessment of fault tolerant digital computer based systems are discussed. Computer-Aided Reliability Estimation, third generation (CARE III), and Gate Logic Software Simulation (GLOSS) are assessment technologies that were developed to mitigate a serious weakness in the design and evaluation process of ultrareliable digital systems. The weak link is the unavailability of a sufficiently powerful modeling technique for comparing the stochastic attributes of one system against others. Some of the more interesting attributes are reliability, system survival, safety, and mission success.

  20. Mars Rover imaging systems and directional filtering

    NASA Technical Reports Server (NTRS)

    Wang, Paul P.

    1989-01-01

    Computer literature searches were carried out at Duke University and NASA Langley Research Center. The purpose was to enhance the personal knowledge base on the technical problems of pattern recognition and image understanding which must be solved for the Mars Rover and Sample Return Mission. An intensive study of a large collection of relevant literature resulted in a compilation of the important documents in one place. Furthermore, the documents are being classified into: Mars Rover; computer vision (theory); imaging systems; pattern recognition methodologies; and other smart techniques (AI, neural networks, fuzzy logic, etc.).

  1. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    USGS Publications Warehouse

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the approximate nature of the model and known limitations, means the applicability of UCERF3 should be evaluated on a case-by-case basis.
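
    The logic-tree mechanics can be sketched compactly: each combination of branch choices defines one forecast, branch weights multiply, and hazard metrics are averaged over all combinations (the branches, weights and stand-in forecast function below are invented, not UCERF3's):

      # Epistemic uncertainty via a logic tree: enumerate branch combinations,
      # multiply branch weights, and average a forecast metric.
      from itertools import product

      branches = {
          'deformation_model': [('geologic', 0.5), ('geodetic', 0.5)],
          'scaling_relation':  [('rel_A', 0.6), ('rel_B', 0.4)],
          'probability_model': [('poisson', 0.3), ('renewal', 0.7)],
      }

      def prob_30yr(choice):
          # stand-in for a full forecast computation on one branch combination
          base = 0.20
          bump = {'geodetic': 0.03, 'rel_B': 0.02, 'renewal': 0.05}
          return base + sum(bump.get(name, 0.0) for name in choice)

      total = 0.0
      for combo in product(*branches.values()):
          names = tuple(name for name, _ in combo)
          weight = 1.0
          for _, w in combo:
              weight *= w
          total += weight * prob_30yr(names)

      print(f'{2*2*2} forecasts; weighted-mean 30-yr probability = {total:.4f}')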

  2. Application of an integrated flight/propulsion control design methodology to a STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.

    1991-01-01

    Results are presented from the application of an emerging Integrated Flight/Propulsion Control (IFPC) design methodology to a Short Take Off and Vertical Landing (STOVL) aircraft in transition flight. The steps in the methodology include designing command shaping prefilters to provide the overall desired response to pilot command inputs. A previously designed centralized controller is first validated for the integrated airframe/engine plant used; this integrated plant is derived from a different model of the engine subsystem than the one used for the centralized controller design. The centralized controller is then partitioned into a decentralized, hierarchical structure comprising airframe lateral and longitudinal subcontrollers and an engine subcontroller. Command shaping prefilters for the pilot control effector inputs are then designed, and time histories of the closed-loop IFPC system response to simulated pilot commands are compared to desired responses based on handling qualities requirements. Finally, the propulsion system safety and nonlinear limit protection logic is wrapped around the engine subcontroller, and the response of the closed-loop integrated system is evaluated for transients that encounter the propulsion surge margin limit.

  3. Estimation of the daily global solar radiation based on the Gaussian process regression methodology in the Saharan climate

    NASA Astrophysics Data System (ADS)

    Guermoui, Mawloud; Gairaa, Kacem; Rabehi, Abdelaziz; Djafer, Djelloul; Benkaciali, Said

    2018-06-01

    Accurate estimation of solar radiation is a major concern in renewable energy applications. Over the past few years, many machine learning paradigms have been proposed to improve estimation performance, mostly based on artificial neural networks, fuzzy logic, support vector machines and adaptive neuro-fuzzy inference systems. The aim of this work is the prediction of the daily global solar radiation received on a horizontal surface through the Gaussian process regression (GPR) methodology. A case study of the Ghardaïa region (Algeria) has been used to validate the methodology. Several input combinations were tested; it was found that a GPR model based on sunshine duration, minimum air temperature and relative humidity gives the best results in terms of mean absolute bias error (MBE), root mean square error (RMSE), relative root mean square error (rRMSE), and correlation coefficient (r). The obtained values of these indicators are 0.67 MJ/m2, 1.15 MJ/m2, 5.2%, and 98.42%, respectively.
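
    A sketch of this kind of GPR estimator using scikit-learn (the data are synthetic stand-ins, and the kernel choice and length-scales are assumptions, not the paper's configuration):

      # GPR on (sunshine duration, min temperature, relative humidity);
      # scikit-learn assumed, data synthetic.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(0)
      X = rng.uniform([0.0, -5.0, 10.0], [13.0, 25.0, 90.0], size=(200, 3))
      y = 1.5 * X[:, 0] + 0.2 * X[:, 1] - 0.05 * X[:, 2] \
          + rng.normal(0.0, 0.5, 200)                    # daily radiation

      kernel = ConstantKernel(1.0) * RBF(length_scale=[5.0, 10.0, 30.0])
      gpr = GaussianProcessRegressor(kernel=kernel, alpha=0.25, normalize_y=True)
      gpr.fit(X[:150], y[:150])

      pred, std = gpr.predict(X[150:], return_std=True)
      rmse = float(np.sqrt(np.mean((pred - y[150:]) ** 2)))
      print(f'hold-out RMSE = {rmse:.2f}; each prediction has a std. dev.')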

  4. Possibilities of the free-complement methodology for solving the Schrödinger equation of atoms and molecules

    NASA Astrophysics Data System (ADS)

    Nakatsuji, Hiroshi

    Chemistry is a science of the complex subjects that occupy this universe and the biological world and that are composed of atoms and molecules. Its essence is diversity. Surprisingly, however, the whole of this science is governed by simple quantum principles like the Schrödinger and Dirac equations. Therefore, if we can find a generally useful method of solving these quantum principles, under fermionic and/or bosonic constraints, accurately and at reasonable speed, we can replace the somewhat empirical methodologies of this science with purely quantum-theoretical and computational logic. This is the purpose of our series of studies, called "exact theory" in our laboratory; some of our documents are cited below. The key idea was expressed as the free complement (FC) theory (originally called ICI theory), which was introduced to solve the Schrödinger and Dirac equations analytically. For extending this methodology to larger systems, order-N methodologies are essential, but the antisymmetry constraints on electronic wave functions become a major obstacle. Recently, we have shown that the antisymmetry rule or "dogma" can be considerably relaxed when our subjects are large molecular systems. In this talk, I present our recent progress in the FC methodology. The purpose is to construct a "predictive quantum chemistry" that is useful in chemical and physical research and development in institutes and industries.

  5. [Multi-criteria decision analysis for health technology resource allocation and assessment: so far and so near?]

    PubMed

    Campolina, Alessandro Gonçalves; Soárez, Patrícia Coelho De; Amaral, Fábio Vieira do; Abe, Jair Minoro

    2017-10-26

    Multi-criteria decision analysis (MCDA) is an emerging tool that allows the integration of relevant factors for health technology assessment (HTA). This study aims to present a summary of the methodological characteristics of MCDA: definitions, approaches, applications, and implementation stages. A case study was conducted in the São Paulo State Cancer Institute (ICESP) in order to understand the perspectives of decision-makers in the process of drafting a recommendation for the incorporation of technology in the Brazilian Unified National Health System (SUS), through a report by the Brazilian National Commission for the Incorporation of Technologies in the SUS (CONITEC). Paraconsistent annotated evidential logic Eτ was the methodological approach adopted in the study, since it can serve as an underlying logic for constructs capable of synthesizing objective information (from the scientific literature) and subjective information (from experts' values and preferences in the area of knowledge). It also allows the incorporation of conflicting information (contradictions), as well as vague and even incomplete information in the valuation process, resulting from imperfection of the available scientific evidence. The method has the advantages of allowing explicit consideration of the criteria that influenced the decision, facilitating follow-up and visualization of process stages, allowing assessment of the contribution of each criterion separately, and in aggregate, to the decision's outcome, facilitating the discussion of diverging perspectives by different stakeholder groups, and increasing the understanding of the resulting recommendations. The use of an explicit MCDA approach should facilitate conflict mediation and optimize participation by different stakeholder groups.
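
    The bookkeeping of annotated evidential logic Eτ can be sketched as follows: each criterion carries a favorable-evidence degree μ and an unfavorable-evidence degree λ in [0,1], from which a certainty degree and a contradiction degree are derived (the evidence values and decision thresholds below are invented for illustration):

      # Paraconsistent annotated logic Etau bookkeeping: favorable evidence
      # mu and unfavorable evidence lam per criterion; certainty H = mu - lam,
      # contradiction G = mu + lam - 1 (values invented).
      def pooled_degrees(evidence):
          mu = sum(m for m, _ in evidence) / len(evidence)
          lam = sum(l for _, l in evidence) / len(evidence)
          return mu - lam, mu + lam - 1.0

      evidence = [(0.9, 0.1), (0.7, 0.5), (0.8, 0.2)]  # (mu, lam) per criterion
      H, G = pooled_degrees(evidence)
      verdict = 'recommend' if H >= 0.5 and abs(G) <= 0.25 else 'needs more study'
      print(f'certainty={H:.2f}, contradiction={G:.2f} -> {verdict}')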

  6. A feasibility investigation for modeling and optimization of temperature in bone drilling using fuzzy logic and Taguchi optimization methodology.

    PubMed

    Pandey, Rupesh Kumar; Panda, Sudhansu Sekhar

    2014-11-01

    Drilling of bone is a common procedure in orthopedic surgery to produce holes for screw insertion to fixate fracture devices and implants. The increase in temperature during such a procedure increases the chance of thermal invasion of the bone, which can cause thermal osteonecrosis, resulting in increased healing time or reduced stability and strength of the fixation. Therefore, drilling of bone with minimum temperature is a major challenge in orthopedic fracture treatment. This investigation discusses the use of fuzzy logic and Taguchi methodology for predicting and minimizing the temperature produced during bone drilling. The drilling experiments were conducted on bovine bone using Taguchi's L25 experimental design. A fuzzy model is developed for predicting the temperature during orthopedic drilling as a function of the drilling process parameters (point angle, helix angle, feed rate and cutting speed). Optimum bone drilling process parameters for minimizing the temperature are determined using the Taguchi method. The effect of the individual cutting parameters on the temperature produced is evaluated using analysis of variance. The fuzzy model, using triangular and trapezoidal membership functions, predicts the temperature within a maximum error of ±7%. Taguchi analysis of the obtained results determined the optimal drilling conditions for minimizing the temperature as A3B5C1. The developed system will simplify the tedious task of modeling and determining the optimal process parameters to minimize the bone drilling temperature. It will reduce the risk of thermal osteonecrosis and can be very effective for the online condition monitoring of the process. © IMechE 2014.
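    A minimal sketch of one building block of such a fuzzy model: a triangular membership function evaluating how strongly a temperature reading belongs to a hypothetical "high temperature" set. The breakpoints are invented and not taken from the paper.

      def tri_membership(x, a, b, c):
          """Triangular membership function: feet at a and c, peak at b."""
          if x <= a or x >= c:
              return 0.0
          if x <= b:
              return (x - a) / (b - a)
          return (c - x) / (c - b)

      # Hypothetical fuzzy set "high drilling temperature", peaking at 55 C.
      for temp in (40.0, 50.0, 55.0, 60.0):
          print(temp, "->", tri_membership(temp, 45.0, 55.0, 65.0))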

  7. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Discussed are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  8. Computational methodology to predict satellite system-level effects from impacts of untrackable space debris

    NASA Astrophysics Data System (ADS)

    Welty, N.; Rudolph, M.; Schäfer, F.; Apeldoorn, J.; Janovsky, R.

    2013-07-01

    This paper presents a computational methodology to predict the satellite system-level effects resulting from impacts of untrackable space debris particles. This approach seeks to improve on traditional risk assessment practices by looking beyond the structural penetration of the satellite and predicting the physical damage to internal components and the associated functional impairment caused by untrackable debris impacts. The proposed method combines a debris flux model with the Schäfer-Ryan-Lambert ballistic limit equation (BLE), which accounts for the inherent shielding of components positioned behind the spacecraft structure wall. Individual debris particle impact trajectories and component shadowing effects are considered and the failure probabilities of individual satellite components as a function of mission time are calculated. These results are correlated to expected functional impairment using a Boolean logic model of the system functional architecture considering the functional dependencies and redundancies within the system.
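    The final step described, mapping component failures to system-level functional impairment through a Boolean model of the functional architecture, can be sketched as follows; the two-component redundancy structure and the failure probabilities are hypothetical, not taken from the paper.

      import itertools

      # Hypothetical Boolean system model: the function survives if the power
      # unit works AND at least one of two redundant transceivers works.
      p_fail = {"power": 0.02, "tx_a": 0.10, "tx_b": 0.10}  # assumed values

      def system_works(up):
          return up["power"] and (up["tx_a"] or up["tx_b"])

      # Enumerate all component states, accumulating system failure probability
      # under the assumption of independent component failures.
      p_sys_fail = 0.0
      for states in itertools.product((True, False), repeat=len(p_fail)):
          up = dict(zip(p_fail, states))
          prob = 1.0
          for comp, works in up.items():
              prob *= (1 - p_fail[comp]) if works else p_fail[comp]
          if not system_works(up):
              p_sys_fail += prob

      print("system failure probability: %.4f" % p_sys_fail)  # ~0.0298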

  9. Electromagnetic interference of cardiac rhythmic monitoring devices to radio frequency identification: analytical analysis and mitigation methodology.

    PubMed

    Ogirala, Ajay; Stachel, Joshua R; Mickle, Marlin H

    2011-11-01

    The increasing density of wireless communication, and the development of radio frequency identification (RFID) technology in particular, have increased the susceptibility of patients equipped with cardiac rhythmic monitoring devices (CRMD) to environmental electromagnetic interference (EMI). Several organizations have reported observing CRMD EMI from different sources. This paper focuses on mathematically analyzing the energy as perceived by the implanted device, i.e., voltage. Radio frequency (RF) energy transmitted by RFID interrogators is considered as an example. A simplified front-end equivalent circuit of CRMD sensing circuitry is proposed for the analysis, following extensive black-box testing of several commercial pacemakers and implantable defibrillators. After careful study of the mechanics of CRMD signal processing in identifying the QRS complex of the heartbeat, a mitigation technique is proposed. The mitigation methodology introduced in this paper is logical in approach and simple to implement, and is therefore applicable to all wireless communication protocols.

  10. [Demonstrating patient safety requires acceptance of a broader scientific palette].

    PubMed

    Leistikow, I

    2017-01-01

    It is high time the medical community recognised that patient-safety research can be assessed using scientific methods other than the traditional medical ones. There is often a fundamental mismatch between the methodology of patient-safety research and the methodology used to assess the quality of this research. One example is research into the reliability and validity of record review as a method for detecting adverse events. This type of research is based on logical positivism, while record review itself is based on social constructivism. Record review does not lead to "one truth": adverse events are not measured on the basis of the records themselves, but by weighing the probability of certain situations being classifiable as adverse events. Healthcare should welcome the behavioural and social sciences to its scientific palette. Restricting ourselves to the randomised controlled trial paradigm is short-sighted and dangerous; it deprives patients of much-needed improvements in safety.

  11. [The diversity of science in Carnap's, Lewin's and Fleck's philosophy. The development of a pluralistic scientific concept].

    PubMed

    Köchy, Kristian

    2010-03-01

    In the 1920s and 1930s, three different but simultaneous approaches to the philosophy of science can be distinguished: the logical approach of the physicist Rudolf Carnap, the logico-historical approach of the psychologist Kurt Lewin, and the socio-historical approach of the medical scientist Ludwik Fleck. While the philosophies of Lewin and Fleck can be characterized as contextual appraisals which account for the interactions between particular sciences and their historical, socio-cultural or intellectual environments, Carnap's philosophy is narrowed to an internal methodology centered on scientific propositions and logical structures in general. In addition to these differences in the aim and practice of methodological analysis, the estimation of the real disunity and diversity of the special branches of science differs. In place of Carnap's ideal of a unified science, the new pluralistic point of view grants philosophical acceptance to the empirical multiplicity of particular sciences.

  12. Ultrafast all-optical arithmetic logic based on hydrogenated amorphous silicon microring resonators

    NASA Astrophysics Data System (ADS)

    Gostimirovic, Dusan; Ye, Winnie N.

    2016-03-01

    For decades, the semiconductor industry has been steadily shrinking transistor sizes to fit more performance into a single silicon-based integrated chip. This technology has become the driving force for advances in education, transportation, and health, among others. However, transistor sizes are quickly approaching their physical limits (channel lengths now span only a few silicon atoms), and Moore's law will likely soon be brought to a standstill despite many attempts to keep it going (FinFETs, high-k dielectrics, etc.). This technology must then be pushed further by exploring (almost) entirely new methodologies. Given the explosive growth of optics-based long-haul telecommunications, we look to apply high-speed optics as a substitute for the electronic digital model, in which slow, lossy, and noisy metal interconnections act as a major bottleneck to performance. We combine the (nonlinear) optical Kerr effect with a single add-drop microring resonator to perform the fundamental AND-XOR logical operations of a half adder, by all-optical means. This process is also applied to subtraction, higher-order addition, and the realization of an all-optical arithmetic logic unit (ALU). The rings use hydrogenated amorphous silicon, a material with nonlinear properties superior to those of crystalline silicon, while still maintaining CMOS compatibility and the many benefits that come with it (low cost, ease of fabrication, etc.). Our method allows for multi-gigabit-per-second data rates while maintaining simplicity and spatial minimalism in design for high-capacity manufacturing potential.
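    For reference, the half-adder behavior realized optically above reduces to two Boolean operations per bit pair; the sketch below is a plain software model of that logic (not of the photonics) and verifies it exhaustively.

      # Half adder: SUM = A XOR B, CARRY = A AND B.
      def half_adder(a, b):
          return a ^ b, a & b

      for a in (0, 1):
          for b in (0, 1):
              s, c = half_adder(a, b)
              print("A=%d B=%d -> SUM=%d CARRY=%d" % (a, b, s, c))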

  13. Neither logical empiricism nor vitalism, but organicism: what the philosophy of biology was.

    PubMed

    Nicholson, Daniel J; Gawne, Richard

    2015-12-01

    Philosophy of biology is often said to have emerged in the last third of the twentieth century. Prior to this time, it has been alleged that the only authors who engaged philosophically with the life sciences were either logical empiricists who sought to impose the explanatory ideals of the physical sciences onto biology, or vitalists who invoked mystical agencies in an attempt to ward off the threat of physicochemical reduction. These schools paid little attention to actual biological science, and as a result philosophy of biology languished in a state of futility for much of the twentieth century. The situation, we are told, only began to change in the late 1960s and early 1970s, when a new generation of researchers began to focus on problems internal to biology, leading to the consolidation of the discipline. In this paper we challenge this widely accepted narrative of the history of philosophy of biology. We do so by arguing that the most important tradition within early twentieth-century philosophy of biology was neither logical empiricism nor vitalism, but the organicist movement that flourished between the First and Second World Wars. We show that the organicist corpus is thematically and methodologically continuous with the contemporary literature in order to discredit the view that early work in the philosophy of biology was unproductive, and we emphasize the desirability of integrating the historical and contemporary conversations into a single, unified discourse.

  14. Decision support model for assessing aquifer pollution hazard and prioritizing groundwater resources management in the wet Pampa plain, Argentina.

    PubMed

    Lima, M Lourdes; Romanelli, Asunción; Massone, Héctor E

    2013-06-01

    This paper gives an account of the implementation of a decision support system for assessing aquifer pollution hazard and prioritizing subwatersheds for groundwater resources management in the southeastern Pampa plain of Argentina. The use of this system is demonstrated with an example from Dulce Stream Basin (1,000 km², encompassing 27 subwatersheds), which has high level of agricultural activities and extensive available data regarding aquifer geology. In the logic model, aquifer pollution hazard is assessed as a function of two primary topics: groundwater and soil conditions. This logic model shows the state of each evaluated landscape with respect to aquifer pollution hazard based mainly on the parameters of the DRASTIC and GOD models. The decision model allows prioritizing subwatersheds for groundwater resources management according to three main criteria including farming activities, agrochemical application, and irrigation use. Stakeholder participation, through interviews, in combination with expert judgment was used to select and weight each criterion. The resulting subwatershed priority map, by combining the logic and decision models, allowed identifying five subwatersheds in the upper and middle basin as the main aquifer protection areas. The results reasonably fit the natural conditions of the basin, identifying those subwatersheds with shallow water depth, loam-loam silt texture soil media and pasture land cover in the middle basin, and others with intensive agricultural activity, coinciding with the natural recharge area to the aquifer system. Major difficulties and some recommendations of applying this methodology in real-world situations are discussed.
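    For context, the DRASTIC index underpinning such logic models is an additive weighted score. The sketch below uses the commonly cited standard DRASTIC parameter weights, with the site ratings invented for illustration.

      # DRASTIC vulnerability index: sum of weight * rating over seven
      # parameters. Weights are the commonly cited standard values; the
      # ratings (1-10) for this hypothetical site are invented.
      weights = {
          "Depth_to_water": 5, "net_Recharge": 4, "Aquifer_media": 3,
          "Soil_media": 2, "Topography": 1, "Impact_vadose": 5,
          "Conductivity": 3,
      }
      ratings = {
          "Depth_to_water": 7, "net_Recharge": 6, "Aquifer_media": 8,
          "Soil_media": 6, "Topography": 9, "Impact_vadose": 8,
          "Conductivity": 4,
      }

      drastic_index = sum(weights[p] * ratings[p] for p in weights)
      print("DRASTIC index:", drastic_index)  # higher = greater hazard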

  15. Active control of flexible structures using a fuzzy logic algorithm

    NASA Astrophysics Data System (ADS)

    Cohen, Kelly; Weller, Tanchum; Ben-Asher, Joseph Z.

    2002-08-01

    This study deals with the development and application of an active control law for the vibration suppression of beam-like flexible structures experiencing transient disturbances. Collocated pairs of sensors/actuators provide active control of the structure. A design methodology for the closed-loop control algorithm based on fuzzy logic is proposed. First, the behavior of the open-loop system is observed. Then, the number and locations of collocated actuator/sensor pairs are selected. The proposed control law, which is based on the principles of passivity, commands the actuator to emulate the behavior of a dynamic vibration absorber. The absorber is tuned to a targeted frequency, whereas the damping coefficient of the dashpot is varied in a closed loop using a fuzzy logic based algorithm. This approach not only ensures inherent stability associated with passive absorbers, but also circumvents the phenomenon of modal spillover. The developed controller is applied to the AFWAL/FIB 10 bar truss. Simulated results using MATLAB© show that the closed-loop system exhibits fairly quick settling times and desirable performance, as well as robustness characteristics. To demonstrate the robustness of the control system to changes in the temporal dynamics of the flexible structure, the transient response to a considerably perturbed plant is simulated. The modal frequencies of the 10 bar truss were raised as well as lowered substantially, thereby significantly perturbing the natural frequencies of vibration. For these cases, too, the developed control law provides adequate settling times and rates of vibrational energy dissipation.

  16. Focused Belief Measures for Uncertainty Quantification in High Performance Semantic Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Weaver, Jesse R.

    In web-scale semantic data analytics there is a great need for methods which aggregate uncertainty claims, on the one hand respecting the information provided as accurately as possible, while on the other still being tractable. Traditional statistical methods are more robust, but only represent distributional, additive uncertainty. Generalized information theory methods, including fuzzy systems and Dempster-Shafer (DS) evidence theory, represent multiple forms of uncertainty, but are computationally and methodologically difficult. We require methods which provide an effective balance between the complete representation of the full complexity of uncertainty claims in their interaction, while satisfying the needs of both computational complexity and human cognition. Here we build on Jøsang's subjective logic to posit methods in focused belief measures (FBMs), where a full DS structure is focused to a single event. The resulting ternary logical structure is posited to be able to capture the minimal amount of generalized complexity needed at a maximum of computational efficiency. We demonstrate the efficacy of this approach in a web ingest experiment over the 2012 Billion Triple dataset from the Semantic Web Challenge.
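    For orientation, plain Dempster-Shafer combination, whose cost motivates focusing belief to a single event, can be sketched on a tiny frame of discernment; the mass assignments below are invented, and this is ordinary Dempster's rule rather than the paper's focused belief measures.

      from itertools import product

      # Frame of discernment {a, b}; focal elements are frozensets and the
      # masses of each source sum to 1.
      m1 = {frozenset("a"): 0.6, frozenset("ab"): 0.4}
      m2 = {frozenset("b"): 0.5, frozenset("ab"): 0.5}

      def dempster_combine(m1, m2):
          """Dempster's rule: multiply masses of intersecting focal
          elements, then renormalize by (1 - conflict)."""
          combined, conflict = {}, 0.0
          for (x, mx), (y, my) in product(m1.items(), m2.items()):
              inter = x & y
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + mx * my
              else:
                  conflict += mx * my
          return {s: v / (1.0 - conflict) for s, v in combined.items()}

      # Expected: {a}: ~0.43, {b}: ~0.29, {a,b}: ~0.29 (conflict 0.3).
      print(dempster_combine(m1, m2))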

  17. Fluid Intelligence and Psychosocial Outcome: From Logical Problem Solving to Social Adaptation

    PubMed Central

    Huepe, David; Roca, María; Salas, Natalia; Canales-Johnson, Andrés; Rivera-Rei, Álvaro A.; Zamorano, Leandro; Concepción, Aimée; Manes, Facundo; Ibañez, Agustín

    2011-01-01

    Background While fluid intelligence has proved to be central to executive functioning, logical reasoning and other frontal functions, the role of this ability in psychosocial adaptation has not been well characterized. Methodology/Principal Findings A random-probabilistic sample of 2370 secondary school students completed measures of fluid intelligence (Raven's Progressive Matrices, RPM) and several measures of psychological adaptation: bullying (Delaware Bullying Questionnaire), domestic abuse of adolescents (Conflict Tactic Scale), drug intake (ONUDD), self-esteem (Rosenberg's Self Esteem Scale) and the Perceived Mental Health Scale (Spanish adaptation). Lower fluid intelligence scores were associated with physical violence, both in the role of victim and victimizer. Drug intake, especially cannabis, cocaine and inhalants and lower self-esteem were also associated with lower fluid intelligence. Finally, scores on the perceived mental health assessment were better when fluid intelligence scores were higher. Conclusions/Significance Our results show evidence of a strong association between psychosocial adaptation and fluid intelligence, suggesting that the latter is not only central to executive functioning but also forms part of a more general capacity for adaptation to social contexts. PMID:21957464

  18. On the Correct Analysis of the Foundations of Theoretical Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2007-04-01

    The problem of truth in science -- the most urgent problem of our time -- is discussed. A correct theoretical analysis of the foundations of theoretical physics is proposed. The principle of the unity of formal logic and rational dialectics is the methodological basis of the analysis. The main result is as follows: the generally accepted foundations of theoretical physics (i.e. Newtonian mechanics, Maxwell electrodynamics, thermodynamics, statistical physics and physical kinetics, the theory of relativity, quantum mechanics) contain a set of logical errors. These errors are explained by the existence of a global cause: the errors are a collateral and inevitable result of the inductive way of cognizing Nature, i.e. a result of movement from the formation of separate concepts to the formation of a system of concepts. Consequently, theoretical physics has entered its greatest crisis. It means that physics as a science of phenomena is giving way to a science of essence (information). Acknowledgment: The books ``Surprises in Theoretical Physics'' (1979) and ``More Surprises in Theoretical Physics'' (1991) by Sir Rudolf Peierls stimulated my 25-year work.

  19. Breast tumor malignancy modelling using evolutionary neural logic networks.

    PubMed

    Tsakonas, Athanasios; Dounias, Georgios; Panagi, Georgia; Panourgias, Evangelia

    2006-01-01

    The present work proposes a computer-assisted methodology for the effective modelling of the diagnostic decision for breast tumor malignancy. The suggested approach is based on innovative hybrid computational intelligence algorithms applied to related cytological data contained in past medical records. The experimental data used in this study were gathered in the early 1990s at the University of Wisconsin, based on post-diagnostic cytological observations performed by expert medical staff. Data were properly encoded in a computer database and, accordingly, various alternative modelling techniques were applied to them in an attempt to form diagnostic models. Previous methods included standard optimisation techniques as well as artificial intelligence approaches, and a variety of related publications exists in the modern literature on the subject. In this report, a hybrid computational intelligence approach is suggested, which combines modern mathematical logic principles, neural computation and genetic programming in an effective manner. The approach proves promising both in terms of diagnostic accuracy and generalization capabilities, and in terms of comprehensibility and practical importance for the related medical staff.

  20. OPC model generation procedure for different reticle vendors

    NASA Astrophysics Data System (ADS)

    Jost, Andrew M.; Belova, Nadya; Callan, Neal P.

    2003-12-01

    The challenge of delivering acceptable semiconductor products to customers in a timely fashion becomes more difficult as design complexity increases. The requirements of current-generation designs tax OPC engineers more than ever before, since the readiness of high-quality OPC models can delay new process qualifications or lead to respins, which add to the upward-spiraling costs of new reticle sets, extend time-to-market, and disappoint customers. In their efforts to extend the printability of new designs, OPC engineers generally focus on the data-to-wafer path, ignoring data-to-mask effects almost entirely. However, it is unknown whether reticle makers' disparate processes truly yield comparable reticles, even with identical tools. This raises the question of whether a single OPC model is applicable to all reticle vendors. LSI Logic has developed a methodology for quantifying vendor-to-vendor reticle manufacturing differences and adapting OPC models for use at several reticle vendors. This approach allows LSI Logic to easily adapt existing OPC models for use with several reticle vendors and obviates the generation of unnecessary models, allowing OPC engineers to focus their efforts on the most critical layers.

  1. Identification of multiple intelligences for high school students in theoretical and applied science courses

    NASA Astrophysics Data System (ADS)

    Wiseman, D. Kim

    Historically, educators in the United States have used the Stanford-Binet intelligence test to measure a student's ability in the logical/mathematical and linguistic domains. This measurement is being used by a society that has evolved from agrarian and industrial-based economies to what is presently labeled a technological society. As society has changed, so have the educational needs of the students who will live in this technological society. This study assessed the multiple intelligences of high school students enrolled in theoretical and applied science (physics and applied physics) courses. Studies have verified that the performance and outcomes of students enrolled in these courses are similar in standardized testing, but the instructional methodology and processes are dissimilar. Analysis of multiple intelligence profiles collected in this study found significant differences in the logical/mathematical, bodily/kinesthetic and intrapersonal multiple intelligences of students in theoretical science courses compared to students in applied science courses. Those differences clearly illustrate why it is imperative for educators to expand the definition of intelligence for students entering the new millennium.

  2. Emergency Preparedness technology support to the Health and Safety Executive (HSE), Nuclear Installations Inspectorate (NII) of the United Kingdom. Appendix A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.

    1994-03-01

    The Nuclear Installations Inspectorate (NII) of the United Kingdom (UK) suggested the use of an accident progression logic model method developed by Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) for K Reactor to predict the magnitude and timing of radioactivity releases (the source term) based on an advanced logic model methodology. Predicted releases are output from the personal computer-based model in a level-of-confidence format. Additional technical discussions eventually led to a request from the NII to develop a proposal for assembling a similar technology to predict source terms for the UK's advanced gas-cooled reactor (AGR) type. To respond to this request, WSRC is submitting a proposal to provide contractual assistance as specified in the Scope of Work. The work will produce, document, and transfer technology associated with a Decision-Oriented Source Term Estimator for Emergency Preparedness (DOSE-EP) for the NII to apply to AGRs in the United Kingdom. This document, Appendix A, is a part of this proposal.

  3. Problemes theoriques et methodologiques dans l'etude des langues/dialectes en contact aux niveaux macrologique et micrologique = Theoretical and Methodological Issues in the Study of Languages/Dialects in Contact at Macro- and Micro-Logical Levels of Analysis. Proceedings of the International Conference DALE (University of London)/ICRB (Laval University, Quebec)/ICSBT (Vrije Universiteit te Brussel) (London, England, May 23-26, 1985).

    ERIC Educational Resources Information Center

    Blanc, Michel, Ed.; Hamers, Josiane F., Ed.

    Papers from an international conference on the interaction of languages and dialects in contact are presented in this volume. Papers include: "Quelques reflexions sur la variation linguistique"; "The Investigation of 'Language Continuum' and 'Diglossia': A Macrological Case Study and Theoretical Model"; "A Survey of…

  4. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Appendix B: ROBSIM programmer's guide

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.

    1986-01-01

    The purpose of the Robotic Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. The programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With the manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.

  5. An Analysis of Software Design Methodologies

    DTIC Science & Technology

    1979-08-01

    (OCR-garbled excerpt; recoverable text follows.) ... structure charts annotated with "and" and "or" symbols, and with explicit indications of iteration. For example, Figure 5a (from Bell et al, 1977) ... contains a structure chart with logical "and" ("&") and "or" ("+") symbols. Figure 5b illustrates Jackson's (1977) approach, in which asterisks ("*") are ... is suggested. Such a data flow graph is illustrated in Figure 14. In this case, "T" is the TRANSACTION CENTER and the "(D" symbol is used to indicate ...

  6. ART/Ada design project, phase 1. Task 2 report: Detailed design

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    Various issues are studied in the context of the design of an Ada-based expert system building tool. Using an existing successful design as a starting point, the impact of the Ada language and Ada development methodologies on that design is analyzed, the system is redesigned in Ada, and its performance is analyzed using both complexity-theoretic and empirical techniques. The algorithms specified in the overall design are refined, resolving and documenting any open design issues, identifying each system module, documenting the internal architecture and control logic, and describing the primary data structures involved in the module.

  7. Insights into the concept of fish welfare.

    PubMed

    Volpato, Gilson Luiz; Gonçalves-de-Freitas, Eliane; Fernandes-de-Castilho, Marisa

    2007-05-04

    Fish welfare issues are predicated on understanding whether fish are sentient beings. Therefore, we analyzed the logic of the methodologies used for studying this attribute. We conclude that empirical science is unable to prove or to disprove that fish are sentient beings. Thus, we propose a combined ethical-scientific approach for considering fish as sentient beings. The most difficult ongoing question is to determine which conditions fish prefer. Approaches to assess fish preferences should be rigorously and cautiously employed. In light of these considerations, attempts to establish physiological standards for fish welfare are discouraged, and a preference-based definition of fish welfare is proposed.

  8. Microprocessor design for GaAs technology

    NASA Astrophysics Data System (ADS)

    Milutinovic, Veljko M.

    Recent advances in the design of GaAs microprocessor chips are examined in chapters contributed by leading experts; the work is intended as reading material for a graduate engineering course or as a practical R&D reference. Topics addressed include the methodology used for the architecture, organization, and design of GaAs processors; GaAs device physics and circuit design; design concepts for microprocessor-based GaAs systems; a 32-bit GaAs microprocessor; a 32-bit processor implemented in GaAs JFET; and a direct coupled-FET-logic E/D-MESFET experimental RISC machine. Drawings, micrographs, and extensive circuit diagrams are provided.

  9. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Volume 1: Study results

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.

  10. Strategies and arguments of ergonomic design for sustainability.

    PubMed

    Marano, Antonio; Di Bucchianico, Giuseppe; Rossi, Emilio

    2012-01-01

    Referring to the discussion recently promoted by Sub-Technical Committee n°4, "Ergonomics and design for sustainability", this paper presents the early results of a theoretical and methodological study on Ergonomic design for sustainability. In particular, the research is based on a comparison of the common thematic structure characterizing Ergonomics with the principles of Sustainable Development and with criteria adopted from other disciplines already oriented toward Sustainability. The paper identifies an early logical-interpretative model and describes possible and relevant Strategies of Ergonomic design for sustainability, which are connected in a series of specific Sustainable Arguments.

  11. A Methodology for Producing and Testing a Genesil Silicon Compiler Designed VLSI Chip Which Incorporates Design for Testability

    DTIC Science & Technology

    1990-09-01

    (OCR-garbled report-documentation page; recoverable text follows.) ... position of the Department of Defense or the US Government. ... logic for which it was designed. Finally, the circuit should retain correct functionality over time by having stable operating characteristics. ...

  12. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, appendix B

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.

    1984-01-01

    The purpose of the Robotics Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. This programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With this manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.

  13. Statechart-based design controllers for FPGA partial reconfiguration

    NASA Astrophysics Data System (ADS)

    Łabiak, Grzegorz; Wegrzyn, Marek; Rosado Muñoz, Alfredo

    2015-09-01

    Statechart diagrams and the UML technique can be a vital part of early conceptual modeling. At present, there is little support in hardware design methodologies for the reconfiguration features of reprogrammable devices. The authors try to bridge the gap between the imprecise UML model and a formal HDL description. The key concept in the authors' proposal is to describe the behavior of the digital controller by statechart diagrams and to map some parts of the behavior into reprogrammable logic by means of groups of states which form a sequential automaton. The whole process is illustrated by an example with experimental results.

  14. An experiment-based comparative study of fuzzy logic control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Chen, Yung-Yaw; Lee, Chuen-Chein; Murugesan, S.; Jang, Jyh-Shing

    1989-01-01

    An approach is presented to the control of a dynamic physical system through the use of approximate reasoning. The approach has been implemented in a program named POLE, and the authors have successfully built a prototype hardware system to solve the cartpole balancing problem in real-time. The approach provides a complementary alternative to the conventional analytical control methodology and is of substantial use when a precise mathematical model of the process being controlled is not available. A set of criteria for comparing controllers based on approximate reasoning and those based on conventional control schemes is furnished.

  15. RDTC [Restricted Data Transmission Controller] global variable definitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grambihler, A.J.; O'Callaghan, P.B.

    The purpose of the Restricted Data Transmission Controller (RDTC) is to demonstrate a methodology for transmitting data between computers which have different levels of classification. The RDTC does this by logically filtering the data being transmitted between the two computers. This prototype is set up to filter data from the classified computer so that only numeric data is passed to the unclassified computer. The RDTC allows all data from the unclassified computer to be sent to the classified computer. The classified system is referred to as LUA and the unclassified system is referred to as LUB. 9 tabs.
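    A minimal sketch of the kind of one-way filtering described, passing only numeric fields from the classified side (LUA) to the unclassified side (LUB); the record format is invented for illustration, and a real guard would be far more rigorous.

      def filter_outbound(fields):
          """Pass only fields that parse as numbers; drop everything else.
          Illustrative only -- a real guard would be far more rigorous."""
          allowed = []
          for field in fields:
              try:
                  float(field)          # numeric data may pass LUA -> LUB
                  allowed.append(field)
              except ValueError:
                  pass                  # non-numeric data is blocked
          return allowed

      # Hypothetical record from the classified side (LUA).
      record = ["42", "3.14", "SECRET-NOTE", "-7"]
      print(filter_outbound(record))    # ['42', '3.14', '-7']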

  16. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.

    PubMed

    d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe

    2013-01-01

    The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. We carefully refine and formalize our methodology, which includes six stages: the first three stages work with crisp rules, whereas the last three operate on fuzzy models. Its strength lies in its generality and modularity, since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is described in depth, and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture was then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed.
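    To illustrate the crisp-to-fuzzy transition at the heart of the six stages, compare a crisp diagnostic rule with a fuzzified counterpart; the rule, threshold, and breakpoints here are invented, not taken from the paper.

      # Crisp rule: flag a mass as suspicious if radius > 15.0 (hard cutoff).
      def crisp_suspicious(radius):
          return radius > 15.0

      # Fuzzified counterpart: degree of membership in "large radius" rises
      # linearly between 12.0 and 18.0 instead of jumping at one cutoff.
      def fuzzy_suspicious(radius):
          if radius <= 12.0:
              return 0.0
          if radius >= 18.0:
              return 1.0
          return (radius - 12.0) / 6.0

      for r in (11.0, 14.0, 16.0, 19.0):
          print(r, crisp_suspicious(r), round(fuzzy_suspicious(r), 2))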

  17. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications

    PubMed Central

    2013-01-01

    Background The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Methods We carefully refine and formalize our methodology, which includes six stages: the first three stages work with crisp rules, whereas the last three operate on fuzzy models. Its strength lies in its generality and modularity, since it supports the integration of alternative techniques in each of its stages. Results The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is described in depth, and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture was then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. Conclusions The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed. PMID:23368970

  18. A Design Methodology for Medical Processes.

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance to treatment. Also, the multiple actors involved in the patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful for creating high-quality models and for detecting and taking into account relevant and tricky situations that can occur during process execution.

  19. A hybrid neural networks-fuzzy logic-genetic algorithm for grade estimation

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman; Hezarkhani, Ardeshir

    2012-05-01

    Grade estimation is an important and money/time-consuming stage in a mine project, and it is considered a challenge for geologists and mining engineers due to the structural complexities of mineral ore deposits. To overcome this problem, several artificial intelligence techniques such as Artificial Neural Networks (ANN) and Fuzzy Logic (FL) have recently been employed with various architectures and properties. However, due to the constraints of both methods, they yield the desired results only under specific circumstances. As an example, one major problem in FL is the difficulty of constructing the membership functions (MFs). Other problems, such as network architecture and local minima, arise in ANN design. Therefore, a new methodology is presented in this paper for grade estimation. This method, based on ANN and FL, is called "Coactive Neuro-Fuzzy Inference System" (CANFIS), which combines the two approaches. The combination of these two artificial intelligence approaches is achieved via the verbal and numerical power of intelligent systems. To improve the performance of this system, a Genetic Algorithm (GA) - a well-known technique for solving complex optimization problems - is also employed to optimize the network parameters, including the learning rate, the momentum of the network, and the number of MFs for each input. A comparison of these techniques (ANN, Adaptive Neuro-Fuzzy Inference System or ANFIS) with the new method (CANFIS-GA) is also carried out through a case study of the Sungun copper deposit, located in East Azerbaijan, Iran. The results show that CANFIS-GA can be a faster and more accurate alternative to the existing time-consuming methodologies for ore grade estimation, and it is therefore suggested for grade estimation in similar problems.
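    A toy sketch of the GA layer described above, evolving the three named parameters; the fitness function is a stand-in (the real system would train and score a CANFIS network), and all parameter ranges are invented.

      import random

      random.seed(0)

      # Each individual: (learning_rate, momentum, n_membership_functions).
      def random_individual():
          return (random.uniform(0.001, 0.5), random.uniform(0.0, 0.9),
                  random.randint(2, 7))

      def fitness(ind):
          """Stand-in for 'train CANFIS, return negative estimation error';
          a made-up smooth function so the sketch runs on its own."""
          lr, mom, n_mf = ind
          return -((lr - 0.1) ** 2 + (mom - 0.5) ** 2 + (n_mf - 4) ** 2 * 0.01)

      def mutate(ind):
          lr, mom, n_mf = ind
          return (min(max(lr + random.gauss(0, 0.02), 0.001), 0.5),
                  min(max(mom + random.gauss(0, 0.05), 0.0), 0.9),
                  min(max(n_mf + random.choice((-1, 0, 1)), 2), 7))

      pop = [random_individual() for _ in range(20)]
      for _ in range(30):                      # generations
          pop.sort(key=fitness, reverse=True)
          parents = pop[:10]                   # truncation selection
          pop = parents + [mutate(random.choice(parents)) for _ in range(10)]

      print("best parameters:", max(pop, key=fitness))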

  20. Realistic nurse-led policy implementation, optimization and evaluation: novel methodological exemplar.

    PubMed

    Noyes, Jane; Lewis, Mary; Bennett, Virginia; Widdas, David; Brombley, Karen

    2014-01-01

    To report the first large-scale realistic nurse-led implementation, optimization and evaluation of a complex children's continuing-care policy. Health policies are increasingly complex, involve multiple Government departments and frequently fail to translate into better patient outcomes. Realist methods have not yet been adapted for policy implementation. Research methodology - Evaluation using theory-based realist methods for policy implementation. An expert group developed the policy and supporting tools. Implementation and evaluation design integrated diffusion of innovation theory with multiple case study and adapted realist principles. Practitioners in 12 English sites worked with Consultant Nurse implementers to manipulate the programme theory and logic of new decision-support tools and care pathway to optimize local implementation. Methods included key-stakeholder interviews, developing practical diffusion of innovation processes using key-opinion leaders and active facilitation strategies and a mini-community of practice. New and existing processes and outcomes were compared for 137 children during 2007-2008. Realist principles were successfully adapted to a shorter policy implementation and evaluation time frame. Important new implementation success factors included facilitated implementation that enabled 'real-time' manipulation of programme logic and local context to best-fit evolving theories of what worked; using local experiential opinion to change supporting tools to more realistically align with local context and what worked; and having sufficient existing local infrastructure to support implementation. Ten mechanisms explained implementation success and differences in outcomes between new and existing processes. Realistic policy implementation methods have advantages over top-down approaches, especially where clinical expertise is low and unlikely to diffuse innovations 'naturally' without facilitated implementation and local optimization. © 2013 John Wiley & Sons Ltd.

  1. Conceptualising Animal Abuse with an Antisocial Behaviour Framework

    PubMed Central

    Gullone, Eleonora

    2011-01-01

    Simple Summary There is increasing acceptance of the links between animal abuse and aggressive or antisocial behaviours toward humans. Nevertheless, researchers and other professionals continue to call for methodologically sound empirical research amid claims that current animal abuse research is methodologically limited. Below, I argue that current conceptualizations of antisocial and aggressive human behaviour logically incorporate animal abuse. Given that the body of empirical evidence available in support of theories of antisocial and aggressive behaviour is large and sound, conceptualizing animal abuse as an aggressive behaviour, rather than as a behaviour that is somehow different, enables us to confidently put current understanding into practice. Abstract This paper reviews current findings in the human aggression and antisocial behaviour literature and those in the animal abuse literature with the aim of highlighting the overlap in conceptualisation. The major aim of this review is to highlight that the co-occurrence between animal abuse behaviours and aggression and violence toward humans can be logically understood through examination of the research evidence for antisocial and aggressive behaviour. Examined through this framework, it is not at all surprising that the two co-occur. Indeed, it would be surprising if they did not. Animal abuse is one expression of antisocial behaviour. What is also known from the extensive antisocial behaviour literature is that antisocial behaviours co-occur, such that the presence of one form of antisocial behaviour is highly predictive of the presence of other antisocial behaviours. From such a framework, it becomes evident that animal abuse should be considered an important indicator of antisocial behaviour and violence, as are other aggressive and antisocial behaviours. The implications of such a stance are that law enforcement, health and other professionals should not minimize the presence of animal abuse in their law enforcement, prevention, and treatment decisions. PMID:26486220

  2. A hybrid neural networks-fuzzy logic-genetic algorithm for grade estimation

    PubMed Central

    Tahmasebi, Pejman; Hezarkhani, Ardeshir

    2012-01-01

    Grade estimation is an important and money/time-consuming stage in a mine project, and it is considered a challenge for geologists and mining engineers due to the structural complexities of mineral ore deposits. To overcome this problem, several artificial intelligence techniques such as Artificial Neural Networks (ANN) and Fuzzy Logic (FL) have recently been employed with various architectures and properties. However, due to the constraints of both methods, they yield the desired results only under specific circumstances. As an example, one major problem in FL is the difficulty of constructing the membership functions (MFs). Other problems, such as network architecture and local minima, arise in ANN design. Therefore, a new methodology is presented in this paper for grade estimation. This method, based on ANN and FL, is called “Coactive Neuro-Fuzzy Inference System” (CANFIS), which combines the two approaches. The combination of these two artificial intelligence approaches is achieved via the verbal and numerical power of intelligent systems. To improve the performance of this system, a Genetic Algorithm (GA) – a well-known technique for solving complex optimization problems – is also employed to optimize the network parameters, including the learning rate, the momentum of the network, and the number of MFs for each input. A comparison of these techniques (ANN, Adaptive Neuro-Fuzzy Inference System or ANFIS) with the new method (CANFIS–GA) is also carried out through a case study of the Sungun copper deposit, located in East Azerbaijan, Iran. The results show that CANFIS–GA can be a faster and more accurate alternative to the existing time-consuming methodologies for ore grade estimation, and it is therefore suggested for grade estimation in similar problems. PMID:25540468

  3. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
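    A drastically simplified sketch of the discrete side of such a verification: given a finite abstraction of the mode logic and a set of unsafe states, check that no unsafe state is reachable from the initial state. The transition system below is invented, and the real methodology also accounts for the underlying continuous dynamics.

      from collections import deque

      # Hypothetical discrete abstraction of an autopilot's mode logic.
      transitions = {
          "approach": ["flare", "go_around"],
          "flare": ["touchdown", "go_around"],
          "go_around": ["approach"],
          "touchdown": [],
          "unsafe_stall": [],            # should never be reachable
      }

      def reachable(start):
          """Breadth-first search over the discrete transition system."""
          seen, queue = {start}, deque([start])
          while queue:
              state = queue.popleft()
              for nxt in transitions[state]:
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return seen

      unsafe = {"unsafe_stall"}
      print("safe:", not (reachable("approach") & unsafe))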

  4. Eligibility criteria in systematic reviews published in prominent medical journals: a methodological review.

    PubMed

    McCrae, Niall; Purssell, Edward

    2015-12-01

    Clear and logical eligibility criteria are fundamental to the design and conduct of a systematic review. This methodological review examined the quality of reporting and application of eligibility criteria in systematic reviews published in three leading medical journals. All systematic reviews in the BMJ, JAMA and The Lancet in the years 2013 and 2014 were extracted. These were assessed using a refined version of a checklist previously designed by the authors. A total of 113 papers were eligible, of which 65 were in BMJ, 17 in The Lancet and 31 in JAMA. Although a generally high level of reporting was found, eligibility criteria were often problematic. In 67% of papers, eligibility was specified after the search sources or terms. Unjustified time restrictions were used in 21% of reviews, and unpublished or unspecified data in 27%. Inconsistency between journals was apparent in the requirements for systematic reviews. The quality of reviews in these leading medical journals was high; however, there were issues that reduce the clarity and replicability of the review process. As well as providing a useful checklist, this methodological review informs the continued development of standards for systematic reviews. © 2015 John Wiley & Sons, Ltd.

  5. Mapping the petroleum system - An investigative technique to explore the hydrocarbon fluid system

    USGS Publications Warehouse

    Magoon, L.B.; Dow, W.G.

    2000-01-01

    Creating a petroleum system map involves a series of logical steps that require specific information to explain the origin in time and space of discovered hydrocarbon occurrences. If used creatively, this map provides a basis on which to develop complementary plays and prospects. The logical steps include the characterization of a petroleum system (that is, to identify, map, and name the hydrocarbon fluid system) and the summary of these results on a folio sheet. A petroleum system map is based on the understanding that there are several levels of certainty, from "guessing" to "knowing", that specific oil and gas accumulations emanated from a particular pod of active source rock. The levels of certainty start with the close geographic proximity of two or more accumulations, continue with close stratigraphic proximity, followed by similarities in bulk properties and then in detailed geochemical properties. The highest level of certainty includes the positive geochemical correlation of the hydrocarbon fluid in the accumulations to the extract of the active source rock. A petroleum system map is created when the following logic is implemented. Implementation starts when the oil and gas accumulations of a petroleum province are grouped stratigraphically and geographically. Bulk and geochemical properties are used to further refine the groups through the determination of genetically related oil and gas types. To this basic map, surface seeps and well shows are added. Similarly, the active source rock responsible for these hydrocarbon occurrences is mapped to further define the extent of the system. A folio sheet constructed for a hypothetical case study of the Deer-Boar(.) petroleum system illustrates this methodology.

  6. Target Control in Logical Models Using the Domain of Influence of Nodes.

    PubMed

    Yang, Gang; Gómez Tejeda Zañudo, Jorge; Albert, Réka

    2018-01-01

    Dynamical models of biomolecular networks are successfully used to understand the mechanisms underlying complex diseases and to design therapeutic strategies. Network control, and its special case of target control, is a promising avenue toward developing disease therapies. In target control it is assumed that a small subset of nodes is most relevant to the system's state, and the goal is to drive the target nodes into their desired states. An example of target control would be driving a cell to commit to apoptosis (programmed cell death). From the experimental perspective, gene knockout, pharmacological inhibition of proteins, and providing sustained external signals are among the practical intervention techniques. We identify methodologies to use the stabilizing effect of sustained interventions for target control in Boolean network models of biomolecular networks. Specifically, we define the domain of influence (DOI) of a node (in a certain state) to be the nodes (and their corresponding states) that will ultimately be stabilized by the sustained state of this node, regardless of the initial state of the system. We also define the related concept of the logical domain of influence (LDOI) of a node, and develop an algorithm for its identification using an auxiliary network that incorporates the regulatory logic. In this way, a solution to the target control problem is a set of nodes whose DOI covers the desired target node states. We perform a greedy randomized adaptive search in node state space to find such solutions. We apply our strategy to in silico biological network models of real systems to demonstrate its effectiveness.
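    The LDOI idea can be made concrete on a toy example: fix one node's state in a small Boolean network and propagate through the update logic, collecting every node state that is thereby forced. The three-node network below is invented, and the propagation is a naive simplification of the paper's auxiliary-network algorithm.

      def t_and(x, y):
          """Three-valued AND: False dominates; None means undetermined."""
          if x is False or y is False:
              return False
          if x is True and y is True:
              return True
          return None

      # Hypothetical three-node Boolean network: B copies A, C = A AND B.
      rules = {
          "A": lambda s: s["A"],            # A is held fixed by intervention
          "B": lambda s: s["A"],
          "C": lambda s: t_and(s["A"], s["B"]),
      }

      def logical_domain_of_influence(node, value):
          """Naive LDOI: iterate the rules, keeping only node states that
          are forced by fixing node=value, whatever the rest may be."""
          state = {n: None for n in rules}
          state[node] = value
          changed = True
          while changed:
              changed = False
              for n, rule in rules.items():
                  if state[n] is None:
                      forced = rule(state)
                      if forced is not None:
                          state[n] = forced
                          changed = True
          return {n: v for n, v in state.items() if v is not None}

      # Fixing A=False forces C=False even before B is known (False
      # dominates AND): {'A': False, 'B': False, 'C': False}.
      print(logical_domain_of_influence("A", False))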

  7. Design and implementation of projects with Xilinx Zynq FPGA: a practical case

    NASA Astrophysics Data System (ADS)

    Travaglini, R.; D'Antone, I.; Meneghini, S.; Rignanese, L.; Zuffa, M.

    The main advantage of using FPGAs with embedded processors is the availability of several additional high-performance resources in the same physical device. Moreover, FPGA programmability allows designers to connect custom peripherals. Xilinx has designed a programmable device named Zynq-7000 (simply called Zynq in the following), which integrates programmable logic (identical to that of the other Xilinx 7-series devices) with a System on Chip (SOC) based on two embedded ARM processors. Since the two parts are tightly connected, designers benefit from both the performance of the hardware SOC and the flexibility of the programmable logic. In this paper, a design developed by the Electronic Design Department at the Bologna Division of INFN is presented as a practical case of a project based on the Zynq device. It was developed using a commercial board called ZedBoard, hosting an FMC mezzanine with a 12-bit 500 MS/s ADC. The Zynq FPGA on the ZedBoard receives digital outputs from the ADC and, after proper formatting, sends them to the acquisition PC through a Gigabit Ethernet link. The major focus of the paper is the methodology for developing a Zynq-based design with the Xilinx Vivado software, showing how to configure the SOC and connect it to the programmable logic. Firmware design techniques are presented; in particular, both VHDL and IP-core-based strategies are discussed. Further, the procedure for developing software for the embedded processor is presented. Finally, some debugging tools, like the embedded logic analyzer, are shown. Advantages and disadvantages with respect to FPGAs without embedded processors are discussed.

  8. A prototype computerized synthesis methodology for generic space access vehicle (SAV) conceptual design

    NASA Astrophysics Data System (ADS)

    Huang, Xiao

    2006-04-01

    Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of this research investigation is to provide decision-making bodies and the practicing engineer with a design process and toolbox for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of an SAV, for the first time, to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required for a generic (configuration-independent) hands-on flight vehicle conceptual design synthesis methodology. The process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology and algorithms for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed, and documented, with focus on design transparency, physical understanding, and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated, and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook on how the methodology will be integrated into the prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.

  9. Minimum energy dissipation required for a logically irreversible operation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Naoki; Yoshikawa, Nobuyuki

    2018-01-01

    According to Landauer's principle, the minimum heat emission required for computing is linked to logical entropy, or logical reversibility. The validity of Landauer's principle has been investigated for several decades and was finally demonstrated in recent experiments showing that the minimum heat emission is associated with the reduction in logical entropy during a logically irreversible operation. Although the relationship between minimum heat emission and logical reversibility is being revealed, it is not clear how much free energy must be dissipated during a logically irreversible operation. In the present study, in order to reveal the connection between logical reversibility and free energy dissipation, we numerically demonstrated logically irreversible protocols using adiabatic superconductor logic. Calculations of the work performed during the protocol showed that, while the minimum heat emission conforms to Landauer's principle, the free energy dissipation can be arbitrarily reduced by performing the protocol quasistatically. These results show that logical reversibility is not equivalent to thermodynamic reversibility, and that heat is not only emitted from logic devices but can also be absorbed by them. We also formulated the heat emission from adiabatic superconductor logic during a logically irreversible operation at a finite operation speed.
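
    For reference, Landauer's bound can be written explicitly. The numerical value below assumes operation at T = 4.2 K, a liquid-helium temperature typical for superconductor logic but not stated in the record:

    ```latex
    % Minimum heat emitted when one bit is erased
    % (logical entropy reduced by k_B ln 2):
    \[
      Q_{\min} = k_{\mathrm{B}} T \ln 2
               \approx (1.38 \times 10^{-23}\,\mathrm{J/K})(4.2\,\mathrm{K})(0.693)
               \approx 4.0 \times 10^{-23}\,\mathrm{J}
    \]
    ```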

  10. Modelling of Robotized Manufacturing Systems Using MultiAgent Formalism

    NASA Astrophysics Data System (ADS)

    Foit, K.; Gwiazda, A.; Banaś, W.

    2016-08-01

    The evolution of manufacturing systems has greatly accelerated due to the development of sophisticated control systems. On top of a deterministic, one-way production flow, the need for decision making has arisen as a result of the growing range of products manufactured simultaneously using the same resources. On the other hand, intelligent flow control could address the “bottleneck” problem caused by machine failure. This sort of manufacturing system uses advanced control algorithms that are introduced by the use of logic controllers. The complex algorithms used in the control systems require appropriate methods during the modelling process, like the agent-based one, which is the subject of this paper. The concept of an agent is derived from the object-based modelling methodology, so it meets the requirements of representing the physical properties of the machines as well as the logical form of control systems. Each agent has a high level of autonomy and can be considered separately. A multi-agent system consists of at least two agents that can interact and modify the environment in which they act. This may lead to the creation of a self-organizing structure, which could be an interesting feature during the design and testing of a manufacturing system.

  11. Computer-aided biochemical programming of synthetic microreactors as diagnostic devices.

    PubMed

    Courbet, Alexis; Amar, Patrick; Fages, François; Renard, Eric; Molina, Franck

    2018-04-26

    Biological systems have evolved efficient sensing and decision-making mechanisms to maximize fitness in changing molecular environments. Synthetic biologists have exploited these capabilities to engineer control over information and energy processing in living cells. While engineered organisms pose important technological and ethical challenges, de novo assembly of non-living biomolecular devices could offer promising avenues toward various real-world applications. However, assembling biochemical parts into functional information processing systems has remained challenging due to extensive multidimensional parameter spaces that must be sampled comprehensively in order to identify robust, specification-compliant molecular implementations. We introduce a systematic methodology based on automated computational design and microfluidics enabling the programming of synthetic cell-like microreactors embedding biochemical logic circuits, or protosensors, to perform accurate biosensing and biocomputing operations in vitro according to temporal logic specifications. We show that proof-of-concept protosensors integrating diagnostic algorithms detect specific patterns of biomarkers in human clinical samples. Protosensors may enable novel approaches to medicine and represent a step toward autonomous micromachines capable of precisely interfacing with human physiology or other complex biological environments, ecosystems, or industrial bioprocesses.

  12. Knowledge-based control for robot self-localization

    NASA Technical Reports Server (NTRS)

    Bennett, Bonnie Kathleen Holte

    1993-01-01

    Autonomous robot systems are being proposed for a variety of missions, including the Mars rover/sample return mission. Before any other mission objective can be met, an autonomous robot must be able to determine its own location. This will be especially challenging because location sensors like GPS, which are available on Earth, will not be useful, nor will INS sensors, because their drift is too large. Another approach to self-localization is required. In this paper, we describe a novel approach to localization by applying a problem-solving methodology. The term 'problem solving' implies a computational technique based on logical representational and control steps. In this research, these steps are derived from observing experts solving localization problems. The objective is not specifically to simulate human expertise but rather to apply its techniques where appropriate for computational systems. In doing this, we describe a model for solving the problem and a system built on that model, called the Localization Control and Logic Expert (LOCALE), which is a demonstration of concept for the approach and the model. The results of this work represent the first successful solution to the high-level control aspects of the localization problem.

  13. Critical Analysis of the Mathematical Formalism of Theoretical Physics. I. Foundations of Differential and Integral Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2013-04-01

    Critical analysis of the standard foundations of differential and integral calculus, as the mathematical formalism of theoretical physics, is proposed. The methodological basis of the analysis is the unity of formal logic and rational dialectics. It is shown that: (a) the foundations (i.e. $\frac{dy}{dx} = \lim_{\delta x \to 0} \frac{\delta y}{\delta x} = \lim_{\delta x \to 0} \frac{f(x + \delta x) - f(x)}{\delta x}$, $dx \equiv \delta x$, $dy \equiv \delta y$, where $y = f(x)$ is a continuous function of one argument $x$; $\delta x$ and $\delta y$ are increments; $dx$ and $dy$ are differentials) do not satisfy the formal-logic law of identity; (b) the infinitesimal quantities $dx$, $dy$ are fictitious quantities: they have neither algebraic nor geometrical meaning, because these quantities do not take numerical values and therefore have no quantitative measure; (c) expressions of the kind $x + dx$ are erroneous because $x$ (a finite quantity) and $dx$ (an infinitely diminished quantity) have different senses, different qualitative determinacy; since $x \equiv \mathrm{const}$ under $\delta x \to 0$, a derivative does not contain the variable quantity $x$ and depends only on a constant $c$. Consequently, the standard concepts "infinitesimal quantity (uninterruptedly diminishing quantity)", "derivative", and "derivative as a function of a variable quantity" represent an incorrect basis for mathematics and theoretical physics.

  14. Heat wave hazard classification and risk assessment using artificial intelligence fuzzy logic.

    PubMed

    Keramitsoglou, Iphigenia; Kiranoudis, Chris T; Maiheu, Bino; De Ridder, Koen; Daglis, Ioannis A; Manunta, Paolo; Paganini, Marc

    2013-10-01

    The average summer temperatures as well as the frequency and intensity of hot days and heat waves are expected to increase due to climate change. Motivated by this consequence, we propose a methodology to evaluate the monthly heat wave hazard and risk and its spatial distribution within large cities. A simple urban climate model with assimilated satellite-derived land surface temperature images was used to generate a historic database of urban air temperature fields. Heat wave hazard was then estimated from the analysis of these hourly air temperatures distributed at a 1-km grid over Athens, Greece, by identifying the areas that are more likely to suffer higher temperatures in the case of a heat wave event. Innovation lies in the artificial intelligence fuzzy logic model that was used to classify the heat waves from mild to extreme by taking into consideration their duration, intensity and time of occurrence. The monthly hazard was subsequently estimated as the cumulative effect from the individual heat waves that occurred at each grid cell during a month. Finally, monthly heat wave risk maps were produced integrating geospatial information on the population vulnerability to heat waves calculated from socio-economic variables.
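
    A minimal sketch of the fuzzy classification step, assuming triangular membership functions and a made-up Mamdani-style rule base (the paper's actual membership functions and rules are not reproduced in the record):

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def classify_heat_wave(duration_days, intensity_degC):
        """Map duration and intensity to fuzzy class memberships (max-min rules)."""
        dur_short = tri(duration_days, 0, 2, 5)
        dur_long = tri(duration_days, 3, 7, 14)
        hot = tri(intensity_degC, 0, 3, 6)
        very_hot = tri(intensity_degC, 4, 8, 15)
        return {
            "mild": min(dur_short, hot),
            "severe": max(min(dur_long, hot), min(dur_short, very_hot)),
            "extreme": min(dur_long, very_hot),
        }

    scores = classify_heat_wave(duration_days=6, intensity_degC=7)
    print(max(scores, key=scores.get), scores)  # "extreme" dominates here
    ```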

  15. Fault-tolerant computer study. [logic designs for building block circuits

    NASA Technical Reports Server (NTRS)

    Rennels, D. A.; Avizienis, A. A.; Ercegovac, M. D.

    1981-01-01

    A set of building block circuits is described which can be used with commercially available microprocessors and memories to implement fault-tolerant distributed computer systems. Each building block circuit is intended for VLSI implementation as a single chip. Several building blocks and associated processor and memory chips form a self-checking computer module with self-contained input/output and interfaces to redundant communication buses. Fault tolerance is achieved by connecting self-checking computer modules into a redundant network in which backup buses and computer modules are provided to circumvent failures. The requirements and design methodology which led to the definition of the building block circuits are discussed.

  16. Financial auditing at enterprises for control of projects realized with credit fund-raising

    NASA Astrophysics Data System (ADS)

    Lukmanova, Inessa

    2017-10-01

    The article analyzes methods of conducting a financial audit as part of the construction control of projects implemented with credit fund-raising in modern conditions. This work aims to improve the methodological toolkit of construction control when lending to transport infrastructure construction projects. The paper considers correlations of the various procedures of construction control and financial audit, and the organizational and technical factors affecting investment and construction projects. The authors present a logical scheme of the process of lending to legal entities and develop an algorithm for conducting a financial audit that allows appropriate adjustments to be made and the right decision to be reached.

  17. A mixture gatekeeping procedure based on the Hommel test for clinical trial applications.

    PubMed

    Brechenmacher, Thomas; Xu, Jane; Dmitrienko, Alex; Tamhane, Ajit C

    2011-07-01

    When conducting clinical trials with hierarchically ordered objectives, it is essential to use multiplicity adjustment methods that control the familywise error rate in the strong sense while taking into account the logical relations among the null hypotheses. This paper proposes a gatekeeping procedure based on the Hommel (1988) test, which offers power advantages compared to other p-value-based tests proposed in the literature. A general description of the procedure is given, and details are presented on how it can be applied to complex clinical trial designs. Two clinical trial examples are given to illustrate the methodology developed in the paper.
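
    The Hommel (1988) adjustment on which the gatekeeping procedure builds is available off the shelf; the sketch below uses statsmodels with made-up p-values and shows only that building block, not the mixture gatekeeping logic itself:

    ```python
    from statsmodels.stats.multitest import multipletests

    # Hypothetical raw p-values for a family of null hypotheses.
    pvals = [0.012, 0.031, 0.002, 0.044]

    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="hommel")
    for p, pa, r in zip(pvals, p_adj, reject):
        print(f"raw p = {p:.3f}   Hommel-adjusted p = {pa:.3f}   reject: {r}")
    ```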

  18. Bayesics

    NASA Astrophysics Data System (ADS)

    Skilling, John

    2005-11-01

    This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. Contents: 1. Logic and probability; 2. Probability and inference; 3. Probability and model selection; 4. Prior probabilities; 5. Probability and frequency; 6. Probability and quantum mechanics; 7. Probability and fundamentalism; 8. Probability and deception; 9. Prediction and truth.

  19. You can only die thrice: death and dying of a human body in psychoanalytical perspective.

    PubMed

    Sterk, Karmen

    2010-12-01

    This paper compares the (cultural) necessity of death/dying, perceived as a sequence of Imaginary--Real--Symbolic, to Van Gennep's three-staged rite of passage. If this logic is disrupted, the subject responsible requires the attribution of a special social status and can come to embody the imagery of a life worth living. This philosophical framework, which includes epistemologies borrowed from medical anthropology, demonstrates that there is more for humans to lose than biological (Real) life; a far greater loss is to exist without a (Symbolic) reason to live. A critique of the prevalent quantitative methodology for assessing links between spirituality and the human body is added.

  20. Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan

    2012-01-01

    Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission-critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way of performing both upstream root cause analysis (RCA) and downstream effect or impact analysis. The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
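
    The upstream and downstream analyses described above reduce to reachability on a causal directed graph. A minimal sketch, with an entirely hypothetical four-node fault graph:

    ```python
    # Hypothetical causal graph: edge cause -> list of downstream effects.
    effects_of = {
        "valve_stuck": ["low_flow"],
        "pump_degraded": ["low_flow"],
        "low_flow": ["high_temp"],
        "high_temp": ["sensor_alarm"],
    }

    # Invert once so we can also walk upstream toward root causes.
    causes_of = {}
    for cause, effects in effects_of.items():
        for effect in effects:
            causes_of.setdefault(effect, []).append(cause)

    def reachable(start, graph):
        """All nodes reachable from `start` (iterative depth-first search)."""
        seen, stack = set(), [start]
        while stack:
            for nxt in graph.get(stack.pop(), []):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

    print("root-cause candidates:", reachable("sensor_alarm", causes_of))
    print("downstream impact:", reachable("low_flow", effects_of))
    ```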

  1. Can openEHR archetypes be used in a national context? The Danish archetype proof-of-concept project.

    PubMed

    Bernstein, Knut; Tvede, Ida; Petersen, Jan; Bredegaard, Kirsten

    2009-01-01

    Semantic interoperability and secondary use of data are important informatics challenges in modern healthcare. Connected Digital Health Denmark is investigating whether the openEHR reference model, archetypes, and templates could be used for representing and exchanging clinical content specifications and could become a candidate for a national logical infrastructure for semantic interoperability. The Danish archetype proof-of-concept project has tried out some elements of the openEHR methodology in cooperation with regions and vendors. The project has pointed out benefits and challenges of using archetypes, and has identified barriers that need to be addressed in the next steps.

  2. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis, and report generation.

  3. Simulated fault injection - A methodology to evaluate fault tolerant microprocessor architectures

    NASA Technical Reports Server (NTRS)

    Choi, Gwan S.; Iyer, Ravishankar K.; Carreno, Victor A.

    1990-01-01

    A simulation-based fault-injection method for validating fault-tolerant microprocessor architectures is described. The approach uses mixed-mode simulation (electrical/logic analysis), and injects transient errors in run-time to assess the resulting fault impact. As an example, a fault-tolerant architecture which models the digital aspects of a dual-channel real-time jet-engine controller is used. The level of effectiveness of the dual configuration with respect to single and multiple transients is measured. The results indicate 100 percent coverage of single transients. Approximately 12 percent of the multiple transients affect both channels; none result in controller failure since two additional levels of redundancy exist.

  4. Reasoning about Function Objects

    NASA Astrophysics Data System (ADS)

    Nordio, Martin; Calcagno, Cristiano; Meyer, Bertrand; Müller, Peter; Tschannen, Julian

    Modern object-oriented languages support higher-order implementations through function objects such as delegates in C#, agents in Eiffel, or closures in Scala. Function objects bring a new level of abstraction to the object-oriented programming model, and require a comparable extension to specification and verification techniques. We introduce a verification methodology that extends function objects with auxiliary side-effect free (pure) methods to model logical artifacts: preconditions, postconditions and modifies clauses. These pure methods can be used to specify client code abstractly, that is, independently from specific instantiations of the function objects. To demonstrate the feasibility of our approach, we have implemented an automatic prover, which verifies several non-trivial examples.
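
    A rough Python analogue of the idea (the paper targets C#, Eiffel, and Scala with static verification, not runtime checks): a function object carries pure precondition and postcondition methods, so client code can be written against the specification rather than a particular implementation.

    ```python
    class FunctionObject:
        """A callable bundled with pure (side-effect free) specification methods."""

        def __init__(self, fn, pre, post):
            self.fn = fn      # the wrapped operation
            self.pre = pre    # pure precondition: may the call proceed?
            self.post = post  # pure postcondition: is the result acceptable?

        def __call__(self, x):
            assert self.pre(x), "precondition violated"
            result = self.fn(x)
            assert self.post(x, result), "postcondition violated"
            return result

    # Client code can rely on pre/post alone, whatever fn is plugged in.
    floor_sqrt = FunctionObject(
        fn=lambda x: int(x ** 0.5),
        pre=lambda x: x >= 0,
        post=lambda x, r: r * r <= x < (r + 1) * (r + 1),
    )
    print(floor_sqrt(10))  # 3
    ```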

  5. The continuum fusion theory of signal detection applied to a bi-modal fusion problem

    NASA Astrophysics Data System (ADS)

    Schaum, A.

    2011-05-01

    A new formalism has been developed that produces detection algorithms for model-based problems in which one or more parameter values are unknown. Continuum Fusion can be used to generate different flavors of algorithm for any composite hypothesis testing problem. The methodology is defined by a fusion logic that can be translated into max/min conditions. Here it is applied to a simple sensor fusion model, but one for which the generalized likelihood ratio (GLR) test is intractable. By contrast, a fusion-based response to the same problem can be devised that is solvable in closed form and represents a good approximation to the GLR test.
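
    For context, the generalized likelihood ratio test that the fusion logic stands in for has the standard form below; this is the textbook definition, not the paper's bi-modal special case:

    ```latex
    % Composite hypothesis test with unknown parameters theta:
    \[
      \Lambda_{\mathrm{GLR}}(\mathbf{x})
        = \frac{\sup_{\theta \in \Theta_1} p(\mathbf{x} \mid \theta)}
               {\sup_{\theta \in \Theta_0} p(\mathbf{x} \mid \theta)}
        \;\gtrless\; \eta
    \]
    ```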

  6. Semantic MEDLINE for Discovery Browsing: Using Semantic Predications and the Literature-Based Discovery Paradigm to Elucidate a Mechanism for the Obesity Paradox

    PubMed Central

    Cairelli, Michael J.; Miller, Christopher M.; Fiszman, Marcelo; Workman, T. Elizabeth; Rindflesch, Thomas C.

    2013-01-01

    Applying the principles of literature-based discovery (LBD), we elucidate the paradox that obesity is beneficial in critical care despite contributing to disease generally. Our approach enhances a previous extension to LBD, called “discovery browsing,” and is implemented using Semantic MEDLINE, which summarizes the results of a PubMed search into an interactive graph of semantic predications. The methodology allows a user to construct argumentation underpinning an answer to a biomedical question by engaging the user in an iterative process between system output and user knowledge. Components of the Semantic MEDLINE output graph identified as “interesting” by the user both contribute to subsequent searches and are constructed into a logical chain of relationships constituting an explanatory network in answer to the initial question. Based on this methodology we suggest that phthalates leached from plastic in critical care interventions activate PPAR gamma, which is anti-inflammatory and abundant in obese patients. PMID:24551329

  7. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  8. Avoiding reification. Heuristic effectiveness of mathematics and the prediction of the Ω- particle

    NASA Astrophysics Data System (ADS)

    Ginammi, Michele

    2016-02-01

    According to Steiner (1998), in contemporary physics new important discoveries are often obtained by means of strategies which rely on purely formal mathematical considerations. In such discoveries, mathematics seems to have a peculiar and controversial role, which apparently cannot be accounted for by standard methodological criteria. M. Gell-Mann and Y. Ne'eman's prediction of the Ω- particle is usually considered a typical example of the application of this kind of strategy. According to Bangu (2008), this prediction is apparently based on the employment of a highly controversial principle, which he calls the "reification principle". Bangu himself takes this principle to be methodologically unjustifiable, but still indispensable to make the prediction logically sound. In the present paper I offer a new reconstruction of the reasoning that led to this prediction. By means of this reconstruction, I show that we do not need to postulate any "reificatory" role of mathematics in contemporary physics, and I contextually clarify the representative and heuristic role of mathematics in science.

  9. A method to identify and analyze biological programs through automated reasoning

    PubMed Central

    Yordanov, Boyan; Dunn, Sara-Jane; Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen

    2016-01-01

    Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine static interaction network models, which are descriptively rich but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, such approaches introduce implicit assumptions, as typically only one mechanism is considered, and exhaustively investigating all scenarios by simulation is impractical. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations into precise, predictive biological programs governing cell function. PMID:27668090
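
    The synthesis idea can be caricatured in a few lines: enumerate candidate models and keep those consistent with every observation. The toy model space and observations below are invented, and the paper's actual method uses formal reasoning rather than brute-force enumeration:

    ```python
    from itertools import product

    EFFECTS = ("activates", "inhibits", "none")

    def predicted_target(e1, e2, r1, r2):
        """Target is ON iff some activating regulator is ON and no inhibiting one is."""
        acts = [s for s, e in ((r1, e1), (r2, e2)) if e == "activates"]
        inhs = [s for s, e in ((r1, e1), (r2, e2)) if e == "inhibits"]
        return int(any(acts) and not any(inhs))

    # Hypothetical observations: (state of regulator R1, of R2, observed target).
    observations = [(1, 0, 1), (0, 0, 0), (1, 1, 1)]

    consistent = [
        (e1, e2)
        for e1, e2 in product(EFFECTS, repeat=2)
        if all(predicted_target(e1, e2, r1, r2) == t for r1, r2, t in observations)
    ]
    print(consistent)  # [('activates', 'activates'), ('activates', 'none')]
    ```

    Hypotheses can then be tested against all surviving models at once, which is the essence of reasoning over the complete consistent model set.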

  10. Method and Apparatus for Simultaneous Processing of Multiple Functions

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Andrei, Radu (Inventor)

    2017-01-01

    Electronic logic gates that operate using N logic state levels, where N is greater than 2, and methods of operating such gates. The electronic logic gates operate according to truth tables. At least two input signals each having a logic state that can range over more than two logic states are provided to the logic gates. The logic gates each provide an output signal that can have one of N logic states. Examples of gates described include NAND/NAND gates having two inputs A and B and NAND/NAND gates having three inputs A, B, and C, where A, B and C can take any of four logic states. Systems using such gates are described, and their operation illustrated. Optical logic gates that operate using N logic state levels are also described.

  11. Method and Apparatus for Simultaneous Processing of Multiple Functions

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Andrei, Radu (Inventor); Zhu, David (Inventor); Mojarradi, Mohammad Mehdi (Inventor); Vo, Tuan A. (Inventor)

    2015-01-01

    Electronic logic gates that operate using N logic state levels, where N is greater than 2, and methods of operating such gates. The electronic logic gates operate according to truth tables. At least two input signals each having a logic state that can range over more than two logic states are provided to the logic gates. The logic gates each provide an output signal that can have one of N logic states. Examples of gates described include NAND/NAND gates having two inputs A and B and NAND/NAND gates having three inputs A, B, and C, where A, B and C can take any of four logic states. Systems using such gates are described, and their operation illustrated. Optical logic gates that operate using N logic state levels are also described.
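
    One common way to generalize NAND to N logic levels is output = NOT(MIN(inputs)) with NOT(x) = (N-1) - x; whether this matches the patented truth tables is not stated in the records, so treat the sketch as illustrative only:

    ```python
    N = 4  # number of logic levels

    def nand_n(*inputs, n=N):
        """Generalized NAND: NOT(MIN(inputs)) with NOT(x) = (n-1) - x."""
        return (n - 1) - min(inputs)

    # Two-input truth table over the four logic states 0..3.
    print("A\\B " + "  ".join(str(b) for b in range(N)))
    for a in range(N):
        print(f"  {a} " + "  ".join(str(nand_n(a, b)) for b in range(N)))

    # Restricted to the extreme levels {0, 3}, it reduces to Boolean NAND.
    assert nand_n(0, 0) == 3 and nand_n(0, 3) == 3 and nand_n(3, 3) == 0
    ```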

  12. Using Abductive Research Logic: "The Logic of Discovery", to Construct a Rigorous Explanation of Amorphous Evaluation Findings

    ERIC Educational Resources Information Center

    Levin-Rozalis, Miri

    2010-01-01

    Background: Two kinds of research logic prevail in scientific research: deductive research logic and inductive research logic. However, both fail in the field of evaluation, especially evaluation conducted in unfamiliar environments. Purpose: In this article I wish to suggest the application of a research logic--"abduction"--"the logic of…

  13. Application of linear logic to simulation

    NASA Astrophysics Data System (ADS)

    Clarke, Thomas L.

    1998-08-01

    Linear logic, since its introduction by Girard in 1987, has proven expressive and powerful. Linear logic has provided natural encodings of Turing machines, Petri nets, and other computational models. Linear logic is also capable of naturally modeling resource-dependent aspects of reasoning. The distinguishing characteristic of linear logic is that it accounts for resources; two instances of the same variable are considered differently from a single instance. Linear logic thus must obey a form of the linear superposition principle. A proposition can be reasoned with only once, unless a special operator is applied. Informally, linear logic distinguishes two kinds of conjunction and two kinds of disjunction, and also introduces a modal storage operator that explicitly indicates propositions that can be reused. This paper discusses the application of linear logic to simulation. A wide variety of logics have been developed; in addition to classical logic, there are fuzzy logics, affine logics, quantum logics, etc. All of these have found application in simulations of one sort or another. The special characteristics of linear logic and its benefits for simulation will be discussed. Of particular interest is a connection that can be made between linear logic and simulated dynamics by using the concepts of Lie algebras and Lie groups. Lie groups provide the connection between the exponential modal storage operators of linear logic and the eigenfunctions of dynamic differential operators. Particularly suggestive are possible relations between complexity results for linear logic and non-computability results for dynamical systems.

  14. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    ERIC Educational Resources Information Center

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  15. Proposal and application of a regional methodology of comparative risk assessment for potentially contaminated sites.

    PubMed

    Marzocchini, Manrico; Tatàno, Fabio; Moretti, Michela Simona; Antinori, Caterina; Orilisi, Stefano

    2018-06-05

    A possible approach for determining soil and groundwater quality criteria for contaminated sites is the comparative risk assessment. Originating from but not limited to Italian interest in a decentralised (regional) implementation of comparative risk assessment, this paper first addresses the proposal of an original methodology, called CORIAN REG-M, which was created with initial attention to the context of potentially contaminated sites in the Marche Region (Central Italy). To deepen the technical-scientific knowledge and applicability of the comparative risk assessment, the following characteristics of the CORIAN REG-M methodology appear to be relevant: the simplified but logical assumption of three categories of factors (source and transfer/transport of potential contamination, and impacted receptors) within each exposure pathway; the adaptation to quality and quantity of data that are available or derivable at the given scale of concern; the attention to a reliable but unsophisticated modelling; the achievement of a conceptual linkage to the absolute risk assessment approach; and the potential for easy updating and/or refining of the methodology. Further, the application of the CORIAN REG-M methodology to some case-study sites located in the Marche Region indicated the following: a positive correlation can be expected between air and direct contact pathway scores, as well as between individual pathway scores and the overall site scores based on a root-mean-square algorithm; the exposure pathway which presents the highest variability of scores tends to be dominant at sites with the highest computed overall site scores; and the adoption of a root-mean-square algorithm can be expected to emphasise the overall site scoring.
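
    The root-mean-square aggregation mentioned in the findings is easy to make concrete. The pathway names, scores, and 0-100 scale below are hypothetical:

    ```python
    from math import sqrt

    # Hypothetical per-pathway scores for one site (0-100 scale assumed).
    pathway_scores = {"groundwater": 72.0, "air": 35.0, "direct_contact": 28.0}

    def overall_site_score(scores):
        """Root-mean-square aggregation: squaring makes high pathways dominate."""
        values = list(scores.values())
        return sqrt(sum(v * v for v in values) / len(values))

    print(round(overall_site_score(pathway_scores), 1))  # 49.0
    ```

    Because the scores are squared before averaging, the highest-scoring pathway dominates the result, which is consistent with the reported dominance of the most variable pathway at high-scoring sites.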

  16. [Role of an educational-and-methodological complex in the optimization of teaching at the stage of additional professional education of physicians in the specialty "anesthesiology and reanimatology"].

    PubMed

    Buniatian, A A; Sizova, Zh M; Vyzhigina, M A; Shikh, E V

    2010-01-01

    An educational-and-methodological complex (EMC) in the specialty "Anesthesiology and Reanimatology", which promotes manageability, flexibility, and dynamism of the educational process, is of great importance in systematizing knowledge and ensuring its best learning by physicians at the stage of additional professional education (APE). The EMC is a set of educational-and-methodological materials required to organize and conduct an educational process for the advanced training of anesthesiologists and resuscitation specialists at the stage of APE. The EMC includes a syllabus for training in the area "Anesthesiology and Reanimatology" by the appropriate training pattern (certification cycles, topical advanced training cycles); a work program for training in the specialty "Anesthesiology and Reanimatology"; work curricula for training in allied specialties (surgery, traumatology and orthopedics, obstetrics and gynecology, and pediatrics); work programs on basic disciplines (pharmacology, normal and pathological physiology, normal anatomy, chemistry, and biology); work programs in the area "Public health care and health care service"; guidelines for the teacher; educational-and-methodological materials for the student; and quiz programs. The core of the EMC in the specialty "Anesthesiology and Reanimatology" is the work program. Thus, the educational-and-methodological and teaching materials included in the EMC in the specialty "Anesthesiology and Reanimatology" should provide a logically successive exposition of the teaching material and make use of currently available methods and educational facilities, which facilitates the optimization of the training of anesthesiologists and resuscitation specialists at the stage of APE.

  17. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  18. Dynamic Event Tree advancements and control logic improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

    The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for probabilistic risk assessment, uncertainty quantification, data mining analysis, and optimization studies. RAVEN is currently equipped with three different sampling categories: forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid Sampler, Factorials, etc.), adaptive samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.), and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities that have been done in order to: start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide a control logic capability to all MOOSE-based applications, an initial migration activity was started this fiscal year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework. In this document, a brief explanation of what has been done is reported. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named the “Hybrid Dynamic Event Tree” (HDET) and its adaptive variant, the “Adaptive Hybrid Dynamic Event Tree” (AHDET). As other authors have already reported, among the different types of uncertainties it is possible to discern two principal types: aleatory and epistemic uncertainties. The classical Dynamic Event Tree treats the first class (aleatory) of uncertainties; the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees). The Monte Carlo method employs a pre-sampling of the input space characterized by epistemic uncertainties. The consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed that does not limit the exploration of the epistemic space to a Monte Carlo method but uses all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hypercube, Grid, Stratified, and Monte Carlo sampling in order to explore the epistemic space without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its exploration of the aleatory space. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy, in which the DET is used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only. This report documents how that approach has been extended in order to consider the epistemic space interacting with the Hybrid Dynamic Event Tree methodology.

  19. Practical proof of CP element based design for 14nm node and beyond

    NASA Astrophysics Data System (ADS)

    Maruyama, Takashi; Takita, Hiroshi; Ikeno, Rimon; Osawa, Morimi; Kojima, Yoshinori; Sugatani, Shinji; Hoshino, Hiromi; Hino, Toshio; Ito, Masaru; Iizuka, Tetsuya; Komatsu, Satoshi; Ikeda, Makoto; Asada, Kunihiro

    2013-03-01

    To realize HVM (High Volume Manufacturing) with CP (Character Projection) based EBDW, shot count reduction is the essential key. All device circuits should be composed of predefined character parts; we call this methodology "CP element based design". In our previous work, we presented the following three concepts [2]. 1) Memory: we reported the prospects of affordability for the CP-stencil resource. 2) Logic cell: we adopted a multi-cell clustering approach in the physical synthesis. 3) Random interconnect: we proposed an ultra-regular layout scheme using fixed-size wiring tiles containing repeated tracks and cutting points at the tile edges. In this paper, we report the experimental proofs of these methodologies. In full-chip layout, CP stencil resource management is critical. From the MCC-POC (Proof of Concept) result [1], we assumed the total available CP stencil resource to be 9000 μm². All circuit macros must be laid out within this restriction. The assignment of CP-stencil resources for the memory macros is especially important, as they consume a considerable share of the resource because of the various line-ups, such as 1RW- and 2RW-SRAMs, register files, and ROM, which require several varieties of large peripheral circuits. Furthermore, the memory macros typically occupy more than 40% of the die area in leading-edge logic LSI products, so the impact of a shot count increase is serious. To realize CP-stencil resource saving, we constructed an automatic CP analyzing system. We developed two extraction modes: simple division by block, and layout repeatability recognition. By properly choosing between these modes based on the characteristics of each peripheral circuit, we could minimize the consumption of CP stencil resources. The estimation for the 14nm technology node was performed based on the analysis of a practical memory compiler. The required resource for the memory macros proved to be an affordable value, 60% of the full CP stencil resource, and the wafer-level converted shot count proved to meet 100 WPH throughput. In logic cell design, circuit performance after cell clustering was verified. Cell clustering based on physical distance proved to incur a large penalty, mainly in wiring length. To reduce this design penalty, we propose CP cell clustering based on logical distance. For shot-count reduction in random interconnect area design, we propose a more structural routing architecture consisting of track exchange and via position arrangement. Putting these design approaches together, we can design CP stencils that hit the target throughput within the area constraint. The analysis of other macros, such as analog, I/O, and DUMMY, proved that no special CP design approach is needed beyond legacy pattern-matching CP extraction. From all these experimental results, we obtain good prospects for the reality of full CP element based layout.

  20. Pass-transistor very large scale integration

    NASA Technical Reports Server (NTRS)

    Maki, Gary K. (Inventor); Bhatia, Prakash R. (Inventor)

    2004-01-01

    Logic elements are provided that permit reductions in layout size and avoidance of hazards. Such logic elements may be included in libraries of logic cells. A logical function to be implemented by the logic element is decomposed about logical variables to identify factors corresponding to combinations of the logical variables and their complements. A pass transistor network is provided for implementing the pass network function in accordance with this decomposition. The pass transistor network includes ordered arrangements of pass transistors that correspond to the combinations of variables and complements resulting from the logical decomposition. The logic elements may act as selection circuits and be integrated with memory and buffer elements.
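
    The decomposition described in this abstract is essentially Shannon expansion, f = x·f(x=1) + x'·f(x=0), with each cofactor driving one branch of the pass network. A quick Python check on a 3-input majority function (the function choice is arbitrary, not from the patent):

    ```python
    def majority(a, b, c):
        """Arbitrary example function: 3-input majority vote."""
        return int(a + b + c >= 2)

    def cofactor(f, index, value):
        """Restrict f by fixing the input at `index` to a constant."""
        def g(*args):
            full = list(args)
            full.insert(index, value)
            return f(*full)
        return g

    f1 = cofactor(majority, 0, 1)  # majority with a = 1, i.e. b OR c
    f0 = cofactor(majority, 0, 0)  # majority with a = 0, i.e. b AND c

    # The pass network acts as a 2:1 mux: variable a (or its complement)
    # gates which cofactor's value is passed to the output.
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                assert (f1(b, c) if a else f0(b, c)) == majority(a, b, c)
    print("Shannon expansion about 'a' reproduces the majority function")
    ```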

  1. People Like Logical Truth: Testing the Intuitive Detection of Logical Value in Basic Propositions.

    PubMed

    Nakamura, Hiroko; Kawaguchi, Jun

    2016-01-01

    Recent studies on logical reasoning have suggested that people are intuitively aware of the logical validity of syllogisms or that they intuitively detect conflict between heuristic responses and logical norms via slight changes in their feelings. According to logical intuition studies, logically valid or heuristic logic no-conflict reasoning is fluently processed and induces positive feelings without conscious awareness. One criticism states that such effects of logicality disappear when confounding factors such as the content of syllogisms are controlled. The present study used abstract propositions and tested whether people intuitively detect logical value. Experiment 1 presented four logical propositions (conjunctive, biconditional, conditional, and material implications) regarding a target case and asked the participants to rate the extent to which they liked the statement. Experiment 2 tested the effects of matching bias, as well as intuitive logic, on the reasoners' feelings by manipulating whether the antecedent or consequent (or both) of the conditional was affirmed or negated. The results showed that both logicality and matching bias affected the reasoners' feelings, and people preferred logically true targets over logically false ones for all forms of propositions. These results suggest that people intuitively detect what is true from what is false during abstract reasoning. Additionally, a Bayesian mixed model meta-analysis of conditionals indicated that people's intuitive interpretation of the conditional "if p then q" fits better with the conditional probability, q given p.

  2. Fuzzy Versions of Epistemic and Deontic Logic

    NASA Technical Reports Server (NTRS)

    Gounder, Ramasamy S.; Esterline, Albert C.

    1998-01-01

    Epistemic and deontic logics are modal logics, respectively, of knowledge and of the normative concepts of obligation, permission, and prohibition. Epistemic logic is useful in formalizing systems of communicating processes and knowledge and belief in AI (Artificial Intelligence). Deontic logic is useful in computer science wherever we must distinguish between actual and ideal behavior, as in fault tolerance and database integrity constraints. We here discuss fuzzy versions of these logics. In the crisp versions, various axioms correspond to various properties of the structures used in defining the semantics of the logics. Thus, any axiomatic theory will be characterized not only by its axioms but also by the set of properties holding of the corresponding semantic structures. Fuzzy logic does not proceed with axiomatic systems, but fuzzy versions of the semantic properties exist and can be shown to correspond to some of the axioms for the crisp systems in special ways that support dependency networks among assertions in a modal domain. This in turn allows one to implement truth maintenance systems. To our knowledge, we are the first to address fuzzy epistemic and fuzzy deontic logic explicitly and to consider the different systems and semantic properties available. We give the syntax and semantics of epistemic logic and discuss the correspondence between axioms of epistemic logic and properties of semantic structures. The same topics are covered for deontic logic. We then discuss the relationship between axioms and semantic properties for the fuzzy versions of these logics. Our results can be exploited in truth maintenance systems.

  3. MOE vs. M&E: considering the difference between measuring strategic effectiveness and monitoring tactical evaluation.

    PubMed

    Diehl, Glen; Major, Solomon

    2015-01-01

    Measuring the effectiveness of military Global Health Engagements (GHEs) has become an area of increasing interest to the military medical field. As a result, there have been efforts to more logically and rigorously evaluate GHE projects and programs; many of these have been based on the Logic and Results Frameworks. However, while these Frameworks are apt and appropriate planning tools, they are not ideally suited to measuring programs' effectiveness. This article introduces military medicine professionals to the Measures of Effectiveness for Defense Engagement and Learning (MODEL) program, which implements a new method of assessment, one that seeks to rigorously use Measures of Effectiveness (vs. Measures of Performance) to gauge programs' and projects' success and fidelity to Theater Campaign goals. While the MODEL method draws on the Logic and Results Frameworks where appropriate, it goes beyond their planning focus by using the latest social scientific and econometric evaluation methodologies to link on-the-ground GHE "lines of effort" to the realization of national and strategic goals and end-states. It is hoped these methods will find use beyond the MODEL project itself, and will catalyze a new body of rigorous, empirically based work, which measures the effectiveness of a broad spectrum of GHE and security cooperation activities. We based our strategies on the principle that it is much more cost-effective to prevent conflicts than it is to stop one once it's started. I cannot overstate the importance of our theater security cooperation programs as the centerpiece to securing our Homeland from the irregular and catastrophic threats of the 21st Century.-GEN James L. Jones, USMC (Ret.).

  4. Algorithms and theory for the design and programming of industrial control systems materialized with PLC's

    NASA Astrophysics Data System (ADS)

    Montoya Villena, Rafael

    In line with its title, the general objective of this thesis is to develop a clear, simple, and systematic methodology for programming PLC-type devices. With this aim in mind, we will use the following elements. Codification of all variable types: this element is very important, since it allows us to work with little information; the necessary rules are given to codify all types of phrases produced in industrial processes. An algorithm that describes process evolution, called the process D.F.: this is one of the most important contributions, since, together with information codification, it will allow us to represent the process evolution graphically, whatever design theory is used. Theory selection: evidently, some design method is needed to obtain the logic equations; for this particular case we will use binodal theory, an ideal theory for wired technologies, since it can obtain highly reduced schemas for relatively simple automatisms, which means a minimum number of components. User program outline algorithm (D.F.P.): this is another necessary contribution and perhaps the most important one, since the logic equations resulting from binodal theory are compatible with process evolution if wired technology is used, whether electric, electronic, pneumatic, etc. On the other hand, the performance characteristics of PLC devices make the order of the program instructions determine whether or not the automatism is valid, as we have proven in different articles and lectures at national and international congresses. Therefore, we will codify the information concerning the process to be automated, graphically represent its temporal evolution, and, by applying binodal theory and the D.F.P. (previously adapted), succeed in making the logic equations compatible with the process to be automated and with the device in which they will be implemented (a PLC in our case).

  5. People Like Logical Truth: Testing the Intuitive Detection of Logical Value in Basic Propositions

    PubMed Central

    2016-01-01

    Recent studies on logical reasoning have suggested that people are intuitively aware of the logical validity of syllogisms or that they intuitively detect conflict between heuristic responses and logical norms via slight changes in their feelings. According to logical intuition studies, logically valid or heuristic logic no-conflict reasoning is fluently processed and induces positive feelings without conscious awareness. One criticism states that such effects of logicality disappear when confounding factors such as the content of syllogisms are controlled. The present study used abstract propositions and tested whether people intuitively detect logical value. Experiment 1 presented four logical propositions (conjunctive, biconditional, conditional, and material implications) regarding a target case and asked the participants to rate the extent to which they liked the statement. Experiment 2 tested the effects of matching bias, as well as intuitive logic, on the reasoners’ feelings by manipulating whether the antecedent or consequent (or both) of the conditional was affirmed or negated. The results showed that both logicality and matching bias affected the reasoners’ feelings, and people preferred logically true targets over logically false ones for all forms of propositions. These results suggest that people intuitively detect what is true from what is false during abstract reasoning. Additionally, a Bayesian mixed model meta-analysis of conditionals indicated that people’s intuitive interpretation of the conditional “if p then q” fits better with the conditional probability, q given p. PMID:28036402

  6. Radiation tolerant combinational logic cell

    NASA Technical Reports Server (NTRS)

    Maki, Gary R. (Inventor); Whitaker, Sterling (Inventor); Gambles, Jody W. (Inventor)

    2009-01-01

    A system has a reduced sensitivity to Single Event Upset and/or Single Event Transient(s) compared to traditional logic devices. In a particular embodiment, the system includes an input, a logic block, a bias stage, a state machine, and an output. The logic block is coupled to the input. The logic block is for implementing a logic function, receiving a data set via the input, and generating a result by applying the data set to the logic function. The bias stage is coupled to the logic block. The bias stage is for receiving the result from the logic block and presenting it to the state machine. The state machine is coupled to the bias stage. The state machine is for receiving, via the bias stage, the result generated by the logic block. The state machine is configured to retain a state value for the system. The state value is typically based on the result generated by the logic block. The output is coupled to the state machine. The output is for providing the value stored by the state machine. Some embodiments of the invention produce dual-rail outputs Q and Q'. The logic block typically contains combinational logic and is similar, in size and transistor configuration, to a conventional CMOS combinational logic design. However, only a very small portion of the circuits of these embodiments is sensitive to Single Event Upsets and/or Single Event Transients.

  7. Reversible logic gates on Physarum Polycephalum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schumann, Andrew

    2015-03-10

    In this paper, we consider how asynchronous sequential logic gates and quantum-style reversible logic gates can be implemented using Physarum polycephalum motions. We show that in asynchronous sequential logic gates information can be erased because of uncertainty in the direction of plasmodium propagation. Quantum-style reversible logic gates are therefore preferable for designing logic circuits on Physarum polycephalum.
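    For readers unfamiliar with reversible gates, the following Python sketch illustrates the general concept (the standard Fredkin controlled-swap gate, not the Physarum implementation itself): its three-bit mapping is a bijection, so it erases no information.

        def fredkin(c, a, b):
            """Fredkin (controlled-swap) gate: if control bit c is 1, swap a and b."""
            return (c, b, a) if c else (c, a, b)

        # The mapping is its own inverse, so every output determines its input.
        for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
            assert fredkin(*fredkin(*bits)) == bits  # reversibility check
        print("Fredkin gate is reversible on all 8 inputs")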

  8. Applications of Logic Coverage Criteria and Logic Mutation to Software Testing

    ERIC Educational Resources Information Center

    Kaminski, Garrett K.

    2011-01-01

    Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…
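    As a minimal illustration of the first approach (a hedged sketch using a hypothetical decision under test; the dissertation treats many more criteria), the following Python snippet selects a small test set in which the predicate and each of its clauses take both truth values:

        from itertools import product

        def predicate(a, b, c):
            # Hypothetical decision under test: (a and b) or c
            return (a and b) or c

        tests, seen = [], set()
        for a, b, c in product([False, True], repeat=3):
            outcome = predicate(a, b, c)
            # Which (clause, value) and (predicate, value) pairs this test covers.
            covered = {("a", a), ("b", b), ("c", c), ("P", outcome)}
            if not covered <= seen:          # keep the test only if it adds coverage
                tests.append((a, b, c))
                seen |= covered

        print(tests)  # a small test set achieving predicate and clause coverage

    Stronger criteria such as MC/DC additionally require each clause to be shown to independently affect the predicate, but the selection loop has the same flavor.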

  9. Boolean logic tree of graphene-based chemical system for molecular computation and intelligent molecular search query.

    PubMed

    Huang, Wei Tao; Luo, Hong Qun; Li, Nian Bing

    2014-05-06

    The most serious, and as yet unsolved, problem in constructing molecular computing devices is connecting all of the molecular events into a usable device. This report demonstrates the use of a Boolean logic tree for analyzing the chemical event network based on graphene, an organic dye, a thrombin aptamer, and the Fenton reaction, organizing and connecting these basic chemical events. This chemical event network can then be utilized to implement fluorescent combinatorial logic (including basic logic gates and complex integrated logic circuits) and fuzzy logic computing. On the basis of the Boolean logic tree analysis and logic computing, the basic chemical events can be treated as programmable "words" and chemical interactions as "syntax" logic rules with which to construct a molecular search engine for performing intelligent molecular search queries. Our approach is helpful in developing advanced logic programs based on molecules for applications in biosensing, nanotechnology, and drug delivery.
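    The logic-tree formalism itself is easy to state in code; the sketch below (with hypothetical event names standing in for the chemical inputs) evaluates a small Boolean tree of the kind used to organize such events:

        # Minimal sketch of a Boolean logic tree: leaves are named inputs,
        # internal nodes are gate labels with child subtrees.
        def evaluate(node, inputs):
            if isinstance(node, str):                 # leaf: an input signal
                return inputs[node]
            gate, children = node
            values = [evaluate(child, inputs) for child in children]
            if gate == "AND":
                return all(values)
            if gate == "OR":
                return any(values)
            if gate == "NOT":
                return not values[0]
            raise ValueError(f"unknown gate {gate}")

        # (dye AND aptamer) OR (NOT fenton) -- hypothetical event names
        tree = ("OR", [("AND", ["dye", "aptamer"]), ("NOT", ["fenton"])])
        print(evaluate(tree, {"dye": True, "aptamer": False, "fenton": True}))  # False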

  10. The case for multimodal analysis of atypical interaction: questions, answers and gaze in play involving a child with autism.

    PubMed

    Muskett, Tom; Body, Richard

    2013-01-01

    Conversation analysis (CA) continues to accrue interest within clinical linguistics as a methodology that can enable elucidation of structural and sequential orderliness in interactions involving participants who produce ostensibly disordered communication behaviours. However, it can be challenging to apply CA to re-examine clinical phenomena that have initially been defined in terms of linguistics, as a logical starting point for analysis may be to focus primarily on the organisation of language ("talk") in such interactions. In this article, we argue that CA's methodological power can only be fully exploited in this research context when a multimodal analytic orientation is adopted, where due consideration is given to participants' co-ordinated use of multiple semiotic resources including, but not limited to, talk (e.g., gaze, embodied action, object use and so forth). To evidence this argument, a two-layered analysis of unusual question-answer sequences in a play episode involving a child with autism is presented. It is thereby demonstrated that only when the scope of enquiry is broadened to include gaze and other embodied action can an account be generated of orderliness within these sequences. This finding has important implications for CA's application as a research methodology within clinical linguistics.

  11. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    USGS Publications Warehouse

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
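    To make the renewal calculation concrete, here is a small Monte Carlo sketch (illustrative parameter values only, not those of the study) estimating the conditional probability of rupture in the next 30 years given the time elapsed since the last event, under a lognormal recurrence model:

        import numpy as np

        rng = np.random.default_rng(0)
        mean_ri, aperiodicity = 150.0, 0.5          # illustrative values (years)
        sigma = np.sqrt(np.log(1 + aperiodicity**2))
        mu = np.log(mean_ri) - 0.5 * sigma**2       # lognormal parameters

        elapsed, horizon = 120.0, 30.0              # years open; forecast window
        samples = rng.lognormal(mu, sigma, 1_000_000)

        conditioned = samples[samples > elapsed]    # condition on the open interval
        prob = np.mean(conditioned <= elapsed + horizon)
        print(f"P(rupture within {horizon} yr | {elapsed} yr elapsed): {prob:.3f}")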

  12. Critical reflections on realist review: insights from customizing the methodology to the needs of participatory research assessment.

    PubMed

    Jagosh, Justin; Pluye, Pierre; Wong, Geoff; Cargo, Margaret; Salsberg, Jon; Bush, Paula L; Herbert, Carol P; Green, Lawrence W; Greenhalgh, Trish; Macaulay, Ann C

    2014-06-01

    Realist review has increased in popularity as a methodology for complex intervention assessment. Our experience suggests that the process of designing a realist review requires its customization to the areas under investigation. To elaborate on this idea, we first describe the logic underpinning realist review and then present critical reflections on our application experience, organized into seven areas. These are the following: (1) the challenge of identifying middle-range theory; (2) addressing heterogeneity and lack of conceptual clarity; (3) the challenge of appraising the quality of complex evidence; (4) the relevance of capturing unintended outcomes; (5) understanding the process of context, mechanism, and outcome (CMO) configuring; (6) incorporating middle-range theory in the CMO configuration process; and (7) using middle-range theory to advance the conceptualization of outcomes, both visible and seemingly 'hidden'. One conclusion from our experience is that the degree of heterogeneity of the evidence base will determine whether theory can drive the development of review protocols from the outset, or will follow only after an intense period of data immersion. We hope that presenting a critical reflection on customizing realist review will convey how the methodology can be tailored to the often complex and idiosyncratic features of health research, leading to innovative evidence syntheses. Copyright © 2013 John Wiley & Sons, Ltd.

  13. Integrated teaching program using case-based learning

    PubMed Central

    Bhardwaj, Pankaj; Bhardwaj, Nikha; Mahdi, Farzana; Srivastava, J P; Gupta, Uma

    2015-01-01

    Background: At present, in medical schools, students are taught subject-wise in different departments, without integration to interrelate or unify the subjects; this results in a compartmentalization of medical education, with no stress on case-based learning. Therefore, an effort was made to develop and adopt integrated teaching in order to build better contextual knowledge among students. Methodology and Implementation: After faculty orientation training, four "topic committees" with faculty members from different departments were constituted; these decided and agreed on the content material to be taught and the different methodologies to be used, along with the logical sequencing of the same, for the purpose of implementation. The different teaching methodologies used during the program were didactic lectures, case-stimulated sessions, clinical visits, laboratory work, and small-group student seminars. Results: After the implementation of the program, comparisons between two batches, as well as between topics taught with the integrated learning program versus the traditional method, showed that students performed better in the topics taught with the integrated approach. Students rated "clinical visits" as a very good methodology, followed by "case-stimulated interactive sessions." Students believed that they felt more actively involved and that their queries were better addressed with such interactive sessions. Conclusion: There is a very good perception of integrated teaching among students. Students performed better when taught using this technique. Although the majority of faculty found integrated teaching a useful method, the extra work burden and interdepartmental coordination remained challenging. PMID:26380204

  14. Reconfigurable Optical Directed-Logic Circuits

    DTIC Science & Technology

    2015-11-20

    AFRL-AFOSR-VA-TR-2016-0053: Reconfigurable Optical Directed-Logic Circuits. Final report by Jacob T. Robinson and Qianfan Xu, William Marsh Rice University, 6100 Main Street, Houston, TX; 20 November 2015; grant FA9550-12-1-0261. Section 1 of the report motivates directed-logic circuits.

  15. Rule-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework for defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. Our logic, EAGLE, is implemented as a Java library and involves novel techniques for rule definition, manipulation and execution. Monitoring is done on a state-by-state basis, without storing the execution trace.
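    A toy illustration of such state-by-state monitoring (far simpler than EAGLE's rule language, and using hypothetical state fields) tracks the finite-trace properties "always p" and "eventually q" without storing the trace:

        class Monitor:
            """Tracks 'always p' and 'eventually q' state-by-state over a trace."""
            def __init__(self):
                self.always_p = True
                self.eventually_q = False

            def step(self, state):
                self.always_p = self.always_p and state["p"]
                self.eventually_q = self.eventually_q or state["q"]

            def verdict(self):
                return self.always_p and self.eventually_q

        m = Monitor()
        for state in [{"p": True, "q": False}, {"p": True, "q": True}]:
            m.step(state)              # only the summary bits are kept, not the trace
        print(m.verdict())             # True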

  16. Does your model weigh the same as a Duck?

    NASA Astrophysics Data System (ADS)

    Jain, Ajay N.; Cleves, Ann E.

    2012-01-01

    Computer-aided drug design is a mature field by some measures, and it has produced notable successes that underpin the study of interactions between small molecules and living systems. However, unlike a truly mature field, fallacies of logic lie at the heart of the arguments in support of major lines of research on methodology and validation thereof. Two particularly pernicious ones are cum hoc ergo propter hoc (with this, therefore because of this) and confirmation bias (seeking evidence that is confirmatory of the hypothesis at hand). These fallacies will be discussed in the context of off-target predictive modeling, QSAR, molecular similarity computations, and docking. Examples will be shown that avoid these problems.

  17. Fundamentals handbook of electrical and computer engineering. Volume 1 Circuits fields and electronics

    NASA Astrophysics Data System (ADS)

    Chang, S. S. L.

    State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.

  18. More inclusive workplaces: fact or fiction? The case of Norway.

    PubMed

    Olsen, T; Svendal, A; Amundsen, I

    2005-10-01

    Over the past several years, the focus on workfare programmes in Norway has increased in order to address a number of identified problems in the labour market. The Tripartite Agreement on a More Inclusive Workplace of October 2001 is one of the measures introduced to create a more inclusive workplace, reduce the utilization of disability benefits and sick leave, and retain senior employees longer. The recommended methodology is an improved employee-employer dialogue and an increased focus on what the employee can do (workability). This paper is a critical review of why the Agreement has, so far, not lived up to expectations. We also question whether the logic of the Tripartite Agreement is a solution to the problem.

  19. X-Phi and Carnapian Explication.

    PubMed

    Shepherd, Joshua; Justus, James

    2015-04-01

    The rise of experimental philosophy (x-phi) has placed metaphilosophical questions, particularly those concerning concepts, at the center of philosophical attention. X-phi offers empirically rigorous methods for identifying conceptual content, but what exactly it contributes towards evaluating conceptual content remains unclear. We show how x-phi complements Rudolf Carnap's underappreciated methodology for concept determination, explication. This clarifies and extends x-phi's positive philosophical import, and also exhibits explication's broad appeal. But there is a potential problem: Carnap's account of explication was limited to empirical and logical concepts, but many concepts of interest to philosophers (experimental and otherwise) are essentially normative. With formal epistemology as a case study, we show how x-phi assisted explication can apply to normative domains.

  20. Atomic evidence that modification of H-bonds established with amino acids critical for host-cell binding induces sterile immunity against malaria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patarroyo, Manuel E., E-mail: mepatarr@mail.com; Universidad Nacional de Colombia, Bogota; Cifuentes, Gladys

    Based on the 3D X-ray crystallographic structures of relevant proteins of the malaria parasite involved in invasion of host cells, and on 3D NMR structures of High Activity Binding Peptides (HABPs) and their respective analogues, it was found that HABPs are rendered highly immunogenic and capable of inducing sterile immunity in the Aotus experimental model by modifying those amino acids that establish H-bonds with other HABPs or with host cells. This finding adds striking and novel physicochemical principles, at the atomic level, to a logical and rational vaccine development methodology against infectious diseases, among them malaria.

  1. Phenomenology and management of cognitive and behavioral disorders in Parkinson's disease. Rise and logic of dementia in Parkinson's disease.

    PubMed

    Potagas, Constantin; Papageorgiou, Sokratis

    2006-08-08

    An overview of studies on the issue of dementia in Parkinson's disease shows that, over time, there has been an evolution in the perception of the magnitude of the problem and of its nature. Dementia seems today to be part of the disease. This change in the understanding of the disease can be accounted for by various methodological problems and by difficulties, on one hand, in the definition of dementia and its differentiation from other conditions, and, on the other hand, in the diagnosis of the disease itself in individual cases. Optimal therapeutic strategies are also examined, either based on cholinesterase inhibitors or antiparkinsonian drugs and symptomatic measures.

  2. Theorem Proving in Intel Hardware Design

    NASA Technical Reports Server (NTRS)

    O'Leary, John

    2009-01-01

    For the past decade, a framework combining model checking (symbolic trajectory evaluation) and higher-order logic theorem proving has been in production use at Intel. Our tools and methodology have been used to formally verify execution cluster functionality (including floating-point operations) for a number of Intel products, including the Pentium® 4 and Core™ i7 processors. Hardware verification in 2009 is much more challenging than it was in 1999: today's CPU chip designs contain many processor cores and significant firmware content. This talk will attempt to distill the lessons learned over the past ten years, discuss how they apply to today's problems, and outline some future directions.

  3. Semantic Modelling of Digital Forensic Evidence

    NASA Astrophysics Data System (ADS)

    Kahvedžić, Damir; Kechadi, Tahar

    The reporting of digital investigation results is traditionally carried out in prose, and a large investigation may require successive communication of findings between different parties. Popular forensic suites aid in the reporting process by storing provenance and positional data, but they do not automatically encode why the evidence is considered important. In this paper we introduce an evidence management methodology to encode the semantic information of evidence. A structured vocabulary of terms, an ontology, is used to model the results in a logical and predefined manner. The descriptions are application-independent and automatically organised. The encoded descriptions aim to help investigators in the tasks of report writing and evidence communication, and can be used in addition to existing evidence management techniques.

  4. Critical Review of Hamby's (2014) Article Titled "Intimate Partner and Sexual Violence Research, Scientific Progress, Scientific Challenges, and Gender".

    PubMed

    Winstok, Zeev

    2015-07-28

    In a recent article, Hamby advocates the replacement of the "old" Conflict Tactics Scales used to measure physical partner violence (PV) with a new measurement instrument that represents and supports the thesis that gender use of physical PV is asymmetrical rather than symmetrical. This article takes a critical look at the logic, assumptions, arguments, examples, interpretations, and conclusions presented in Hamby's article, and in some cases disagrees with them. Furthermore, this article uses Hamby's proposals as an opportunity to review and examine core issues in the study of the perpetration of physical PV, including gender-related theoretical and methodological issues. © The Author(s) 2015.

  5. The Proposal Concept of Development and Implementation in Strategy of Sustainable Corporate Social Responsibility in the Context of the HCS Model 3E

    NASA Astrophysics Data System (ADS)

    Sakál, Peter; Hrdinová, Gabriela

    2016-06-01

    This article presents a conceptual design methodology for developing a strategy of sustainable corporate social responsibility (SCSR) in the context of the HCS model 3E, formulated by the co-author within the stated grants and dissertation. On the basis of propositional logic, the SCSR procedure is proposed for incorporation into the corporate strategy of sustainable development and the integrated management system (IMS) of an industrial enterprise. The aim of this article is to propose a concept for the development and implementation of an SCSR strategy in the context of the HCS model 3E.

  6. A molybdenum disulfide/carbon nanotube heterogeneous complementary inverter.

    PubMed

    Huang, Jun; Somu, Sivasubramanian; Busnaina, Ahmed

    2012-08-24

    We report a simple, bottom-up/top-down approach for integrating drastically different nanoscale building blocks to form a heterogeneous complementary inverter circuit based on layered molybdenum disulfide and carbon nanotube (CNT) bundles. The fabricated CNT/MoS(2) inverter is composed of n-type molybdenum disulfide (MoS(2)) and p-type CNT transistors, with a high voltage gain of 1.3. The CNT channels are fabricated using directed assembly, while the layered molybdenum disulfide channels are fabricated by mechanical exfoliation. This bottom-up fabrication approach for integrating various nanoscale elements with unique characteristics provides a cost-effective alternative methodology to complementary metal-oxide-semiconductor (CMOS) technology, laying the foundation for the realization of high-performance logic circuits.

  7. Specifying real-time systems with interval logic

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1988-01-01

    Pure temporal logic makes no reference to time. An interval temporal logic, and an extension of that logic which includes real-time constraints, are described. The application of this logic is demonstrated by giving a specification for the well-known lift (elevator) example. It is shown how interval logic can be extended to include a notion of process. It is then described how the specification language and verification environment of EHDM could be enhanced to support this logic. A specification of the alternating bit protocol in this extended version of the EHDM specification language is given.

  8. Low delay and area efficient soft error correction in arbitration logic

    DOEpatents

    Sugawara, Yutaka

    2013-09-10

    There is provided an arbitration logic device for controlling access to a shared resource. The arbitration logic device comprises at least one storage element, a winner selection logic device, and an error detection logic device. The storage element stores a plurality of requestors' information. The winner selection logic device selects a winner requestor among the requestors based on the requestors' information received from them. The winner selection logic device selects the winner requestor without checking whether there is a soft error in the winner requestor's information.
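    The division of labour described (fast winner selection, with error detection kept off the critical path) can be suggested with a small Python sketch; the fixed-priority scheme and the parity encoding here are illustrative assumptions, not the patented circuit:

        def parity_ok(word):
            """Even-parity check over an integer-coded requestor word."""
            return bin(word).count("1") % 2 == 0

        def select_winner(requests):
            """Pick the highest-priority active requestor (lowest index wins)."""
            for index, (active, word) in enumerate(requests):
                if active:
                    return index, word
            return None, None

        # (active?, info word) for three requestors
        requests = [(False, 0b1010_0001), (True, 0b0110_0000), (True, 0b1111_0000)]
        winner, word = select_winner(requests)   # selection ignores soft errors
        print(winner, parity_ok(word))           # error detection runs separately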

  9. Spintronic logic: from switching devices to computing systems

    NASA Astrophysics Data System (ADS)

    Friedman, Joseph S.

    2017-09-01

    Though numerous spintronic switching devices have been proposed or demonstrated, there has been significant difficulty in translating these advances into practical computing systems. The challenge of cascading has impeded the integration of multiple devices into a logic family, and several proposed solutions potentially overcome these challenges. Here, the cascading techniques by which the output of each spintronic device can drive the input of another device are described for several logic families, including spin-diode logic (in particular, all-carbon spin logic), complementary magnetic tunnel junction logic (CMAT), and emitter-coupled spin-transistor logic (ECSTL).

  10. Enzymatic AND logic gates operated under conditions characteristic of biomedical applications.

    PubMed

    Melnikov, Dmitriy; Strack, Guinevere; Zhou, Jian; Windmiller, Joshua Ray; Halámek, Jan; Bocharova, Vera; Chuang, Min-Chieh; Santhosh, Padmanabhan; Privman, Vladimir; Wang, Joseph; Katz, Evgeny

    2010-09-23

    Experimental and theoretical analyses are performed of enzymatic AND logic gates based on lactate dehydrogenase and glutathione reductase, in which the enzymes and their substrates serve as logic inputs. These two systems are examples of a novel, previously unexplored class of biochemical logic gates that illustrate potential biomedical applications of biochemical logic. They are characterized by input concentrations at the logic 0 and 1 states corresponding to normal and pathophysiological conditions, respectively. Our analysis shows that the logic gates under investigation have similar noise characteristics. Both significantly amplify random noise present in the inputs; however, we establish that for realistic widths of the input noise distributions it is still possible to differentiate between the logic 0 and 1 states of the output. This indicates that reliable detection of pathophysiological conditions is indeed possible with such enzyme logic systems.

  11. Fuzzy logic controller optimization

    DOEpatents

    Sepe, Jr., Raymond B; Miller, John Michael

    2004-03-23

    A method is provided for optimizing a rotating induction machine system fuzzy logic controller. The fuzzy logic controller has at least one input and at least one output. Each input accepts a machine system operating parameter. Each output produces at least one machine system control parameter. The fuzzy logic controller generates each output based on at least one input and on fuzzy logic decision parameters. Optimization begins by obtaining a set of data relating each control parameter to at least one operating parameter for each machine operating region. A model is constructed for each machine operating region based on the machine operating region data obtained. The fuzzy logic controller is simulated with at least one created model in a feedback loop from a fuzzy logic output to a fuzzy logic input. Fuzzy logic decision parameters are optimized based on the simulation.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadowski, Greg

    In one form, a logic circuit includes an asynchronous logic circuit, a synchronous logic circuit, and an interface circuit coupled between the asynchronous logic circuit and the synchronous logic circuit. The asynchronous logic circuit has a plurality of asynchronous outputs for providing a corresponding plurality of asynchronous signals. The synchronous logic circuit has a plurality of synchronous inputs corresponding to the plurality of asynchronous outputs, a stretch input for receiving a stretch signal, and a clock output for providing a clock signal. The synchronous logic circuit provides the clock signal as a periodic signal but prolongs a predetermined state of the clock signal while the stretch signal is active. The asynchronous interface detects whether metastability could occur when latching any of the plurality of the asynchronous outputs of the asynchronous logic circuit using said clock signal, and activates the stretch signal while the metastability could occur.

  13. Exploring the institutional logics of health professions education scholarship units.

    PubMed

    Varpio, Lara; O'Brien, Bridget; Hu, Wendy; Ten Cate, Olle; Durning, Steven J; van der Vleuten, Cees; Gruppen, Larry; Irby, David; Humphrey-Murto, Susan; Hamstra, Stanley J

    2017-07-01

    Although health professions education scholarship units (HPESUs) share a commitment to the production and dissemination of rigorous educational practices and research, they are situated in many different contexts and have a wide range of structures and functions. In this study, the authors explore the institutional logics common across HPESUs, and how these logics influence the organisation and activities of HPESUs. The authors analysed interviews with HPESU leaders in Canada (n = 12), Australia (n = 21), New Zealand (n = 3) and the USA (n = 11). Using an iterative process, they engaged in inductive and deductive analyses to identify institutional logics across all participating HPESUs. They explored the contextual factors that influence how these institutional logics impact each HPESU's structure and function. Participants identified three institutional logics influencing the organisational structure and functions of an HPESU: (i) the logic of financial accountability; (ii) the logic of a cohesive education continuum, and (iii) the logic of academic research, service and teaching. Although most HPESUs embodied all three logics, the power of the logics varied among units. The relative power of each logic influenced leaders' decisions about how members of the unit allocate their time, and what kinds of scholarly contribution and product are valued by the HPESU. Identifying the configuration of these three logics within and across HPESUs provides insights into the reasons why individual units are structured and function in particular ways. Having a common language in which to discuss these logics can enhance transparency, facilitate evaluation, and help leaders select appropriate indicators of HPESU success. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  14. The Temporal Logic of the Tower Chief System

    NASA Technical Reports Server (NTRS)

    Hazelton, Lyman R., Jr.

    1990-01-01

    The purpose is to describe the logic used in the reasoning scheme employed in the Tower Chief system, a runway configuration management system. First, a review of classical logic is given. Defeasible logics, truth maintenance, default logic, temporally dependent propositions, and resource allocation and planning are then discussed.

  15. Digital design using selection operations

    NASA Technical Reports Server (NTRS)

    Miles, Lowell H. (Inventor); Whitaker, Sterling R. (Inventor); Cameron, Eric G. (Inventor)

    2004-01-01

    A digital integrated circuit chip is designed by identifying a logical structure to be implemented. This logical structure is represented in terms of logical operations, at least 5% of which are selection operations. A determination is then made of the logic cells that correspond to an implementation of these logical operations.
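    A selection operation is essentially a multiplexer; the following Python sketch (an illustration of the general idea, not of the patented design flow) expresses a logical structure, XOR, entirely in terms of selection operations:

        def mux(select, when_true, when_false):
            """A selection operation: a 2:1 multiplexer."""
            return when_true if select else when_false

        # XOR expressed entirely with selection operations on its inputs.
        def xor(a, b):
            return mux(a, mux(b, False, True), mux(b, True, False))

        print([int(xor(a, b)) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]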

  16. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects, where the operations or methods of the objects correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the logical models of object-oriented real-time systems analysis through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
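    A minimal sketch of such a systems-analysis object in Python (hypothetical entity, states, and rules; the report itself is language-neutral) shows how an entity's states and state-transition rules can be carried unchanged from analysis into design:

        class AnalysisObject:
            """An entity defined by its states and state-transition rules."""
            def __init__(self, initial, transitions):
                self.state = initial
                self.transitions = transitions      # {(state, event): next_state}

            def handle(self, event):
                self.state = self.transitions.get((self.state, event), self.state)
                return self.state

        # A hypothetical valve object used during analysis and kept through design.
        valve = AnalysisObject("closed", {
            ("closed", "open_cmd"): "opening",
            ("opening", "limit_hit"): "open",
            ("open", "close_cmd"): "closed",
        })
        print(valve.handle("open_cmd"), valve.handle("limit_hit"))  # opening open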

  17. How artificial intelligence tools can be used to assess individual patient risk in cardiovascular disease: problems with the current methods.

    PubMed

    Grossi, Enzo

    2006-05-03

    In recent years a number of algorithms for cardiovascular risk assessment have been proposed to the medical community. These algorithms consider a number of variables and express their results as the percentage risk of developing a major fatal or non-fatal cardiovascular event in the following 10 to 20 years. The author has identified three major pitfalls of these algorithms, linked to the limitations of the classical statistical approach in dealing with this kind of non-linear and complex information. The pitfalls are the inability to capture the disease complexity, the inability to capture process dynamics, and the wide confidence interval of individual risk assessment. Artificial intelligence tools can provide a potential advantage in trying to overcome these limitations. The theoretical background and some application examples related to artificial neural networks and fuzzy logic are reviewed and discussed. The use of predictive algorithms to assess the individual absolute risk of future cardiovascular events is currently hampered by methodological and mathematical flaws. Newer approaches linked to artificial intelligence, such as fuzzy logic and artificial neural networks, seem better able to address both the challenge of the increasing complexity resulting from correlations between predisposing factors, data on the occurrence of cardiovascular events, and the prediction of future events at an individual level.

  18. Logic gate scanner focus control in high-volume manufacturing using scatterometry

    NASA Astrophysics Data System (ADS)

    Dare, Richard J.; Swain, Bryan; Laughery, Michael

    2004-05-01

    Tool matching and optimal process control are critical requirements for success in semiconductor manufacturing. It is imperative that a tool's operating conditions are understood and controlled in order to create a process that is repeatable and produces devices within specifications. Likewise, it is important where possible to match multiple systems using some methodology, so that regardless of which tool is used the process remains in control. Agere Systems is currently using Timbre Technologies' Optical Digital Profilometry (ODP) scatterometry for controlling Nikon scanner focus at the most critical lithography layer: the logic gate. By adjusting focus settings and verifying the resultant changes in resist profile shape using ODP, it becomes possible to actively control scanner focus to achieve a desired resist profile. Since many critical lithography processes are designed to produce slightly re-entrant resist profiles, this type of focus control is not possible via Critical Dimension Scanning Electron Microscopy (CDSEM), where re-entrant profiles cannot be accurately determined. Additionally, the high throughput and non-destructive nature of this measurement technique saves both cycle time and wafer costs compared to cross-section SEM. By implementing an ODP process check daily and after any maintenance on a scanner, Agere successfully enabled focus drift control, i.e. making the necessary focus or equipment changes in order to maintain a desired resist profile.

  19. A compact physical model for the simulation of pNML-based architectures

    NASA Astrophysics Data System (ADS)

    Turvani, G.; Riente, F.; Plozner, E.; Schmitt-Landsiedel, D.; Breitkreutz-v. Gamm, S.

    2017-05-01

    Among emerging technologies, perpendicular Nanomagnetic Logic (pNML) seems very promising because of its capability of combining logic and memory in the same device, its scalability, 3D-integration and low power consumption. Recently, Full Adder (FA) structures clocked by a global magnetic field have been experimentally demonstrated, and detailed characterizations of the switching process governing the domain wall (DW) nucleation probability Pnuc and nucleation time tnuc have been performed. However, the design of pNML architectures represents a crucial point in the study of this technology, as it can have a remarkable impact on the reliability of pNML structures. Here, we present a compact model developed in VHDL which enables the simulation of complex pNML architectures while taking critical physical parameters into account. These parameters have been extracted from the experiments, fitted by the corresponding physical equations and encapsulated into the proposed model. Within it, magnetic structures are decomposed into a few basic elements (nucleation centers, nanowires, inverters, etc.), each represented by the corresponding physical description. To validate the model, we redesigned a FA and compared our simulation results to the experiment. With this compact model of pNML devices we have envisioned a new methodology which makes it possible to simulate and test the physical behavior of complex architectures at very low computational cost.

  20. Fuzzy Traffic Control with Vehicle-to-Everything Communication.

    PubMed

    Salman, Muntaser A; Ozdemir, Suat; Celebi, Fatih V

    2018-01-27

    Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles (the penetration rate, PR) is still low, so V2X-based TSC systems need to be supported by other mechanisms. PR is the major factor that affects the quality of the TSC process, along with the evaluation interval. The quality of the TSC in each direction is a function of the overall TSC quality of the intersection; hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Programme FP7 supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been carried out and two new methodologies using simple and fuzzy logic have been proposed. To evaluate the performance of our proposed methods, a comparison with COLOMBO's approach has been realized. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. The performance of the proposed approaches is good enough for them to be suggested for future work under realistic scenarios, even at low PR.

  1. Fuzzy Traffic Control with Vehicle-to-Everything Communication

    PubMed Central

    Ozdemir, Suat; Celebi, Fatih V.

    2018-01-01

    Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles (the penetration rate, PR) is still low, so V2X-based TSC systems need to be supported by other mechanisms. PR is the major factor that affects the quality of the TSC process, along with the evaluation interval. The quality of the TSC in each direction is a function of the overall TSC quality of the intersection; hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Programme FP7 supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been carried out and two new methodologies using simple and fuzzy logic have been proposed. To evaluate the performance of our proposed methods, a comparison with COLOMBO's approach has been realized. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. The performance of the proposed approaches is good enough for them to be suggested for future work under realistic scenarios, even at low PR. PMID:29382053

  2. Integrating GIS with AHP and Fuzzy Logic to generate hand, foot and mouth disease hazard zonation (HFMD-HZ) model in Thailand

    NASA Astrophysics Data System (ADS)

    Samphutthanon, R.; Tripathi, N. K.; Ninsawat, S.; Duboz, R.

    2014-12-01

    The main objective of this research was the development of an HFMD hazard zonation (HFMD-HZ) model by applying AHP and Fuzzy Logic AHP (FAHP) methodologies for weighting each spatial factor, such as disease incidence, socio-economic and physical factors. The outputs of AHP and FAHP were input into a Geographic Information System (GIS) process for spatial analysis. Fourteen criteria were selected as important factors: disease incidence over the 10 years from 2003 to 2012, population density, road density, land use and physical features. The results showed a consistency ratio (CR) for these main criteria of 0.075427 for AHP; the CR for the FAHP results was 0.092436. As both remained below the threshold of 0.1, the CR values were acceptable. After linking to actual geospatial data (disease incidence 2013) through GIS spatial analysis for validation, the results of the FAHP approach were found to match more accurately than those of the AHP approach. The zones with the highest hazard of HFMD outbreaks were located in two main areas: central Muang Chiang Mai district and its suburbs, and Muang Chiang Rai district and its vicinity. The resulting hazard maps may be useful for organizing HFMD protection plans.
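    For readers unfamiliar with the consistency check, the sketch below computes the AHP consistency ratio CR = CI/RI, with CI = (lambda_max - n)/(n - 1), for a hypothetical 3 x 3 pairwise-comparison matrix (not the study's 14-criteria matrix; random-index values vary slightly across published tables):

        import numpy as np

        A = np.array([[1,   3,   5],          # hypothetical pairwise comparisons
                      [1/3, 1,   2],
                      [1/5, 1/2, 1]])
        n = A.shape[0]

        eigenvalues, _ = np.linalg.eig(A)
        lambda_max = eigenvalues.real.max()

        CI = (lambda_max - n) / (n - 1)       # consistency index
        RI = {3: 0.58, 4: 0.90, 5: 1.12, 14: 1.57}[n]   # Saaty's random index
        CR = CI / RI
        print(f"lambda_max={lambda_max:.4f}  CI={CI:.4f}  CR={CR:.4f}")
        # CR below 0.1 indicates an acceptably consistent comparison matrix.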

  3. [Chapter 7. Big Data or the illusion of a synthesis by aggregation. Epistemological, ethical and political critics].

    PubMed

    Coutellec, Léo; Weil-Dubuc, Paul-Loup

    2017-10-27

    In this article, we propose a critical approach to the big data phenomenon by deconstructing the methodological principle that structures its logic: the principle of aggregation. Our hypothesis sits upstream of the critiques that treat the use of big data as a new mode of government. Aggregation, as a mode of processing the heterogeneity of data, structures big data thinking; it is its very logic. Fragmenting in order to better aggregate, aggregating in order to better fragment: a dialectic based on a presumption of generalized aggregability and on the claim to make aggregation the preferred route for the production of new syntheses. We proceed in three steps to deconstruct this idea and undo the claim of aggregation to assert itself as a new way to produce knowledge, as a new synthesis of identity, and finally as a new model of solidarity. Each time we show that these attempts at aggregation fail to produce their objects: no knowledge, no identity, no solidarity can result from a process of amalgamation. In all three cases, aggregation is always accompanied by a moment of fragmentation, of which dissociation, dislocation and separation are different figures. The bet we are making, then, is to unsettle what presents itself as a new way of thinking man and the world.

  4. Sensor Fusion Based Model for Collision Free Mobile Robot Navigation

    PubMed Central

    Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar

    2015-01-01

    Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each is equipped with various types of sensors, such as GPS, cameras, and infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail or give inaccurate readings. Therefore, the integration of sensor fusion helps to solve this dilemma and enhances the overall performance. This paper presents collision-free mobile robot navigation based on a fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs, which are the eight distance sensors and the camera; two outputs, which are the left and right velocities of the mobile robot’s wheels; and 24 fuzzy rules for the robot’s movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on the fuzzy logic fusion model and the line following robot, has been implemented and tested through simulation and real-time experiments. Various scenarios have been presented with static and dynamic obstacles, using one robot and two robots, while avoiding obstacles of different shapes and sizes. PMID:26712766

  5. Cochrane Qualitative and Implementation Methods Group guidance series-paper 4: methods for assessing evidence on intervention implementation.

    PubMed

    Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane

    2018-05-01

    This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Sensor Fusion Based Model for Collision Free Mobile Robot Navigation.

    PubMed

    Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar

    2015-12-26

    Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each is equipped with various types of sensors, such as GPS, cameras, and infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail or give inaccurate readings. Therefore, the integration of sensor fusion helps to solve this dilemma and enhances the overall performance. This paper presents collision-free mobile robot navigation based on a fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs, which are the eight distance sensors and the camera; two outputs, which are the left and right velocities of the mobile robot's wheels; and 24 fuzzy rules for the robot's movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on the fuzzy logic fusion model and the line following robot, has been implemented and tested through simulation and real-time experiments. Various scenarios have been presented with static and dynamic obstacles, using one robot and two robots, while avoiding obstacles of different shapes and sizes.

  7. Cascading of molecular logic gates for advanced functions: a self-reporting, activatable photosensitizer.

    PubMed

    Erbas-Cakmak, Sundus; Akkaya, Engin U

    2013-10-18

    Logical progress: Independent molecular logic gates have been designed and characterized. Then, the individual molecular logic gates were coerced to work together within a micelle. Information relay between the two logic gates was achieved through the intermediacy of singlet oxygen. Working together, these concatenated logic gates result in a self-reporting and activatable photosensitizer. GSH=glutathione. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Construction of a fuzzy and Boolean logic gates based on DNA.

    PubMed

    Zadegan, Reza M; Jepsen, Mette D E; Hildebrandt, Lasse L; Birkedal, Victoria; Kjems, Jørgen

    2015-04-17

    Logic gates are devices that can perform logical operations by transforming a set of inputs into a predictable single detectable output. The hybridization properties, structure, and function of nucleic acids can be used to make DNA-based logic gates. These devices are important modules in molecular computing and biosensing. The ideal logic gate system should provide a wide selection of logical operations, and be integrable in multiple copies into more complex structures. Here we show the successful construction of a small DNA-based logic gate complex that produces fluorescent outputs corresponding to the operation of the six Boolean logic gates AND, NAND, OR, NOR, XOR, and XNOR. The logic gate complex is shown to work also when implemented in a three-dimensional DNA origami box structure, where it controlled the position of the lid in a closed or open position. Implementation of multiple microRNA sensitive DNA locks on one DNA origami box structure enabled fuzzy logical operation that allows biosensing of complex molecular signals. Integrating logic gates with DNA origami systems opens a vast avenue to applications in the fields of nanomedicine for diagnostics and therapeutics. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
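    As a point of reference for the six operations listed, the sketch below prints the standard two-input truth tables (the Boolean definitions only; the DNA chemistry is not modeled):

        # Truth tables for the six gates implemented in the DNA complex.
        gates = {
            "AND":  lambda a, b: a and b,
            "NAND": lambda a, b: not (a and b),
            "OR":   lambda a, b: a or b,
            "NOR":  lambda a, b: not (a or b),
            "XOR":  lambda a, b: a != b,
            "XNOR": lambda a, b: a == b,
        }
        for name, fn in gates.items():
            row = [int(fn(a, b)) for a in (0, 1) for b in (0, 1)]
            print(f"{name:5s} {row}")   # outputs for inputs 00, 01, 10, 11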

  9. Cooperation Among Theorem Provers

    NASA Technical Reports Server (NTRS)

    Waldinger, Richard J.

    1998-01-01

    This is a final report, which supports NASA's PECSEE (Persistent Cognizant Software Engineering Environment) effort and complements the Kestrel Institute project "Inference System Integration via Logic Morphism". The ultimate purpose of the project is to develop a superior logical inference mechanism by combining the diverse abilities of multiple cooperating theorem provers. Over many years of research, a number of powerful theorem-proving systems have arisen with differing capabilities and strengths. Resolution theorem provers (such as Kestrel's KITP or SRI's SNARK) deal with first-order logic with equality but not the principle of mathematical induction. The Boyer-Moore theorem prover excels at proof by induction but cannot deal with full first-order logic. Both are highly automated but cannot accept user guidance easily. The PVS system (from SRI) is only automatic within decidable theories, but it has well-designed interactive capabilities; furthermore, it includes higher-order logic, not just first-order logic. The NuPRL system from Cornell University and the STeP system from Stanford University have facilities for constructive logic and temporal logic, respectively; both are interactive. It is often suggested, for example in the anonymous "QED Manifesto", that we should pool the resources of all these theorem provers into a single system, so that the strengths of one can compensate for the weaknesses of others, and so that effort will not be duplicated. However, there is no straightforward way of doing this, because each system relies on its own language and logic for its success. Thus, SNARK uses ordinary first-order logic with equality, PVS uses higher-order logic, and NuPRL uses constructive logic. The purpose of this project, and of the companion project at Kestrel, has been to use the category-theoretic notion of logic morphism to combine systems with different logics and languages. Kestrel's SPECWARE system has been the vehicle for the implementation.

  10. EAGLE can do Efficient LTL Monitoring

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We briefly present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. In this paper we show how EAGLE can do linear temporal logic (LTL) monitoring in an efficient way. We give an upper bound on the space and time complexity of this monitoring.

  11. Nonmonotonic Logic for Use in Information Retrieval: An Exploratory Paper.

    ERIC Educational Resources Information Center

    Hurt, C. D.

    1998-01-01

    Monotonic logic requires reexamination of the entire logic string when there is a contradiction. Nonmonotonic logic allows the user to withdraw conclusions in the face of contradiction without harm to the logic string, which has considerable application to the field of information searching. Artificial intelligence models and neural networks based…

  12. Fuzzy Logic Engine

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna

    2005-01-01

    The Fuzzy Logic Engine is a software package that enables users to embed fuzzy-logic modules into their application programs. Fuzzy logic is useful as a means of formulating human expert knowledge and translating it into software to solve problems. Fuzzy logic provides flexibility for modeling relationships between input and output information and is distinguished by its robustness with respect to noise and variations in system parameters. In addition, linguistic fuzzy sets and conditional statements allow systems to make decisions based on imprecise and incomplete information. The user of the Fuzzy Logic Engine need not be an expert in fuzzy logic: it suffices to have a basic understanding of how linguistic rules can be applied to the user's problem. The Fuzzy Logic Engine is divided into two modules: (1) a graphical-interface software tool for creating linguistic fuzzy sets and conditional statements and (2) a fuzzy-logic software library for embedding fuzzy processing capability into current application programs. The graphical- interface tool was developed using the Tcl/Tk programming language. The fuzzy-logic software library was written in the C programming language.
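    The flavor of the two modules can be suggested with a minimal fuzzy-inference sketch in Python (hypothetical membership functions and a single conditional statement; the actual Engine is a Tcl/Tk tool plus a C library):

        def triangular(x, left, peak, right):
            """Triangular membership function, a common building block of fuzzy sets."""
            if x <= left or x >= right:
                return 0.0
            if x <= peak:
                return (x - left) / (peak - left)
            return (right - x) / (right - peak)

        # Linguistic sets for a noisy temperature reading (hypothetical ranges).
        cold = triangular(18.0, 0, 10, 20)
        warm = triangular(18.0, 15, 22, 30)

        # One conditional statement: IF warm AND NOT cold THEN fan_speed = high.
        rule_strength = min(warm, 1.0 - cold)
        fan_speed = rule_strength * 100.0          # defuzzify by simple scaling
        print(f"cold={cold:.2f} warm={warm:.2f} fan={fan_speed:.0f}%")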

  13. Formal Logic and Flowchart for Diagnosis Validity Verification and Inclusion in Clinical Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Sosa, M.; Grundel, L.; Simini, F.

    2016-04-01

    Logical reasoning has been part of medical practice since its origins. Modern medicine has added information-intensive tools to refine diagnostics and treatment protocols. We are introducing formal logic teaching in Medical School prior to the Clinical Internship, to foster sound medical practice. Two simple examples (Acute Myocardial Infarction and Diabetes Mellitus) are given in terms of formal logic expressions and truth tables. Flowcharts of both diagnostic processes help in understanding the procedures and in validating them logically. The particularity of medical information is that it is often accompanied by “missing data”, which suggests adapting formal logic to a “three-state” logic in the future. Medical education must include formal logic in order to understand complex protocols and best practices, which are prone to mutual interactions.
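    One natural reading of such a three-state logic is Kleene's strong three-valued logic; the Python sketch below (an illustrative assumption, since the paper does not fix a particular formalism) uses None for "missing data":

        # Kleene strong three-valued logic: True, False, or None ("missing data").
        def and3(a, b):
            if a is False or b is False:
                return False            # a definite False decides the conjunction
            if a is None or b is None:
                return None             # otherwise missing data stays missing
            return True

        def or3(a, b):
            if a is True or b is True:
                return True
            if a is None or b is None:
                return None
            return False

        # Chest pain confirmed, troponin result missing: still undetermined.
        print(and3(True, None), or3(False, None))   # None None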

  14. The development of a healing model of care for an Indigenous drug and alcohol residential rehabilitation service: a community-based participatory research approach.

    PubMed

    Munro, Alice; Shakeshaft, Anthony; Clifford, Anton

    2017-12-04

    Given the well-established evidence of disproportionately high rates of substance-related morbidity and mortality after release from incarceration for Indigenous Australians, access to comprehensive, effective and culturally safe residential rehabilitation treatment will likely assist in reducing recidivism to both prison and substance dependence for this population. In the absence of methodologically rigorous evidence, the delivery of Indigenous drug and alcohol residential rehabilitation services varies widely, and divergent views exist regarding the appropriateness and efficacy of different potential treatment components. One way to increase the methodological quality of evaluations of Indigenous residential rehabilitation services is to develop partnerships with researchers to better align models of care with the client's, and the community's, needs. An emerging research paradigm that guides the development of high-quality evidence through a number of sequential steps, and that equitably involves services, stakeholders and researchers, is community-based participatory research (CBPR). The purpose of this study is to articulate an Indigenous drug and alcohol residential rehabilitation service model of care, developed in collaboration between clients, service providers and researchers using a CBPR approach. This research adopted a mixed-methods CBPR approach to triangulate the collected data to inform the development of a model of care for a remote Indigenous drug and alcohol residential rehabilitation service. Four iterative CBPR steps of research activity were recorded during the 3-year research partnership. As a direct outcome of the CBPR framework, the service and researchers co-designed a Healing Model of Care that comprises six core treatment components and three core organisational components, and is articulated in two program logics. The program logics were designed to specifically align each component and outcome with the mechanism of change for the client or organisation, to improve data collection and program evaluation. The description of the CBPR process and the Healing Model of Care provides one possible solution for how to provide better care for the large and growing population of Indigenous people with substance dependence.

  15. Accelerating a MPEG-4 video decoder through custom software/hardware co-design

    NASA Astrophysics Data System (ADS)

    Díaz, Jorge L.; Barreto, Dacil; García, Luz; Marrero, Gustavo; Carballo, Pedro P.; Núñez, Antonio

    2007-05-01

In this paper we present a novel methodology to accelerate an MPEG-4 video decoder using software/hardware co-design for wireless DAB/DMB networks. Software support includes the services provided by the embedded kernel μC/OS-II and the application tasks mapped to software. Hardware support includes several custom co-processors and a communication architecture with bridges to the main system bus and with a dual-port SRAM. Synchronization among tasks is achieved at two levels, by a hardware protocol and by kernel-level scheduling services. Our reference application is an MPEG-4 video decoder composed of several software functions and written using a special C++ library named CASSE. Profiling and design-space exploration techniques were applied previously to the Advanced Simple Profile (ASP) MPEG-4 decoder to determine the best HW/SW partition, which is developed here. This research is part of the ARTEMI project, whose main goals are the establishment of methodologies for the design of real-time complex digital systems using Programmable Logic Devices with embedded microprocessors as the target technology, and the design of multimedia systems for broadcasting networks as the reference application.

  16. Deriving Laws from Ordering Relations

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2004-01-01

The effect of Richard T. Cox's contribution to probability theory was to generalize Boolean implication among logical statements to degrees of implication, which are manipulated using rules derived from consistency with Boolean algebra. These rules are known as the sum rule, the product rule and Bayes' Theorem, and the measure resulting from this generalization is probability. In this paper, I will describe how Cox's technique can be further generalized to include other algebras and hence other problems in science and mathematics. The result is a methodology that can be used to generalize an algebra to a calculus by relying on consistency with order theory to derive the laws of the calculus. My goals are to clear up the mysteries as to why the same basic structure found in probability theory appears in other contexts, to better understand the foundations of probability theory, and to extend these ideas to other areas by developing new mathematics and new physics. The relevance of this methodology will be demonstrated using examples from probability theory, number theory, geometry, information theory, and quantum mechanics.
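
    For reference, the rules named in the abstract take the following standard forms when degrees of implication are conditioned on a context t; these are the textbook statements, reproduced here only as a reminder of what Cox's consistency argument yields.

```latex
\begin{align}
  p(x \lor y \mid t)  &= p(x \mid t) + p(y \mid t) - p(x \land y \mid t)
      && \text{(sum rule)} \\
  p(x \land y \mid t) &= p(x \mid t)\, p(y \mid x \land t)
      && \text{(product rule)} \\
  p(y \mid x \land t) &= \frac{p(x \mid y \land t)\, p(y \mid t)}{p(x \mid t)}
      && \text{(Bayes' theorem)}
\end{align}
```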

  17. Methodological Issues in Questionnaire Design.

    PubMed

    Song, Youngshin; Son, Youn Jung; Oh, Doonam

    2015-06-01

The process of designing a questionnaire is complicated. Many questionnaires on nursing phenomena have been developed and used by nursing researchers. The purpose of this paper was to discuss questionnaire design and factors that should be considered when using existing scales. Methodological issues were discussed, such as factors in the design of questions, steps in developing questionnaires, wording and formatting methods for items, and administration methods. How to use existing scales, how to facilitate cultural adaptation, and how to prevent socially desirable responding were discussed. Moreover, the triangulation method in questionnaire development was introduced. Steps were recommended for designing questions such as appropriately operationalizing key concepts for the target population, clearly formatting response options, generating items and confirming final items through face or content validity, sufficiently piloting the questionnaire using item analysis, demonstrating reliability and validity, finalizing the scale, and training the administrator. Psychometric properties and cultural equivalence should be evaluated prior to administration when using an existing questionnaire and performing cultural adaptation. In the context of well-defined nursing phenomena, logical and systematic methods will contribute to the development of simple and precise questionnaires.

  18. Multi-Disciplinary Knowledge Synthesis for Human Health Assessment on Earth and in Space

    NASA Astrophysics Data System (ADS)

    Christakos, G.

    We discuss methodological developments in multi-disciplinary knowledge synthesis (KS) of human health assessment. A theoretical KS framework can provide the rational means for the assimilation of various information bases (general, site-specific etc.) that are relevant to the life system of interest. KS-based techniques produce a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, and generate informative health state predictions across space-time. The underlying epistemic cognition methodology is based on teleologic criteria and stochastic logic principles. The mathematics of KS involves a powerful and versatile spatiotemporal random field model that accounts rigorously for the uncertainty features of the life system and imposes no restriction on the shape of the probability distributions or the form of the predictors. KS theory is instrumental in understanding natural heterogeneities, assessing crucial human exposure correlations and laws of physical change, and explaining toxicokinetic mechanisms and dependencies in a spatiotemporal life system domain. It is hoped that a better understanding of KS fundamentals would generate multi-disciplinary models that are useful for the maintenance of human health on Earth and in Space.

  19. The Histochemistry and Cell Biology omnium-gatherum: the year 2015 in review.

    PubMed

    Taatjes, Douglas J; Roth, Jürgen

    2016-03-01

    We provide here our annual review/synopsis of all of the articles published in Histochemistry and Cell Biology (HCB) for the preceding year. In 2015, HCB published 102 articles, representing a wide variety of topics and methodologies. For ease of access to these differing topics, we have created categories, as determined by the types of articles presented to provide a quick index representing the general areas covered. This year, these categories include: (1) advances in methodologies; (2) molecules in health and disease; (3) organelles, subcellular structures, and compartments; (4) the nucleus; (5) stem cells and tissue engineering; (6) cell cultures: properties and capabilities; (7) connective tissues and extracellular matrix; (8) developmental biology; (9) nervous system; (10) musculoskeletal system; (11) respiratory and cardiovascular system; (12) liver and gastrointestinal tract; and (13) male and female reproductive systems. Of note, the categories proceed from methods development, to molecules, intracellular compartments, stem cells and cell culture, extracellular matrix, developmental biology, and finishing with various organ systems, hopefully presenting a logical journey from methods to organismal molecules, cells, and whole tissue systems.

  20. Model-Based Control of a Nonlinear Aircraft Engine Simulation using an Optimal Tuner Kalman Filter Approach

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Csank, Jeffrey Thomas; Chicatelli, Amy; Kilver, Jacob

    2013-01-01

This paper covers the development of a model-based engine control (MBEC) methodology featuring a self-tuning on-board model applied to an aircraft turbofan engine simulation. Here, the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) serves as the MBEC application engine. CMAPSS40k is capable of modeling realistic engine performance, allowing for a verification of the MBEC over a wide range of operating points. The on-board model is a piecewise-linear model derived from CMAPSS40k and updated using an optimal tuner Kalman Filter (OTKF) estimation routine, which enables the on-board model to self-tune to account for engine performance variations. The focus here is on developing a methodology for MBEC with direct control of estimated parameters of interest such as thrust and stall margins. Investigations using the MBEC to provide a stall margin limit for the controller protection logic are presented that could provide benefits over a simple acceleration schedule that is currently used in traditional engine control architectures.
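
    As a rough illustration of the estimation step such a self-tuning on-board model relies on, the sketch below implements a generic linear Kalman filter measurement update. The matrices, dimensions, and the "tuner" interpretation are illustrative assumptions, not the paper's OTKF design.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One measurement update: state estimate x, covariance P,
    measurement z, observation matrix H, measurement noise R."""
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ (z - H @ x)           # corrected state (e.g., health tuners)
    P = (np.eye(len(x)) - K @ H) @ P  # corrected covariance
    return x, P

# Toy usage: two hypothetical tuner parameters observed through three sensors.
x = np.zeros(2)
P = np.eye(2)
H = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
R = 0.01 * np.eye(3)
z = np.array([0.02, -0.01, 0.004])
x, P = kalman_update(x, P, z, H, R)
print(x)
```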

  1. CO2 storage resources, reserves, and reserve growth: Toward a methodology for integrated assessment of the storage capacity of oil and gas reservoirs and saline formations

    USGS Publications Warehouse

    Burruss, Robert

    2009-01-01

Geologically based methodologies to assess the possible volumes of subsurface CO2 storage must apply clear and uniform definitions of resource and reserve concepts to each assessment unit (AU). Application of the current state of knowledge of geologic, hydrologic, geochemical, and geophysical parameters (contingencies) that control storage volume and injectivity allows definition of the contingent resource (CR) of storage. The parameters known with the greatest certainty are based on observations on known traps (KTs) within the AU that produced oil, gas, and water. The aggregate volume of KTs within an AU defines the most conservative volume of contingent resource. Application of the concept of reserve growth to CR volume provides a logical path for subsequent reevaluation of the total resource as knowledge of CO2 storage processes increases during implementation of storage projects. Increased knowledge of storage performance over time will probably allow the volume of the contingent resource of storage to grow over time, although negative growth is possible.

  2. CO2 storage resources, reserves, and reserve growth: Toward a methodology for integrated assessment of the storage capacity of oil and gas reservoirs and saline formations

    USGS Publications Warehouse

    Burruss, R.C.

    2009-01-01

Geologically based methodologies to assess the possible volumes of subsurface CO2 storage must apply clear and uniform definitions of resource and reserve concepts to each assessment unit (AU). Application of the current state of knowledge of geologic, hydrologic, geochemical, and geophysical parameters (contingencies) that control storage volume and injectivity allows definition of the contingent resource (CR) of storage. The parameters known with the greatest certainty are based on observations on known traps (KTs) within the AU that produced oil, gas, and water. The aggregate volume of KTs within an AU defines the most conservative volume of contingent resource. Application of the concept of reserve growth to CR volume provides a logical path for subsequent reevaluation of the total resource as knowledge of CO2 storage processes increases during implementation of storage projects. Increased knowledge of storage performance over time will probably allow the volume of the contingent resource of storage to grow over time, although negative growth is possible. © 2009 Elsevier Ltd. All rights reserved.

  3. Silicon compilation: From the circuit to the system

    NASA Astrophysics Data System (ADS)

    Obrien, Keven

The methodology used for the compilation of silicon from a behavioral level to a system level is presented. The aim was to link the heretofore unrelated areas of high-level synthesis and system-level design. This link will play an important role in the development of future design automation tools, as it will allow hardware/software co-designs to be synthesized. A design methodology is presented that allows, through the use of an intermediate representation, SOLAR, a System level Design Language (SDL) to be combined with a Hardware Description Language (VHDL). Two main steps are required in order to transform this specification into a synthesizable one. Firstly, a system-level synthesis step including partitioning and communication synthesis is required in order to split the model into a set of interconnected subsystems, each of which will be processed by a high-level synthesis tool. For this latter step, AMICAL is used; it provides powerful scheduling techniques that accept very abstract descriptions of control-flow-dominated circuits as input and generates interconnected RTL blocks that may feed existing logic-level synthesis tools.

  4. On the validity of language: speaking, knowing and understanding in medical geography.

    PubMed

    Scarpaci, J L

    1993-09-01

This essay examines methodological problems concerning the conceptualization and operationalization of phenomena central to medical geography. Its main argument is that qualitative research can be strengthened if the differences between instrumental and apparent validity are better understood than the current research in medical geography suggests. Its premise is that our definitions of key terms and concepts must be reinforced throughout the design of research if our knowledge and understanding are to be enhanced. In doing so, the paper aims to move the methodological debate beyond the simple dichotomies of quantitative vs qualitative approaches and logical positivism vs phenomenology. Instead, the argument is couched in a postmodernist hermeneutic sense which questions the validity of one discourse of investigation over another. The paper begins by discussing methods used in conceptualizing and operationalizing variables in quantitative and qualitative research design. Examples derive from concepts central to a geography of health-care behavior and well-being. The latter half of the essay shows the uses and misuses of validity studies in selected health services research and the current debate on national health insurance.

  5. [Ethnographic approaches to research and intervention in mental health].

    PubMed

    Nunes, Mônica de Oliveira; de Torrenté, Maurice

    2013-10-01

The specifics of ethnographic approaches to mental health research are examined, highlighting the reasons why the type of knowledge produced by ethnography is relevant to the context of Psychiatric Reform and the biomedicalization of existence. The discussion is focused on interpretation-based ethnography in the field of mental health, stressing the theoretical and methodological foundations of a comprehensive form of apprehending the scope of mental health as an object akin to a clinic of the individual. The centrality of social and cultural aspects in the ethnographic approach, and the inflexions mediated by the type of ethnographic methodological undertaking, are stressed. Lastly, the ethnography of madness is seen as a fitting example that substantiates some of these characteristics. The contention is that accessing psychotic persons (and others who may speak about these experiences) from varied areas of their daily life, situated in their various social inscriptions, while confronting these interpretations with other interpretative dimensions of their social reality and within the logic linked to local psychologies, is a pertinent procedure from which certain aspects of an understanding of madness (or causes of its incomprehension) can emerge.

  6. Clinical practice guideline development manual: a quality-driven approach for translating evidence into action.

    PubMed

    Rosenfeld, Richard M; Shiffman, Richard N

    2009-06-01

Guidelines translate best evidence into best practice. A well-crafted guideline promotes quality by reducing health-care variations, improving diagnostic accuracy, promoting effective therapy, and discouraging ineffective, or potentially harmful, interventions. Despite a plethora of published guidelines, methodology is often poorly defined and varies greatly within and among organizations. This manual describes the principles and practices used successfully by the American Academy of Otolaryngology-Head and Neck Surgery to produce quality-driven, evidence-based guidelines using efficient and transparent methodology for action-ready recommendations with multidisciplinary applicability. The development process, which allows moving from conception to completion in 12 months, emphasizes a logical sequence of key action statements supported by amplifying text, evidence profiles, and recommendation grades that link action to evidence. As clinical practice guidelines become more prominent as a key metric of quality health care, organizations must develop efficient production strategies that balance rigor and pragmatism. Equally important, clinicians must become savvy in understanding what guidelines are, and are not, and how they are best utilized to improve care. The information in this manual should help clinicians and organizations achieve these goals.

  7. Reprogrammable Logic Gate and Logic Circuit Based on Multistimuli-Responsive Raspberry-like Micromotors.

    PubMed

    Zhang, Lina; Zhang, Hui; Liu, Mei; Dong, Bin

    2016-06-22

In this paper, we report a polymer-based raspberry-like micromotor. Interestingly, the resulting micromotor exhibits multistimuli-responsive motion behavior. Its on-off-on motion can be regulated by the application of stimuli such as H2O2, near-infrared light, NH3, or their combinations. Because of this versatility in motion control, the current micromotor has great potential in the fields of logic gates and logic circuits. Using different stimuli as the inputs and the micromotor motion as the output, reprogrammable OR and INHIBIT logic gates, or a logic circuit consisting of OR, NOT, and AND logic gates, can be achieved.

  8. Noise-Aided Logic in an Electronic Analog of Synthetic Genetic Networks

    PubMed Central

    Hellen, Edward H.; Dana, Syamal K.; Kurths, Jürgen; Kehler, Elizabeth; Sinha, Sudeshna

    2013-01-01

We report the experimental verification of noise-enhanced logic behaviour in an electronic analog of a synthetic genetic network, composed of two repressors and two constitutive promoters. We observe good agreement between circuit measurements and numerical prediction, with the circuit allowing for robust logic operations in an optimal window of noise. Namely, the input-output characteristics of a logic gate are reproduced faithfully under moderate noise, which is a manifestation of the phenomenon known as Logical Stochastic Resonance. The two dynamical variables in the system yield complementary logic behaviour simultaneously. The system is easily morphed from AND/NAND to OR/NOR logic. PMID:24124531
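
    Logical Stochastic Resonance is commonly illustrated with a noisy bistable element; the sketch below is that generic textbook construction simulated by Euler-Maruyama integration, not the genetic-network circuit of the paper, and the input levels, bias, and noise intensity are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def lsr_gate(i1, i2, bias=0.2, D=0.05, dt=0.01, steps=20000):
    """Noisy bistable element: dx = (x - x**3 + I1 + I2 + bias) dt + noise.
    Logic inputs are encoded as +/-0.4; the output is read from the sign of
    the late-time average of x (positive well -> logic 1)."""
    I = (0.4 if i1 else -0.4) + (0.4 if i2 else -0.4) + bias
    x, tail = 0.0, []
    sigma = np.sqrt(2 * D * dt)
    for k in range(steps):
        x += (x - x**3 + I) * dt + sigma * rng.standard_normal()
        if k > steps // 2:
            tail.append(x)
    return int(np.mean(tail) > 0)

# With a positive bias the element behaves as OR; flipping the bias sign
# morphs it into AND (and reading the inverted output gives NOR/NAND).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", lsr_gate(a, b))
```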

  9. Quantum design rules for single molecule logic gates.

    PubMed

    Renaud, N; Hliwa, M; Joachim, C

    2011-08-28

Recent publications have demonstrated how to implement a NOR logic gate with a single molecule using its interaction with two surface atoms as logical inputs [W. Soe et al., ACS Nano, 2011, 5, 1436]. We demonstrate here how this NOR logic gate belongs to the general family of quantum logic gates, in which the Boolean truth table results from full control of the quantum trajectory of the electron transfer process through the molecule by very local and classical inputs practiced on the molecule. A new molecular OR gate is proposed in which the logical inputs are also single metal atoms, one per logical input.

  10. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Reversible logic elements as a new field of application of optical solitons

    NASA Astrophysics Data System (ADS)

    Maimistov, Andrei I.

    1995-10-01

    An analysis is made of the fundamental concepts of conservative logic. It is shown that the existing optical soliton switches can be converted into logic gates which act as conservative logic elements. A logic device of this type, based on a nonlinear fibre-optic directional coupler, is considered. Polarised solitons are used in this coupler. This use of solitons leads in a natural way to the desirability of developing conservative triple-valued logic.
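
    For background, the canonical conservative logic element is the Fredkin (controlled-swap) gate: it is reversible (its own inverse) and conserves the number of 1s, the discrete analogue of the conservation properties sought in soliton devices. The sketch below is that textbook primitive, not the fibre-optic coupler described in the record.

```python
def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate: if the control c is 1, swap a and b.
    Conservative: the number of 1s in (c, a, b) is always preserved."""
    return (c, b, a) if c else (c, a, b)

# Universality: AND appears on the third wire when b is fixed to 0.
for x in (0, 1):
    for y in (0, 1):
        assert fredkin(x, y, 0)[2] == (x and y)

# Reversibility: applying the gate twice restores the original inputs.
assert fredkin(*fredkin(1, 0, 1)) == (1, 0, 1)
print("Fredkin gate checks passed")
```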

  11. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships.

    PubMed

    Chen, Shyi-Ming; Chen, Shen-Wen

    2015-03-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. Firstly, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. Then, it groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group, respectively. Finally, it performs the forecasting based on the probabilities of the down-trend, the equal-trend, and the up-trend of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
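
    A heavily simplified sketch of the trend-counting idea follows: it classifies each step of the main factor as down/equal/up, conditions on the preceding (main, secondary) trend pair, and forecasts from the empirical trend probabilities. The real method fuzzifies the series into fuzzy sets first and uses second-order fuzzy-trend groups; the data below are invented.

```python
from collections import Counter, defaultdict

def trend(prev, cur, eps=1e-9):
    d = cur - prev
    return "down" if d < -eps else ("up" if d > eps else "equal")

def fit(main, secondary):
    """Group the main factor's next trend by the preceding
    (main, secondary) trend pair and count occurrences."""
    groups = defaultdict(Counter)
    for t in range(2, len(main)):
        key = (trend(main[t-2], main[t-1]), trend(secondary[t-2], secondary[t-1]))
        groups[key][trend(main[t-1], main[t])] += 1
    return groups

def forecast(groups, main, secondary, step=1.0):
    key = (trend(main[-2], main[-1]), trend(secondary[-2], secondary[-1]))
    counts = groups.get(key, Counter({"equal": 1}))
    total = sum(counts.values())
    # expected move under the estimated down/equal/up probabilities
    return main[-1] + (counts["up"] - counts["down"]) / total * step

main = [100, 102, 101, 103, 104, 103, 105]   # invented main-factor series
sec = [50, 51, 51, 52, 53, 52, 54]           # invented secondary factor
print(forecast(fit(main, sec), main, sec))
```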

  12. DEMONSTRATION BULLETIN: THE ECO LOGIC THERMAL DESORPTION UNIT - MIDDLEGROUND LANDFILL - BAY CITY, MI - ELI ECO LOGIC INTERNATIONAL, INC.

    EPA Science Inventory

ECO Logic has developed a thermal desorption unit (TDU) for the treatment of soils contaminated with hazardous organic contaminants. This TDU has been designed to be used in conjunction with Eco Logic's patented gas-phase chemical reduction reactor. The Eco Logic reactor is the s...

  13. "Glitch Logic" and Applications to Computing and Information Security

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Katkoori, Srinivas

    2009-01-01

This paper introduces a new method of information processing in digital systems and discusses its potential benefits to computing and information security. The new method exploits glitches caused by delays in logic circuits to carry and process information. Glitch processing is hidden from conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables the creation of new logic design methods that allow an additional controllable "glitch logic" processing layer to be embedded into conventional synchronous digital circuits as a hidden/covert information flow channel. The combination of synchronous logic with a specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification of and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of the design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.
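
    The timing phenomenon being exploited can be illustrated with a toy gate-level simulation of the classic static-1 hazard, where unequal path delays produce a transient glitch on an output that should be constant. This generic example is only an illustration of glitch formation, not the authors' design method; the circuit and delays are invented.

```python
# Circuit F = (A AND S) OR (B AND S'), the classic 2:1-mux hazard:
# with A = B = 1 the output should stay 1 while S toggles, but the slow
# inverter path lets both AND terms be 0 briefly, producing a glitch.

def simulate(s_wave, a=1, b=1, inv_delay=2, gate_delay=1):
    S = lambda t: s_wave[min(max(t, 0), len(s_wave) - 1)]
    nS = lambda t: 1 - S(t - inv_delay)       # slow inverted copy of S
    g1 = lambda t: a & S(t - gate_delay)      # A AND S
    g2 = lambda t: b & nS(t - gate_delay)     # B AND S'
    return [g1(t) | g2(t) for t in range(len(s_wave))]

# S falls at t=5; F dips to 0 for t=6..7 while the inverter catches up.
out = simulate([1] * 5 + [0] * 10)
print(out)  # [1, 1, 1, 1, 1, 1, 0, 0, 1, 1, ...]
```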

  14. Identifying a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator in a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Kristan D.; Faraj, Daniel A.

In a parallel computer, a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: identifying, by each compute node of the subcommunicator, all logical planes that include the compute node; calculating, by each compute node for each identified logical plane that includes the compute node, an area of the identified logical plane; initiating, by a root node of the subcommunicator, a gather operation; receiving, by the root node from each compute node of the subcommunicator, each node's calculated areas as contribution data to the gather operation; and identifying, by the root node in dependence upon the received calculated areas, a logical plane of the subcommunicator having the greatest area.

  15. Design of a Ferroelectric Programmable Logic Gate Array

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Ho, Fat Duen

    2003-01-01

A programmable logic gate array has been designed utilizing ferroelectric field-effect transistors (FFETs). The design has only a small number of gates, but it could be scaled up to a more useful size. Using FFETs in a logic array gives several advantages. First, it allows real-time programmability of the array for high-speed reconfiguration. It also allows the array to be configured a nearly unlimited number of times, unlike a FLASH FPGA. Finally, the Ferroelectric Programmable Logic Gate Array (FPLGA) can be implemented using a smaller number of transistors because of the inherent logic characteristics of an FFET. The device was only designed and modeled using Spice models of the circuit, including the FFET; the actual device was not produced. The design consists of a small array of NAND and NOR logic gates. Other gates could easily be produced. They are linked by FFETs that control the logic flow. Timing and logic tables have been produced showing that the array can produce a variety of logic combinations at usable real-time speeds. This device could be a prototype for devices in embedded systems that need the high speed of hardware-implemented logic together with the flexibility to change the logic algorithm. Because of the non-volatile nature of the FFET, it would also be useful in situations that require a logic array to be programmed once and used repeatedly after the power has been shut off.

  16. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    NASA Astrophysics Data System (ADS)

    Luo, Keqin

    1999-11-01

The electroplating industry of over 10,000 plating plants nationwide is one of the major waste generators in the industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It has become, therefore, an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while production competitiveness is maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time the electroplating industry can (i) use systematically available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.

  17. Nonvolatile “AND,” “OR,” and “NOT” Boolean logic gates based on phase-change memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y.; Zhong, Y. P.; Deng, Y. F.

    2013-12-21

Electronic devices or circuits that can implement both logic and memory functions are regarded as the building blocks for future massively parallel computing beyond the von Neumann architecture. Here we propose phase-change memory (PCM)-based nonvolatile logic gates capable of AND, OR, and NOT Boolean logic operations, verified in SPICE simulations and circuit experiments. The logic operations are computed in parallel, and results can be stored directly in the states of the logic gates, facilitating the combination of computing and memory in the same circuit. These results are encouraging for ultralow-power and high-speed nonvolatile logic circuit design based on novel memory devices.

  18. Multi-input and binary reproducible, high bandwidth floating point adder in a collective network

    DOEpatents

    Chen, Dong; Eisley, Noel A.; Heidelberger, Philip; Steinmacher-Burow, Burkhard

    2016-11-15

To add floating point numbers in a parallel computing system, a collective logic device receives the floating point numbers from computing nodes. The collective logic device converts the floating point numbers to integer numbers. The collective logic device adds the integer numbers, generating a summation of the integer numbers. The collective logic device converts the summation to a floating point number. The collective logic device performs the receiving, the converting of the floating point numbers, the adding, the generating and the converting of the summation in one pass. One pass indicates that the computing nodes send inputs only once to the collective logic device and receive outputs only once from the collective logic device.
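
    The reproducibility idea described in the patent can be sketched in a few lines: converting each value to a fixed-point integer makes the addition exact and associative, so the result is bit-for-bit independent of summation order. The scale factor below is an illustrative choice, not the hardware's actual format.

```python
import random

SCALE = 2 ** 40  # illustrative fixed-point scale

def reproducible_sum(values):
    """Convert to integers, sum exactly in integer arithmetic, convert back."""
    total = sum(int(round(v * SCALE)) for v in values)
    return total / SCALE

vals = [random.uniform(-1, 1) for _ in range(1000)]
shuffled = vals[:]
random.shuffle(shuffled)

# The integer-based sums agree exactly regardless of order...
assert reproducible_sum(vals) == reproducible_sum(shuffled)
# ...whereas naive floating point sums usually differ in the last bits.
print(sum(vals) - sum(shuffled))
```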

  19. Automating Access Control Logics in Simple Type Theory with LEO-II

    NASA Astrophysics Data System (ADS)

    Benzmüller, Christoph

Garg and Abadi recently proved that prominent access control logics can be translated in a sound and complete way into modal logic S4. We have previously outlined how normal multimodal logics, including monomodal logics K and S4, can be embedded in simple type theory, and we have demonstrated that the higher-order theorem prover LEO-II can automate reasoning in and about them. In this paper we combine these results and describe a sound (and complete) embedding of different access control logics in simple type theory. Employing this framework, we show that the off-the-shelf theorem prover LEO-II can be applied to automate reasoning in and about prominent access control logics.

  20. Boolean Logic Tree of Label-Free Dual-Signal Electrochemical Aptasensor System for Biosensing, Three-State Logic Computation, and Keypad Lock Security Operation.

    PubMed

    Lu, Jiao Yang; Zhang, Xin Xing; Huang, Wei Tao; Zhu, Qiu Yan; Ding, Xue Zhi; Xia, Li Qiu; Luo, Hong Qun; Li, Nian Bing

    2017-09-19

The most serious and yet unsolved problems of molecular logic computing consist in how to connect molecular events in complex systems into a usable device with specific functions and how to selectively control branchy logic processes in cascading logic systems. This report demonstrates that a Boolean logic tree can be utilized to organize and connect "plug and play" chemical events (DNA, nanomaterials, organic dye, biomolecule, and denaturant) to develop a dual-signal electrochemical evolution aptasensor system with good resettability for amplified detection of thrombin, controllable and selectable three-state logic computation, and keypad-lock security operation. The aptasensor system combines the merits of a DNA-functionalized nanoamplification architecture and the simple dual-signal electroactive dye brilliant cresyl blue for sensitive and selective detection of thrombin, with a wide linear response range of 0.02-100 nM and a detection limit of 1.92 pM. By using these aforementioned chemical events as inputs and the differential pulse voltammetry current changes at different voltages as dual outputs, a resettable three-input biomolecular keypad lock based on sequential logic is established. Moreover, the first example of controllable and selectable three-state molecular logic computation with active-high and active-low logic functions can be implemented, allowing the output ports to assume a high-impedance or nothing (Z) state in addition to the 0 and 1 logic levels, effectively controlling subsequent branchy logic computation processes. Our approach is helpful in developing advanced controllable and selectable logic computing and sensing systems in large-scale integration circuits for application in biomedical engineering, intelligent sensing, and control.

  1. Nonvolatile reconfigurable sequential logic in a HfO2 resistive random access memory array.

    PubMed

    Zhou, Ya-Xiong; Li, Yi; Su, Yu-Ting; Wang, Zhuo-Rui; Shih, Ling-Yi; Chang, Ting-Chang; Chang, Kuan-Chang; Long, Shi-Bing; Sze, Simon M; Miao, Xiang-Shui

    2017-05-25

Resistive random access memory (RRAM)-based reconfigurable logic provides a temporal programmable dimension to realize Boolean logic functions and is regarded as a promising route to build non-von Neumann computing architectures. In this work, a reconfigurable operation method is proposed to perform nonvolatile sequential logic in a HfO2-based RRAM array. Eight kinds of Boolean logic functions can be implemented within the same hardware fabric. During the logic computing processes, the RRAM devices in an array are flexibly configured in a bipolar or complementary structure. The validity was demonstrated by experimentally implemented NAND and XOR logic functions and a theoretically designed 1-bit full adder. With the trade-off between temporal and spatial computing complexity, our method makes better use of limited computing resources and thus provides an attractive scheme for the construction of logic-in-memory systems.

  2. Gate-Controlled BP-WSe2 Heterojunction Diode for Logic Rectifiers and Logic Optoelectronics.

    PubMed

    Li, Dong; Wang, Biao; Chen, Mingyuan; Zhou, Jun; Zhang, Zengxing

    2017-06-01

p-n junctions play an important role in modern semiconductor electronics and optoelectronics, and field-effect transistors are often used for logic circuits. Here, gate-controlled logic rectifiers and logic optoelectronic devices based on stacked black phosphorus (BP) and tungsten diselenide (WSe2) heterojunctions are reported. The gate-tunable ambipolar charge carriers in BP and WSe2 enable flexible, dynamic, and wide modulation of the heterojunctions as isotype (p-p and n-n) and anisotype (p-n) diodes, which exhibit disparate rectifying and photovoltaic properties. Based on such characteristics, it is demonstrated that BP-WSe2 heterojunction diodes can be developed into high-performance logic rectifiers and logic optoelectronic devices. Logic optoelectronic devices can convert a light signal to an electric one under applied gate voltages. This work should be helpful in expanding the applications of 2D crystals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Interconnect-free parallel logic circuits in a single mechanical resonator

    PubMed Central

    Mahboob, I.; Flurin, E.; Nishiguchi, K.; Fujiwara, A.; Yamaguchi, H.

    2011-01-01

    In conventional computers, wiring between transistors is required to enable the execution of Boolean logic functions. This has resulted in processors in which billions of transistors are physically interconnected, which limits integration densities, gives rise to huge power consumption and restricts processing speeds. A method to eliminate wiring amongst transistors by condensing Boolean logic into a single active element is thus highly desirable. Here, we demonstrate a novel logic architecture using only a single electromechanical parametric resonator into which multiple channels of binary information are encoded as mechanical oscillations at different frequencies. The parametric resonator can mix these channels, resulting in new mechanical oscillation states that enable the construction of AND, OR and XOR logic gates as well as multibit logic circuits. Moreover, the mechanical logic gates and circuits can be executed simultaneously, giving rise to the prospect of a parallel logic processor in just a single mechanical resonator. PMID:21326230

  4. Interconnect-free parallel logic circuits in a single mechanical resonator.

    PubMed

    Mahboob, I; Flurin, E; Nishiguchi, K; Fujiwara, A; Yamaguchi, H

    2011-02-15

    In conventional computers, wiring between transistors is required to enable the execution of Boolean logic functions. This has resulted in processors in which billions of transistors are physically interconnected, which limits integration densities, gives rise to huge power consumption and restricts processing speeds. A method to eliminate wiring amongst transistors by condensing Boolean logic into a single active element is thus highly desirable. Here, we demonstrate a novel logic architecture using only a single electromechanical parametric resonator into which multiple channels of binary information are encoded as mechanical oscillations at different frequencies. The parametric resonator can mix these channels, resulting in new mechanical oscillation states that enable the construction of AND, OR and XOR logic gates as well as multibit logic circuits. Moreover, the mechanical logic gates and circuits can be executed simultaneously, giving rise to the prospect of a parallel logic processor in just a single mechanical resonator.

  5. Tyramine Hydrochloride Based Label-Free System for Operating Various DNA Logic Gates and a DNA Caliper for Base Number Measurements.

    PubMed

    Fan, Daoqing; Zhu, Xiaoqing; Dong, Shaojun; Wang, Erkang

    2017-07-05

    DNA is believed to be a promising candidate for molecular logic computation, and the fluorogenic/colorimetric substrates of G-quadruplex DNAzyme (G4zyme) are broadly used as label-free output reporters of DNA logic circuits. Herein, for the first time, tyramine-HCl (a fluorogenic substrate of G4zyme) is applied to DNA logic computation and a series of label-free DNA-input logic gates, including elementary AND, OR, and INHIBIT logic gates, as well as a two to one encoder, are constructed. Furthermore, a DNA caliper that can measure the base number of target DNA as low as three bases is also fabricated. This DNA caliper can also perform concatenated AND-AND logic computation to fulfil the requirements of sophisticated logic computing. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Constructing a logical, regular axis topology from an irregular topology

    DOEpatents

    Faraj, Daniel A.

    2014-07-22

    Constructing a logical regular topology from an irregular topology including, for each axial dimension and recursively, for each compute node in a subcommunicator until returning to a first node: adding to a logical line of the axial dimension a neighbor specified in a nearest neighbor list; calling the added compute node; determining, by the called node, whether any neighbor in the node's nearest neighbor list is available to add to the logical line; if a neighbor in the called compute node's nearest neighbor list is available to add to the logical line, adding, by the called compute node to the logical line, any neighbor in the called compute node's nearest neighbor list for the axial dimension not already added to the logical line; and, if no neighbor in the called compute node's nearest neighbor list is available to add to the logical line, returning to the calling compute node.

  7. Constructing a logical, regular axis topology from an irregular topology

    DOEpatents

    Faraj, Daniel A.

    2014-07-01

    Constructing a logical regular topology from an irregular topology including, for each axial dimension and recursively, for each compute node in a subcommunicator until returning to a first node: adding to a logical line of the axial dimension a neighbor specified in a nearest neighbor list; calling the added compute node; determining, by the called node, whether any neighbor in the node's nearest neighbor list is available to add to the logical line; if a neighbor in the called compute node's nearest neighbor list is available to add to the logical line, adding, by the called compute node to the logical line, any neighbor in the called compute node's nearest neighbor list for the axial dimension not already added to the logical line; and, if no neighbor in the called compute node's nearest neighbor list is available to add to the logical line, returning to the calling compute node.
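
    A rough sketch of the recursive line-building procedure described in the patent abstract follows, with an invented four-node topology; it is one interpretation of the claim language, not the patented implementation.

```python
def extend_line(node, neighbors, line):
    """Add node to the logical line, then recursively add any neighbor
    from its nearest-neighbor list that is still available; when no
    neighbor is available, return to the calling node."""
    line.append(node)
    for nxt in neighbors[node]:
        if nxt not in line:        # neighbor available to add to the line
            extend_line(nxt, neighbors, line)
            break                  # at most one continuation keeps it a line

neighbors = {                      # invented irregular topology (adjacency lists)
    "n0": ["n1", "n3"],
    "n1": ["n0", "n2"],
    "n2": ["n1", "n3"],
    "n3": ["n2", "n0"],
}
line = []
extend_line("n0", neighbors, line)
print(line)  # one logical line through the subcommunicator: n0-n1-n2-n3
```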

  8. Generalized Majority Logic Criterion to Analyze the Statistical Strength of S-Boxes

    NASA Astrophysics Data System (ADS)

    Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan

    2012-05-01

    The majority logic criterion is applicable in the evaluation process of substitution boxes used in the advanced encryption standard (AES). The performance of modified or advanced substitution boxes is predicted by processing the results of statistical analysis by the majority logic criteria. In this paper, we use the majority logic criteria to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, the majority logic criterion is applied to AES, affine power affine (APA), Gray, Lui J, residue prime, S8 AES, Skipjack, and Xyi substitution boxes. The majority logic criterion is further extended into a generalized majority logic criterion which has a broader spectrum of analyzing the effectiveness of substitution boxes in image encryption applications. The integral components of the statistical analyses used for the generalized majority logic criterion are derived from results of entropy analysis, contrast analysis, correlation analysis, homogeneity analysis, energy analysis, and mean of absolute deviation (MAD) analysis.
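
    The voting idea behind the criterion can be sketched as follows: each statistical analysis compares a candidate S-box's score against a reference range and casts a pass/fail vote, and the majority of votes decides acceptance. The analysis names match the record, but the ranges and scores below are invented placeholders, not values from the paper.

```python
# Invented reference ranges, one per analysis in the criterion.
RANGES = {
    "entropy":     (7.9, 8.0),
    "contrast":    (7.5, 11.0),
    "correlation": (0.0, 0.1),
    "homogeneity": (0.4, 0.5),
    "energy":      (0.015, 0.025),
    "MAD":         (40.0, 60.0),
}

def majority_logic(scores):
    """Accept the S-box if a majority of analyses fall in their ranges."""
    votes = [lo <= scores[name] <= hi for name, (lo, hi) in RANGES.items()]
    return sum(votes) > len(votes) / 2

candidate = {"entropy": 7.95, "contrast": 8.2, "correlation": 0.04,
             "homogeneity": 0.47, "energy": 0.02, "MAD": 35.0}
print(majority_logic(candidate))  # True: 5 of the 6 analyses pass
```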

  9. Fuzzy branching temporal logic.

    PubMed

    Moon, Seong-ick; Lee, Kwang H; Lee, Doheon

    2004-04-01

Intelligent systems require a systematic way to represent and handle temporal information containing uncertainty. In particular, a logical framework is needed that can represent uncertain temporal information and its relationships with logical formulae. Fuzzy linear temporal logic (FLTL), a generalization of propositional linear temporal logic (PLTL) with fuzzy temporal events and fuzzy temporal states defined on a linear time model, was previously proposed for this purpose. However, many systems are best represented by branching time models in which each state can have more than one possible future path. In this paper, fuzzy branching temporal logic (FBTL) is proposed to address this problem. FBTL adopts and generalizes computation tree logic (CTL*), a classical branching temporal logic. The temporal model of FBTL is capable of representing fuzzy temporal events and fuzzy temporal states, and the order relation among them is represented as a directed graph. The utility of FBTL is demonstrated using a fuzzy job shop scheduling problem as an example.

  10. Magnetic tunnel junction based spintronic logic devices

    NASA Astrophysics Data System (ADS)

    Lyle, Andrew Paul

The International Technology Roadmap for Semiconductors (ITRS) predicts that complementary metal oxide semiconductor (CMOS) based technologies will hit their last generation on or near the 16 nm node, which we expect to reach by the year 2025. Thus future advances in computational power will not be realized from ever-shrinking device sizes, but rather by 'outside the box' designs and new physics, including molecular or DNA based computation, organics, magnonics, or spintronics. This dissertation investigates magnetic logic devices for post-CMOS computation. Three different architectures were studied, each relying on a different magnetic mechanism to compute logic functions. Each design has its benefits and challenges that must be overcome. This dissertation focuses on pushing each design from the drawing board to a realistic logic technology. The first logic architecture is based on electrically connected magnetic tunnel junctions (MTJs) that allow direct communication between elements without intermediate sensing amplifiers. Two- and three-input logic gates, which consist of two and three MTJs connected in parallel, respectively, were fabricated and are compared. The direct communication is realized by electrically connecting the output in series with the input and applying voltage across the series connections. The logic gates rely on the fact that a change in resistance at the input modulates the voltage that is needed to supply the critical current for spin-transfer-torque switching of the output. The change in resistance at the input resulted in a voltage margin of 50--200 mV and 250--300 mV for the closest input states for the three- and two-input designs, respectively. The two-input logic gate realizes the AND, NAND, NOR, and OR logic functions. The three-input logic gate realizes the Majority, AND, NAND, NOR, and OR logic operations. The second logic architecture utilizes magnetostatically coupled nanomagnets to compute logic functions, which is the basis of Magnetic Quantum Cellular Automata (MQCA). MQCA has the potential to be thousands of times more energy efficient than CMOS technology. While interesting, these systems remain academic unless they can be interfaced with current technologies. This dissertation pushed past a major hurdle by experimentally demonstrating a spintronic input/output (I/O) interface for the magnetostatically coupled nanomagnets by incorporating MTJs. This spintronic interface allows individual nanomagnets to be programmed using spin transfer torque and read using a magnetoresistance structure. Additionally, the spintronic interface allows statistical data on the reliability of the magnetic coupling utilized for data propagation to be easily measured. The integration of spintronics and MQCA through an electrical interface, yielding a low-power magnetic logic device, creates a competitive post-CMOS logic device. The final logic architecture that was studied used MTJs to compute logic functions and magnetic domain walls to communicate between gates. Simulations were used to optimize the design of this architecture. Spin transfer torque was used to compute the logic function at each MTJ gate and to drive the domain walls. The design demonstrated that multiple nanochannels could be connected to each MTJ to realize fan-out from the logic gates. As a result, this logic scheme eliminates the need for intermediate reads and conversions to pass information from one logic gate to another.

  11. A Classification of Designated Logic Systems

    DTIC Science & Technology

    1988-02-01

Front-matter fragments from the report: references include Introduction to Logic (New York: Macmillan Publishing, 1972) and Grimaldi, Ralph P., Discrete and Combinatorial Mathematics (Reading: Addison-Wesley); the stated aim is a "...sound basis for understanding non-classical logic systems", with thanks to the Air Force Institute of Technology for funding the research; the list of illustrations includes "Two Classifications of Designated Logic Systems" and "Two Partitions of Two-valued Logic Systems".

  12. Counter Unmanned Aerial System Decision-Aid Logic Process (C-UAS DALP)

    DTIC Science & Technology

...a decision-aid or logic process that bridges the middle elements of the kill chain between detection and countermeasure response. This capstone project creates the logic for a decision process that transitions from the ... of use, location, general logic process, and reference mission. This is the framework for the IDEF0 functional architecture diagrams, decision-aid diagrams, logic process, and modeling and simulation.

  13. A Conditional Criterion for Identity, Leading to a Fourth Law of Logic

    DTIC Science & Technology

    1979-06-01

Keywords (identified by block number): Aristotle, Aristotelian logic, axiom, axioms of logic, change, Charles Muses, chronotopology, collapse of the wave function... Fragments: "...of perception, merely accounting for the spatial aspects. In other words, Aristotelian logic is a synthesis of primitive observation, which has been..."; "...parameter, not an observable. Hence measurement/detection (observables) deal with primitive observation and Aristotelian logic (topology), while total..."

  14. Fuzzy Logic and Education: Teaching the Basics of Fuzzy Logic through an Example (By Way of Cycling)

    ERIC Educational Resources Information Center

    Sobrino, Alejandro

    2013-01-01

    Fuzzy logic dates back to 1965 and it is related not only to current areas of knowledge, such as Control Theory and Computer Science, but also to traditional ones, such as Philosophy and Linguistics. Like any logic, fuzzy logic is concerned with argumentation, but unlike other modalities, which focus on the crisp reasoning of Mathematics, it deals…

  15. Intellectual technologies in the problems of thermal power engineering control: formalization of fuzzy information processing results using the artificial intelligence methodology

    NASA Astrophysics Data System (ADS)

    Krokhin, G.; Pestunov, A.

    2017-11-01

The operation of power stations in variable modes, and the related changes in their technical state, has made it urgent to create models for decision-making and state recognition based on diagnostics, using fuzzy logic to identify equipment state and to manage recovery processes. There is no unified methodological approach to obtaining the relevant information when the raw information about the equipment state is fuzzy and inhomogeneous. Existing methods for extracting knowledge are usually unable to ensure correspondence between the parameters of an aggregate's model and the actual state of the object. The switchover of power engineering from preventive repair to repair scheduled according to the actual technical state has increased the responsibility of those who estimate the volume and duration of the work; diagnostics and decision-making models may prove inadequate if the supporting methodology does not take fuzziness into account, because state information is inherently of this kind. In this paper, we introduce a new model that formalizes the equipment state using not only exact information but fuzzy information as well. This model is more adequate to the actual state than traditional analogs and may be used to increase the efficiency and service life of power installations.

  16. Toward an Application Guide for Safety Integrity Level Allocation in Railway Systems.

    PubMed

    Ouedraogo, Kiswendsida Abel; Beugin, Julie; El-Koursi, El-Miloudi; Clarhaut, Joffrey; Renaux, Dominique; Lisiecki, Frederic

    2018-02-02

    The work in the article presents the development of an application guide based on feedback and comments stemming from various railway actors on their practices of SIL allocation to railway safety-related functions. The initial generic methodology for SIL allocation has been updated to be applied to railway rolling stock safety-related functions in order to solve the SIL concept application issues. Various actors dealing with railway SIL allocation problems are the intended target of the methodology; its principles will be summarized in this article with a focus on modifications and precisions made in order to establish a practical guide for railway safety authorities. The methodology is based on the flowchart formalism used in CSM (common safety method) European regulation. It starts with the use of quantitative safety requirements, particularly tolerable hazard rates (THR). THR apportioning rules are applied. On the one hand, the rules are related to classical logical combinations of safety-related functions preventing hazard occurrence. On the other hand, to take into account technical conditions (last safety weak link, functional dependencies, technological complexity, etc.), specific rules implicitly used in existing practices are defined for readjusting some THR values. SIL allocation process based on apportioned and validated THR values is finally illustrated through the example of "emergency brake" subsystems. Some specific SIL allocation rules are also defined and illustrated. © 2018 Society for Risk Analysis.
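
    The quantitative starting point described above can be sketched as follows: apportion a hazard's tolerable hazard rate (THR) across functions whose individual failures each lead to the hazard (failure rates add under this OR combination), then map each apportioned THR onto the standard SIL bands in failures per hour, as in IEC 61508 / EN 50129. The weights, function names, and example THR below are illustrative, not values from the article.

```python
# Standard SIL bands: (lower bound, upper bound, SIL), in failures per hour.
SIL_BANDS = [(1e-9, 1e-8, 4), (1e-8, 1e-7, 3), (1e-7, 1e-6, 2), (1e-6, 1e-5, 1)]

def thr_to_sil(thr):
    for lo, hi, sil in SIL_BANDS:
        if lo <= thr < hi:
            return sil
    raise ValueError("THR outside the SIL 1-4 range; other measures needed")

def apportion(thr_hazard, weights):
    """OR-combination rule: each function's failure alone causes the hazard,
    so the functions' THRs must sum to the hazard's THR."""
    total = sum(weights.values())
    return {f: thr_hazard * w / total for f, w in weights.items()}

shares = apportion(1e-8, {"emergency_brake_cmd": 1, "brake_actuation": 1})
for func, thr in shares.items():
    print(func, f"THR={thr:.1e}/h", "-> SIL", thr_to_sil(thr))
# each function gets THR 5e-9/h, which falls in the SIL 4 band
```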

  17. Multi-input and binary reproducible, high bandwidth floating point adder in a collective network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Dong; Eisley, Noel A; Heidelberger, Philip

To add floating point numbers in a parallel computing system, a collective logic device receives the floating point numbers from computing nodes. The collective logic device converts the floating point numbers to integer numbers. The collective logic device adds the integer numbers, generating a summation of the integer numbers. The collective logic device converts the summation to a floating point number. The collective logic device performs the receiving, the converting of the floating point numbers, the adding, the generating and the converting of the summation in one pass. One pass indicates that the computing nodes send inputs only once to the collective logic device and receive outputs only once from the collective logic device.

  18. Feasible logic Bell-state analysis with linear optics

    PubMed Central

    Zhou, Lan; Sheng, Yu-Bo

    2016-01-01

We describe a feasible logic Bell-state analysis protocol that employs logic entanglement in the form of the robust concatenated Greenberger-Horne-Zeilinger (C-GHZ) state. This protocol only uses polarization beam splitters and half-wave plates, which are available with current experimental technology. We can conveniently identify two of the logic Bell states. This protocol can be easily generalized to arbitrary C-GHZ state analysis. We can also distinguish two N-logic-qubit C-GHZ states. As previous theory and experiment have both shown that the C-GHZ state is robust, this logic Bell-state analysis and C-GHZ state analysis may be essential for linear-optical quantum computation protocols whose building blocks are logic-qubit entangled states. PMID:26877208

  19. Feasible logic Bell-state analysis with linear optics.

    PubMed

    Zhou, Lan; Sheng, Yu-Bo

    2016-02-15

We describe a feasible logic Bell-state analysis protocol that employs logic entanglement in the form of the robust concatenated Greenberger-Horne-Zeilinger (C-GHZ) state. This protocol only uses polarization beam splitters and half-wave plates, which are available with current experimental technology. We can conveniently identify two of the logic Bell states. This protocol can be easily generalized to arbitrary C-GHZ state analysis. We can also distinguish two N-logic-qubit C-GHZ states. As previous theory and experiment have both shown that the C-GHZ state is robust, this logic Bell-state analysis and C-GHZ state analysis may be essential for linear-optical quantum computation protocols whose building blocks are logic-qubit entangled states.

  20. Integrating geological archives and climate models for the mid-Pliocene warm period.

    PubMed

    Haywood, Alan M; Dowsett, Harry J; Dolan, Aisling M

    2016-02-16

    The mid-Pliocene Warm Period (mPWP) offers an opportunity to understand a warmer-than-present world and assess the predictive ability of numerical climate models. Environmental reconstruction and climate modelling are crucial for understanding the mPWP, and the synergy of these two, often disparate, fields has proven essential in confirming features of the past and in turn building confidence in projections of the future. The continual development of methodologies to better facilitate environmental synthesis and data/model comparison is essential, with recent work demonstrating that time-specific (time-slice) syntheses represent the next logical step in exploring climate change during the mPWP and realizing its potential as a test bed for understanding future climate change.

  1. Analysis of individual risk belief structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonn, B.E.; Travis, C.B.; Arrowood, L.

    An interactive computer program developed at Oak Ridge National Laboratory is presented as a methodology to model individualized belief structures. The logic and general strategy of the model are presented for two risk topics: AIDS and toxic waste. Subjects identified desirable and undesirable consequences for each topic and formulated an associative rule linking topic and consequence in either a causal or correlational framework. Likelihood estimates, generated by subjects in several formats (probability, odds statements, etc.), constituted one outcome measure. Additionally, source of belief (personal experience, news media, etc.) and perceived personal and societal impact are reviewed. Briefly, subjects believe that AIDS causes significant emotional problems and, to a lesser degree, physical health problems, whereas toxic waste causes significant environmental problems.

  2. Current Trend Towards Using Soft Computing Approaches to Phase Synchronization in Communication Systems

    NASA Technical Reports Server (NTRS)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    1999-01-01

    This paper surveys recent advances in communications that utilize soft computing approaches to phase synchronization. Soft computing, as opposed to hard computing, is a collection of complementary methodologies that act together to produce the most desirable control, decision, or estimation strategies. Recently, the communications area has explored the use of the principal constituents of soft computing, namely fuzzy logic, neural networks, and genetic algorithms, for modeling, control, and most recently the estimation of phase in phase-coherent communications. If the receiver in a digital communications system is phase-coherent, as is often the case, phase synchronization is required; this entails estimation and/or control at the receiver of an unknown or random phase offset.

  3. Materials Compatibility Testing in RSRM ODC: Free Cleaner Selection

    NASA Technical Reports Server (NTRS)

    Keen, Jill M.; Sagers, Neil W.; McCool, Alex (Technical Monitor)

    2001-01-01

    Government regulations have mandated production phase-outs of a number of solvents, including 1,1,1-trichloroethane, an ozone-depleting chemical (ODC). This solvent was used extensively in the production of the Reusable Solid Rocket Motors (RSRMs) for the Space Shuttle. Many tests have been performed to identify replacement cleaners. One major area of concern in the selection of a new cleaner has been compatibility. Some specific areas considered included cleaner compatibility with non-metallic surfaces, painted surfaces, support materials such as gloves and wipers as well as corrosive properties of the cleaners on the alloys used on these motors. The intent of this paper is to summarize the test logic, methodology, and results acquired from testing the many cleaner and material combinations.

  4. Applying reliability analysis to design electric power systems for More-electric aircraft

    NASA Astrophysics Data System (ADS)

    Zhang, Baozhu

    The More-Electric Aircraft (MEA) is a type of aircraft that replaces conventional hydraulic and pneumatic systems with electrically powered components. These changes have significantly challenged aircraft electric power system design. This thesis investigates how reliability analysis can be applied to automatically generate system topologies for the MEA electric power system. We first use the traditional method of reliability block diagrams to analyze the reliability level of different system topologies. We next propose a new methodology in which system topologies, constrained to meet a specified reliability level, are automatically generated; the path-set method is used for the analysis. Finally, we interface these sets of system topologies with control synthesis tools to automatically create correct-by-construction control logic for the electric power system.
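
    Since the path-set method drives the topology generation, a minimal sketch may help: given the minimal path sets of a candidate topology (each a set of components whose joint operation keeps the system working) and independent component reliabilities, inclusion-exclusion yields the system reliability against which the constraint is checked. The component names and probabilities below are invented for illustration.

      from itertools import combinations

      def system_reliability(path_sets, p):
          # Inclusion-exclusion over minimal path sets, assuming
          # independent component failures.
          total = 0.0
          for k in range(1, len(path_sets) + 1):
              for combo in combinations(path_sets, k):
                  union = set().union(*combo)
                  prob = 1.0
                  for comp in union:
                      prob *= p[comp]
                  total += (-1) ** (k + 1) * prob
          return total

      # two generator-to-bus paths sharing one contactor (illustrative)
      paths = [{"gen1", "contactor"}, {"gen2", "contactor"}]
      p = {"gen1": 0.99, "gen2": 0.99, "contactor": 0.999}
      print(system_reliability(paths, p))   # ~0.9989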

  5. Reply [to “Comment on ‘The Zen of Venn’” by Priestley Toulmin]

    NASA Astrophysics Data System (ADS)

    Berkman, Paul Arthur

    While Venn diagrams, “strictly speaking,” may not have been designed for the “peritechnical literature” they certainly provide a symbolic framework for integrating concepts beyond the context of “mathematically defined objects.” It is interesting that Toulmin was offended and compelled to protest the application of Venn diagrams that are not bound by his “valid methodology.” Such disciplinary constraints on creativity appear contrary to the original writings of John Venn who esteemed interdisciplinary approaches and argued fiercely against those who objected to his introducing mathematical symbols into logic [Venn, 1894]. “Symbolic Logic” itself was crafted with a view toward a general utility “in the solution of complicated problems” [Venn, 1894].

  6. Proceedings of the Seventh International Symposium on Methodologies for Intelligent Systems (Poster Session)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harber, K.S.

    1993-05-01

    This report contains the following papers: implications in vivid logic; a self-learning Bayesian expert system; a natural language generation system for a heterogeneous distributed database system; "competence-switching" managed by intelligent systems; strategy acquisition by an artificial neural network: experiments in learning to play a stochastic game; viewpoints and selective inheritance in object-oriented modeling; multivariate discretization of continuous attributes for machine learning; utilization of the case-based reasoning method to resolve dynamic problems; formalization of an ontology of ceramic science in CLASSIC; linguistic tools for intelligent systems; an application of rough sets in knowledge synthesis; and a relational model for imprecise queries. These papers have been indexed separately.

  8. Arterial signal timing optimization using PASSER II-87

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, E.C.P.; Messer, C.J.; Garza, R.U.

    1988-11-01

    PASSER is the acronym for the Progression Analysis and Signal System Evaluation Routine. PASSER II was originally developed by the Texas Transportation Institute (TTI) for the Dallas Corridor Project. The Texas State Department of Highways and Public Transportation (SDHPT) has sponsored the subsequent program development on both mainframe computers and microcomputers. The theory, model structure, methodology, and logic of PASSER II have been evaluated and well documented. PASSER II is widely used because of its ability to easily select multiple-phase sequences by adjusting the background cycle length and progression speeds to find the optimal timing plans, such as cycle, green split, phase sequence, and offsets, that can efficiently maximize the two-way progression bands.

  9. Enhancing the Scope of the Diels-Alder Reaction through Isonitrile Chemistry: Emergence of a New Class of Acyl-Activated Dienophiles

    PubMed Central

    Townsend, Steven D.; Wu, Xiangyang; Danishefsky, Samuel J.

    2012-01-01

    α,β-Unsaturated imides, formylated at the nitrogen atom, comprise a new and valuable family of dienophiles for servicing Diels-Alder reactions. These systems are assembled through extension of recently discovered isonitrile chemistry to the domain of α,β-unsaturated acids. Cycloadditions are facilitated by Et2AlCl, presumably via chelation between the two carbonyl groups of the N-formyl amide. Applications of the isonitrile/Diels-Alder logic to the IMDA reaction, as well as methodologies to modify the N-formyl amide of the resultant cycloaddition product, are described. It is expected that this easily executed chemistry will provide a significant enhancement for application of Diels-Alder reactions to many synthetic targets. PMID:22708980

  10. Waste certification program plan for Oak Ridge National Laboratory. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrin, R.C.

    1997-05-01

    This document defines the waste certification program developed for implementation at Oak Ridge National Laboratory (ORNL). The document describes the program structure, logic, and methodology for certification of ORNL wastes. The purpose of the waste certification program is to provide assurance that wastes are properly characterized and that the Waste Acceptance Criteria (WAC) for receiving facilities are met. The program meets the waste certification requirements outlined in US Department of Energy (DOE) Order 5820.2A, Radioactive Waste Management, and ensures that 40 CFR documentation requirements for waste characterization are met for mixed (both radioactive and hazardous) and hazardous (including polychlorinated biphenyls) waste. Program activities will be conducted according to ORNL Level 1 document requirements.

  11. [John Snow, the cholera epidemic and the foundation of modern epidemiology].

    PubMed

    Cerda L, Jaime; Valdivia C, Gonzalo

    2007-08-01

    John Snow (1813-1858) was a brilliant British physician. From a young age he stood out for his acute capacity for observation, logical thinking and perseverance, first in anesthetics and later in epidemiology. The successive outbreaks of cholera that affected London motivated him to study this disease from a populational point of view. He related the appearance of cases to the consumption of "morbid matter", responsible for the acute diarrhea with dehydration that characterizes this disease. Snow bravely opposed certain theories prevalent in his time, sacrificing his own prestige. He was a pioneer in the use of modern epidemiological investigation methodologies, such as conducting surveys and spatial epidemiology. He is now rightly considered by the scientific community to be the father of modern epidemiology.

  12. Multi Response Optimization of Laser Micro Marking Process:A Grey- Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Shivakoti, I.; Das, P. P.; Kibria, G.; Pradhan, B. B.; Mustafa, Z.; Ghadai, R. K.

    2017-07-01

    The selection of an optimal parametric combination for efficient machining has always been a challenging issue for manufacturing researchers. The optimal parametric combination provides better machining, which improves productivity and product quality and subsequently reduces production cost and time. This paper presents a hybrid approach of grey relational analysis and fuzzy logic to obtain the optimal parametric combination for better laser beam micro-marking on gallium nitride (GaN) work material. Response surface methodology was used for the design of experiments, considering three parameters, current, frequency and scanning speed, at five levels each, with mark width, mark depth and mark intensity as the process responses.
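
    The grey relational part of such a hybrid is compact enough to sketch. Assuming each response is first normalized (larger-the-better or smaller-the-better), the deviation from the ideal sequence gives a grey relational coefficient per response, and their mean is the grade used to rank parameter combinations. The data and the distinguishing coefficient zeta below are illustrative, and the fuzzy aggregation step of the paper is omitted.

      import numpy as np

      def grey_relational_grade(responses, larger_better, zeta=0.5):
          # responses: runs x responses matrix; larger_better: one flag
          # per response column. Assumes every column varies across runs.
          X = np.asarray(responses, dtype=float)
          lo, hi = X.min(axis=0), X.max(axis=0)
          norm = np.where(larger_better, (X - lo) / (hi - lo),
                          (hi - X) / (hi - lo))
          delta = 1.0 - norm                 # deviation from the ideal
          coeff = (delta.min() + zeta * delta.max()) \
                  / (delta + zeta * delta.max())
          return coeff.mean(axis=1)          # one grade per run

      # three runs, two responses (mark depth: larger better,
      # mark width: smaller better); values are invented
      runs = [[12.0, 0.8], [15.0, 0.5], [10.0, 0.9]]
      print(grey_relational_grade(runs, [True, False]))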

  13. The ethical relevance of the unconscious.

    PubMed

    Farisco, Michele; Evers, Kathinka

    2017-12-29

    Ethical analyses of disorders of consciousness traditionally focus on residual awareness. Going one step further, this paper explores the potential ethical relevance of the unawareness retained by patients with disorders of consciousness, focusing specifically on the ethical implications of the description of the unconscious provided by recent scientific research. A conceptual methodology is used, based on the review and analysis of relevant scientific literature on the unconscious and the logical argumentation in favour of the ethical conclusions. Two conditions (experiential wellbeing and having interests) that are generally considered critical components in the ethical discussion of patients with disorders of consciousness might arguably be both conscious and unconscious. The unconscious, as well as consciousness, should be taken into account in the ethical discussions of patients with disorders of consciousness.

  15. Otoacoustic Estimates of Cochlear Tuning: Testing Predictions in Macaque

    NASA Astrophysics Data System (ADS)

    Shera, Christopher A.; Bergevin, Christopher; Kalluri, Radha; Mc Laughlin, Myles; Michelet, Pascal; van der Heijden, Marcel; Joris, Philip X.

    2011-11-01

    Otoacoustic estimates of cochlear frequency selectivity suggest substantially sharper tuning in humans than in common laboratory animals. However, the logic and methodology underlying these estimates remain untested by direct measurements in primates. We report measurements of frequency tuning in macaque monkeys, Old-World primates phylogenetically closer to humans than the small laboratory animals often taken as models of human hearing (e.g., cats, guinea pigs, and chinchillas). We find that measurements of tuning obtained directly from individual nerve fibers and indirectly using otoacoustic emissions both indicate that peripheral frequency selectivity in macaques is significantly sharper than in small laboratory animals, matching that inferred for humans at high frequencies. Our results validate the use of otoacoustic emissions for noninvasive measurement of cochlear tuning and corroborate the finding of sharper tuning in humans.

  16. [Development and prospect on skeletal age evaluation methods of X-ray film].

    PubMed

    Wang, Ya-hui; Zhu, Guang-you; Qiao, Ke; Bian, Shi-zhong; Fan, Li-hua; Cheng, Yi-bin; Ying, Chong-liang; Shen, Yan

    2007-10-01

    The traditional methods of skeletal age estimation mainly include Numeration, Atlas, and Counting Scores. In recent years, new methods have been proposed by several scholars. Utilizing the imaging characteristics of X-ray films to estimate skeletal age is a key approach of present-day forensic medicine workers in evaluating skeletal age. However, some variation exists when the conclusion on skeletal age is presented directly as "evidence" to the justice trial authority. In order to enhance the accuracy of skeletal age determination, further investigation of appropriate methodology should be undertaken. After a collective study of pertinent domestic and international literature, we present this review of the research and advancement of skeletal age evaluation methods based on X-ray films.

  17. X-Phi and Carnapian Explication

    PubMed Central

    Shepherd, Joshua; Justus, James

    2015-01-01

    The rise of experimental philosophy (x-phi) has placed metaphilosophical questions, particularly those concerning concepts, at the center of philosophical attention. X-phi offers empirically rigorous methods for identifying conceptual content, but what exactly it contributes towards evaluating conceptual content remains unclear. We show how x-phi complements Rudolf Carnap’s underappreciated methodology for concept determination, explication. This clarifies and extends x-phi’s positive philosophical import, and also exhibits explication’s broad appeal. But there is a potential problem: Carnap’s account of explication was limited to empirical and logical concepts, but many concepts of interest to philosophers (experimental and otherwise) are essentially normative. With formal epistemology as a case study, we show how x-phi assisted explication can apply to normative domains. PMID:26345713

  18. A critical view of the quest for brain structural markers of Albert Einstein's special talents (a pot of gold under the rainbow).

    PubMed

    Colombo, Jorge A

    2018-06-01

    Assertions attempting to link glial and macrostructural brain features with Albert Einstein's cognitive performance are critically reviewed. One basic problem arises from attempting to draw causal relationships for complex, delicately interactive functional processes, involving finely tuned molecular and connectivity phenomena expressed in cognitive performance, on the basis of the highly variable structural features of a single, aged, formalin-fixed brain. Data weaknesses and logical flaws are considered. In other instances, similar neuroanatomical observations have received different interpretations and conclusions, e.g., those drawn from schizophrenic brains. Observations on white matter events also raise methodological queries. Additionally, neurocognitive considerations of other intellectual aptitudes of A. Einstein were simply ignored.

  19. Evidence that logical reasoning depends on conscious processing.

    PubMed

    DeWall, C Nathan; Baumeister, Roy F; Masicampo, E J

    2008-09-01

    Humans, unlike other animals, are equipped with a powerful brain that permits conscious awareness and reflection. A growing trend in psychological science has questioned the benefits of consciousness, however. Testing a hypothesis advanced by [Lieberman, M. D., Gaunt, R., Gilbert, D. T., & Trope, Y. (2002). Reflection and reflexion: A social cognitive neuroscience approach to attributional inference. Advances in Experimental Social Psychology, 34, 199-249], four studies suggested that the conscious, reflective processing system is vital for logical reasoning. Substantial decrements in logical reasoning were found when a cognitive load manipulation preoccupied conscious processing, while hampering the nonconscious system with consciously suppressed thoughts failed to impair reasoning (Experiment 1). Nonconscious activation (priming) of the idea of logical reasoning increased the activation of logic-relevant concepts, but failed to improve logical reasoning performance (Experiments 2a-2c) unless the logical conclusions were largely intuitive and thus not reliant on logical reasoning (Experiment 3). Meanwhile, stimulating the conscious goal of reasoning well led to improvements in reasoning performance (Experiment 4). These findings offer evidence that logical reasoning is aided by the conscious, reflective processing system.

  20. Enzyme-based logic gates and circuits-analytical applications and interfacing with electronics.

    PubMed

    Katz, Evgeny; Poghossian, Arshak; Schöning, Michael J

    2017-01-01

    The paper is an overview of enzyme-based logic gates and their short circuits, with specific examples of Boolean AND and OR gates and concatenated logic gates composed of multi-step enzyme-biocatalyzed reactions. Noise formation in the biocatalytic reactions and its reduction by adding a "filter" system, converting a convex response function to a sigmoid one, are discussed. Although enzyme-based logic gates are primarily considered components of future biomolecular computing systems, their biosensing applications are promising for immediate practical use. Analytical use of enzyme logic systems in biomedical and forensic applications is discussed and exemplified with the logic analysis of biomarkers of various injuries, e.g., liver injury, and with the analysis of biomarkers characteristic of different ethnicities found in blood samples at a crime scene. Interfacing of enzyme logic systems with modified electrodes and semiconductor devices is discussed, with particular attention given to interfaces functionalized with signal-responsive materials. Future perspectives in the design of biomolecular logic systems and their applications are discussed in the conclusion. Graphical Abstract: Various applications and signal-transduction methods are reviewed for enzyme-based logic systems.
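
    The "filter" idea mentioned above is easy to illustrate numerically: a gate whose response is convex amplifies small spurious inputs, while composing it with a sigmoid transfer pushes low signals toward logic 0 and high signals toward logic 1. The convex response and the sigmoid parameters below are invented stand-ins for the actual chemical kinetics.

      import numpy as np

      def convex_gate(x):
          # a convex response: small inputs already give a sizeable
          # output (illustrative shape, not measured kinetics)
          return np.sqrt(np.clip(x, 0.0, 1.0))

      def filtered_gate(x, k=12.0, x0=0.5):
          # the added "filter" reshapes the response into a sigmoid
          return 1.0 / (1.0 + np.exp(-k * (convex_gate(x) - x0)))

      for x in (0.0, 0.1, 0.9, 1.0):
          print(x, round(float(filtered_gate(x)), 3))
      # low inputs are suppressed toward 0, high inputs saturate near 1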

  1. An interval logic for higher-level temporal reasoning

    NASA Technical Reports Server (NTRS)

    Schwartz, R. L.; Melliar-Smith, P. M.; Vogt, F. H.; Plaisted, D. A.

    1983-01-01

    Prior work explored temporal logics, based on classical modal logics, as a framework for specifying and reasoning about concurrent programs, distributed systems, and communications protocols, and reported on efforts using temporal reasoning primitives to express very high level abstract requirements that a program or system is to satisfy. Based on experience with those primitives, this report describes an Interval Logic that is more suitable for expressing such higher level temporal properties. The report provides a formal semantics for the Interval Logic, and several examples of its use. A description of decision procedures for the logic is also included.

  2. An Arbitrary First Order Theory Can Be Represented by a Program: A Theorem

    NASA Technical Reports Server (NTRS)

    Hosheleva, Olga

    1997-01-01

    How can we represent knowledge inside a computer? For formalized knowledge, classical logic seems to be the most adequate tool. Classical logic is behind all formalisms of classical mathematics, and behind many formalisms used in Artificial Intelligence. There is only one serious problem with classical logic: due to Godel's famous theorem, classical logic is algorithmically undecidable; as a result, when knowledge is represented in the form of logical statements, it is very difficult to check whether, based on these statements, a given query is true or not. To make knowledge representation more algorithmic, a special field of logic programming was invented. An important portion of logic programming is algorithmically decidable. To cover knowledge that cannot be represented in this portion, several extensions of the decidable fragments have been proposed. In the spirit of logic programming, these extensions are usually introduced in such a way that even if a general algorithm is not available, good heuristic methods exist. It is important to check whether the already proposed extensions are sufficient, or whether further extensions are necessary. In the present paper, we show that one particular extension, namely logic programming with classical negation, introduced by M. Gelfond and V. Lifschitz, can represent (in some reasonable sense) an arbitrary first-order logical theory.

  3. An Institutional Perspective on Accountable Care Organizations.

    PubMed

    Goodrick, Elizabeth; Reay, Trish

    2016-12-01

    We employ aspects of institutional theory to explore how Accountable Care Organizations (ACOs) can effectively manage the multiplicity of ideas and pressures within which they are embedded and consequently better serve patients and their communities. More specifically, we draw on the concept of institutional logics to highlight the importance of understanding the conflicting principles upon which ACOs were founded. Based on previous research conducted both inside and outside health care settings, we argue that ACOs can combine attention to these principles (or institutional logics) in different ways; the options fall on a continuum from (a) segregating the effects of multiple logics from each other by compartmentalizing responses to multiple logics to (b) fully hybridizing the different logics. We suggest that the most productive path for ACOs is to situate their approach between the two extremes of "segregating" and "fully hybridizing." This strategic approach allows ACOs to develop effective responses that combine logics without fully integrating them. We identify three ways that ACOs can embrace institutional complexity short of fully hybridizing disparate logics: (1) reinterpreting practices to make them compatible with other logics; (2) engaging in strategies that take advantage of existing synergy between conflicting logics; and (3) creating opportunities for people at the front line to develop innovative ways of working that combine multiple logics. © The Author(s) 2016.

  4. An electrically reconfigurable logic gate intrinsically enabled by spin-orbit materials.

    PubMed

    Kazemi, Mohammad

    2017-11-10

    The spin degree of freedom in magnetic devices has been discussed widely for computing, since it could significantly reduce energy dissipation, might enable beyond-Von Neumann computing, and could have applications in quantum computing. For spin-based computing to become widespread, however, energy-efficient logic gates comprising as few devices as possible are required. Considerable recent progress has been reported in this area. However, proposals for spin-based logic either require ancillary charge-based devices and circuits in each individual gate or adopt principles underlying charge-based computing by employing ancillary spin-based devices, which largely negates possible advantages. Here, we show that spin-orbit materials possess an intrinsic basis for the execution of logic operations. We present a spin-orbit logic gate that performs a universal logic operation utilizing the minimum possible number of devices, that is, the essential devices required for representing the logic operands. Also, whereas previous proposals for spin-based logic require extra devices in each individual gate to provide reconfigurability, the proposed gate is 'electrically' reconfigurable at run time simply by setting the amplitude of the clock pulse applied to the gate. We demonstrate, analytically and numerically with experimentally benchmarked models, that the gate performs logic operations and simultaneously stores the result, realizing 'stateful' spin-based logic scalable to ultralow energy dissipation.

  5. OncoLogic™

    EPA Science Inventory

    OncoLogic™ - A Computer System to Evaluate the Carcinogenic Potential of Chemicals
    OncoLogic™ is a software program that evaluates the likelihood that a chemical may cause cancer. OncoLogic™ has been peer reviewed and is being rele...

  6. All optical programmable logic array (PLA)

    NASA Astrophysics Data System (ADS)

    Hiluf, Dawit

    2018-03-01

    A programmable logic array (PLA) is an integrated circuit (IC) logic device that can be reconfigured to implement various kinds of combinational logic circuits. The device has a number of AND and OR gates which are linked together to produce outputs, or combined further with more gates or logic circuits. This work presents the realization of PLAs via the physics of a three-level system interacting with light. A programmable logic array is designed such that a number of different logical functions can be combined in sum-of-products or product-of-sums form. We present an all-optical PLA with the aid of laser light and observables of quantum systems, where the encoded information can be regarded as a memory chip. The dynamics of the physical system is investigated using a Lie algebra approach.
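
    For readers unfamiliar with the electronic counterpart, a PLA is a programmable AND plane feeding a programmable OR plane. The sketch below evaluates that sum-of-products structure in software; the plane encodings are an assumed representation, not the optical implementation described in the paper.

      def pla(inputs, and_plane, or_plane):
          # and_plane: one dict per product term, mapping an input index
          # to the value it must take (an absent index is a don't-care)
          # or_plane: per output, the product-term indices it ORs
          products = [all(inputs[i] == v for i, v in term.items())
                      for term in and_plane]
          return [any(products[t] for t in terms) for terms in or_plane]

      # two outputs sharing one AND plane: out0 = XOR, out1 = AND
      and_plane = [{0: 1, 1: 0}, {0: 0, 1: 1}, {0: 1, 1: 1}]
      or_plane = [[0, 1], [2]]
      print(pla([1, 0], and_plane, or_plane))   # [True, False]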

  7. Coupling induced logical stochastic resonance

    NASA Astrophysics Data System (ADS)

    Aravind, Manaoj; Murali, K.; Sinha, Sudeshna

    2018-06-01

    In this work we demonstrate the following result: when two coupled bistable sub-systems are each driven separately by an external logic input signal, the coupled system yields outputs that can be mapped to specific logic gate operations in a robust manner, in an optimal window of noise. So, though the individual systems receive only one logic input each, due to the interplay of coupling, nonlinearity and noise, they cooperatively respond to give a logic output that is a function of both inputs. Thus the emergent collective response of the system, due to the inherent coupling, in the presence of a noise floor, maps consistently to the logic outputs of the two inputs, a phenomenon we term coupling-induced Logical Stochastic Resonance. Lastly, we demonstrate our idea in proof-of-principle circuit experiments.
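
    A numerical sketch of the phenomenon: two coupled overdamped bistable units are integrated with an Euler-Maruyama scheme, each unit driven by one logic input, and the occupied well of one unit is read out as the logic state. All parameter values (coupling, bias, noise strength, input levels) are illustrative guesses rather than the paper's values, and any single stochastic run may deviate; in a suitable noise window the response below approximates an OR gate.

      import numpy as np

      rng = np.random.default_rng(1)

      def coupled_lsr(b1, b2, c=0.6, bias=0.15, noise=0.4,
                      steps=30000, dt=0.01):
          # each bistable unit receives only its own logic input
          i1 = 0.6 * b1 - 0.3 + bias
          i2 = 0.6 * b2 - 0.3 + bias
          x = y = 0.0
          s = np.sqrt(noise * dt)
          for _ in range(steps):
              x += (x - x**3 + c * (y - x) + i1) * dt \
                   + s * rng.standard_normal()
              y += (y - y**3 + c * (x - y) + i2) * dt \
                   + s * rng.standard_normal()
          return int(x > 0)   # occupied well read out as the logic state

      for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
          print(a, b, "->", coupled_lsr(a, b))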

  8. Avoiding Deontic Explosion by Contextually Restricting Aggregation

    NASA Astrophysics Data System (ADS)

    Meheus, Joke; Beirlaen, Mathieu; van de Putte, Frederik

    In this paper, we present an adaptive logic for deontic conflicts, called P2.1r, that is based on Goble's logic SDLaPe, a bimodal extension of Goble's logic P that invalidates aggregation for all prima facie obligations. The logic P2.1r has several advantages with respect to SDLaPe. For consistent sets of obligations it yields the same results as Standard Deontic Logic, and for inconsistent sets of obligations it validates aggregation "as much as possible". It thus leads to a richer consequence set than SDLaPe. The logic P2.1r avoids Goble's criticisms against other non-adjunctive systems of deontic logic. Moreover, it can handle all the 'toy examples' from the literature as well as more complex ones.

  9. THRESHOLD LOGIC SYNTHESIS OF SEQUENTIAL MACHINES.

    DTIC Science & Technology

    The application of threshold logic to the design of sequential machines is the subject of this research. A single layer of threshold logic units in...advantages of fewer components because of the use of threshold logic, along with very high-speed operation resulting from the use of only a single layer of...logic. In some instances, namely for asynchronous machines, the only delay need be the natural delay of the single layer of threshold elements. It is

  10. Repressor logic modules assembled by rolling circle amplification platform to construct a set of logic gates

    PubMed Central

    Wei, Hua; Hu, Bo; Tang, Suming; Zhao, Guojie; Guan, Yifu

    2016-01-01

    Small molecule metabolites and their allosterically regulated repressors play an important role in many gene expression and metabolic disorder processes. These natural sensors, though valuable as good logic switches, have rarely been employed without transcription machinery in cells. Here, two pairs of repressors, which function in opposite ways, were cloned, purified and used to control DNA replication in rolling circle amplification (RCA) in vitro. By using metabolites and repressors as inputs, RCA signals as outputs, four basic logic modules were constructed successfully. To achieve various logic computations based on these basic modules, we designed series and parallel strategies of circular templates, which can further assemble these repressor modules in an RCA platform to realize twelve two-input Boolean logic gates and a three-input logic gate. The RCA-output and RCA-assembled platform was proved to be easy and flexible for complex logic processes and might have application potential in molecular computing and synthetic biology. PMID:27869177

  11. Logic reversibility and thermodynamic irreversibility demonstrated by DNAzyme-based Toffoli and Fredkin logic gates.

    PubMed

    Orbach, Ron; Remacle, Françoise; Levine, R D; Willner, Itamar

    2012-12-26

    The Toffoli and Fredkin gates were suggested as a means to exhibit logic reversibility and thereby reduce energy dissipation associated with logic operations in dense computing circuits. We present a construction of the logically reversible Toffoli and Fredkin gates by implementing a library of predesigned Mg(2+)-dependent DNAzymes and their respective substrates. Although the logical reversibility, for which each set of inputs uniquely correlates to a set of outputs, is demonstrated, the systems manifest thermodynamic irreversibility originating from two quite distinct and nonrelated phenomena. (i) The physical readout of the gates is by fluorescence that depletes the population of the final state of the machine. This irreversible, heat-releasing process is needed for the generation of the output. (ii) The DNAzyme-powered logic gates are made to operate at a finite rate by invoking downhill energy-releasing processes. Even though the three bits of Toffoli's and Fredkin's logically reversible gates manifest thermodynamic irreversibility, we suggest that these gates could have important practical implication in future nanomedicine.
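
    The logical reversibility that the DNAzyme gates implement is a purely combinational property and can be checked in a few lines: a three-bit gate is logically reversible exactly when its truth table is a bijection on the eight input triples.

      from itertools import product

      def toffoli(a, b, c):
          return a, b, c ^ (a & b)               # controlled-controlled-NOT

      def fredkin(a, b, c):
          return (a, c, b) if a else (a, b, c)   # controlled swap

      for gate in (toffoli, fredkin):
          outputs = {gate(*bits) for bits in product((0, 1), repeat=3)}
          # bijective on 8 triples: each output uniquely identifies its
          # input, so no information is lost at the logical level
          print(gate.__name__, "reversible:", len(outputs) == 8)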

  12. A biochemical logic gate using an enzyme and its inhibitor. Part II: The logic gate.

    PubMed

    Sivan, Sarit; Tuchman, Samuel; Lotan, Noah

    2003-06-01

    Enzyme-Based Logic Gates (ENLOGs) are key components in bio-molecular systems for information processing. This report and the previous one in this series address the characterization of two bio-molecular switching elements, namely the α-chymotrypsin (αCT) derivative p-phenylazobenzoyl-α-chymotrypsin (PABαCT) and its inhibitor (proflavine), as well as their assembly into a logic gate. The experimental output of the proposed system is expressed in terms of enzymic activity, which is translated into a logic output (i.e., "1" or "0") relative to a predetermined threshold value. We found that a univalent link exists between the dominant isomer of PABαCT (cis or trans), the dominant form of either acridine (proflavine) or acridan, and the logic output of the system. Thus, of all possible combinations, only trans-PABαCT and acridan lead to an enzymic activity that can be defined as logic output "1". The system operates under the rules of Boolean algebra and performs as an "AND" logic gate.

  13. Toward a phenomenology of trance logic in posttraumatic stress disorder.

    PubMed

    Beshai, J A

    2004-04-01

    Some induction procedures result in trance logic as an essential feature of hypnosis. Trance logic is a voluntary state of acceptance of suggestions without the critical evaluation that would destroy the validity of the meaningfulness of the suggestion. Induction procedures in real and simulated conditions induce a conflict between two contradictory messages in experimental hypnosis. In military induction the conflict is much more subtle involving society's need for security and its need for ethics. Such conflicts are often construed by the subject as trance logic. Trance logic provides an opportunity for therapists using the phenomenology of "presence" to deal with the objectified concepts of "avoidance," "numbing" implicit in this kind of dysfunctional thinking in Posttraumatic Stress Disorder. An individual phenomenology of induction procedures and suggestions, which trigger trance logic, may lead to a resolution of logical fallacies and recurring painful memories. It invites a reconciliation of conflicting messages implicit in phobias and avoidance traumas. Such a phenomenological analysis of trance logic may well be a novel approach to restructure the meaning of trauma.

  14. Control of electrochemical signals from quantum dots conjugated to organic materials by using DNA structure in an analog logic gate.

    PubMed

    Chen, Qi; Yoo, Si-Youl; Chung, Yong-Ho; Lee, Ji-Young; Min, Junhong; Choi, Jeong-Woo

    2016-10-01

    Various bio-logic gates have been studied intensively to overcome the rigidity of single-function silicon-based logic devices arising from combinations of various gates. Here, a simple control tool using electrochemical signals from quantum dots (QDs) was constructed using DNA and organic materials for multiple logic functions. The electrochemical redox current generated from QDs was controlled by the DNA structure. DNA structure, in turn, was dependent on the components (organic materials) and the input signal (pH). Independent electrochemical signals from two different logic units containing QDs were merged into a single analog-type logic gate, which was controlled by two inputs. We applied this electrochemical biodevice to a simple logic system and achieved various logic functions from the controlled pH input sets. This could be further improved by choosing QDs, ionic conditions, or DNA sequences. This research provides a feasible method for fabricating an artificial intelligence system. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Method of implementation of optoelectronic multiparametric signal processing systems based on multivalued-logic principles

    NASA Astrophysics Data System (ADS)

    Arestova, M. L.; Bykovskii, A. Yu

    1995-10-01

    An architecture is proposed for a specialised optoelectronic multivalued logic processor based on the Allen-Givone algebra. The processor is intended for multiparametric processing of data arriving from a large number of sensors or for tackling spectral analysis tasks. The processor architecture makes it possible to obtain an approximate general estimate of the state of an object being diagnosed on a p-level scale. Optoelectronic systems are proposed for MAXIMUM, MINIMUM, and LITERAL logic gates, based on optical-frequency encoding of logic levels. The corresponding logic gates form a complete set of logic functions in the Allen-Givone algebra.
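
    The three gate types named above have simple arithmetic definitions in the Allen-Givone algebra over logic levels 0..p-1, which the following sketch reproduces; the example function and the choice p = 4 are illustrative.

      P = 4   # number of logic levels (illustrative)

      def maximum(x, y):
          return max(x, y)

      def minimum(x, y):
          return min(x, y)

      def literal(x, a, b):
          # window literal: full level inside [a, b], zero outside
          return P - 1 if a <= x <= b else 0

      # any p-valued function can be expressed as a MAXIMUM of
      # MINIMUM/LITERAL terms, mirroring sum-of-products form
      print(maximum(minimum(literal(2, 1, 2), 3), literal(0, 3, 3)))  # 3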

  16. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on programming of logic controllers. It is important that a programming code of a logic controller is executed flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
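
    A toy version of such a generator clarifies the idea: each transition of the rule-based model becomes one guarded C statement that clears its input places and marks its output places. The transition names, guards, and place variables below are invented; a real generator would also handle conflicts and timing.

      # transition name -> (guard expression, places cleared, places set)
      transitions = {
          "t1": ("in_start && p_idle", ["p_idle"], ["p_run"]),
          "t2": ("in_stop && p_run", ["p_run"], ["p_idle"]),
      }

      def emit_c(transitions):
          lines = ["void step(void) {"]
          for name, (guard, clears, marks) in transitions.items():
              body = " ".join(f"{p} = 0;" for p in clears) + " " \
                     + " ".join(f"{p} = 1;" for p in marks)
              lines.append(f"    if ({guard}) {{ {body} }} /* {name} */")
          lines.append("}")
          return "\n".join(lines)

      print(emit_c(transitions))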

  17. Parallel logic gates in synthetic gene networks induced by non-Gaussian noise.

    PubMed

    Xu, Yong; Jin, Xiaoqin; Zhang, Huiqing

    2013-11-01

    The recent idea of logical stochastic resonance is verified in synthetic gene networks induced by non-Gaussian noise. We realize switching between two kinds of logic gates at an optimal moderate noise intensity by varying two different tunable parameters in a single gene network. Furthermore, in order to obtain more logic operations, and thus additional information processing capacity, we obtain two complementary logic gates in a two-dimensional toggle switch model and realize the transformation between the two logic gates by changing different parameters. These simulated results contribute to improving the computational power and functionality of the networks.

  18. Verifying the Modal Logic Cube Is an Easy Task (For Higher-Order Automated Reasoners)

    NASA Astrophysics Data System (ADS)

    Benzmüller, Christoph

    Prominent logics, including quantified multimodal logics, can be elegantly embedded in simple type theory (classical higher-order logic). Furthermore, off-the-shelf reasoning systems for simple type theory exist that can be uniformly employed for reasoning within and about embedded logics. In this paper we focus on reasoning about modal logics and exploit our framework for the automated verification of inclusion and equivalence relations between them. Related work has applied first-order automated theorem provers to the task. Our solution achieves significant improvements, most notably with respect to the elegance and simplicity of the problem encodings as well as automation performance.

  19. Size reduction techniques for vital compliant VHDL simulation models

    DOEpatents

    Rich, Marvin J.; Misra, Ashutosh

    2006-08-01

    A method and system select delay values from a VHDL standard delay file that correspond to an instance of a logic gate in a logic model. The system then collects all the delay values of the selected instance and builds super generics for the rise time and fall time of that instance. This process is repeated for every delay value in the standard delay file (310) corresponding to every instance of every logic gate in the logic model. The system then outputs a reduced-size standard delay file (314) containing the super generics for every instance of every logic gate in the logic model.
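
    In spirit, the reduction resembles the following sketch, which collapses the many per-instance delay entries of a standard delay file into a single rise/fall pair per instance. Treating the super generic as a worst-case maximum is an assumption made here for illustration, as is the tuple input format.

      from collections import defaultdict

      def build_super_generics(sdf_delays):
          # sdf_delays: (instance, rise_ps, fall_ps) tuples extracted
          # from the standard delay file
          supers = defaultdict(lambda: [0, 0])
          for inst, rise, fall in sdf_delays:
              supers[inst][0] = max(supers[inst][0], rise)
              supers[inst][1] = max(supers[inst][1], fall)
          return dict(supers)   # one rise/fall pair per instance

      delays = [("u1/nand2", 120, 95), ("u1/nand2", 140, 90),
                ("u2/inv", 60, 70)]
      print(build_super_generics(delays))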

  20. Local rollback for fault-tolerance in parallel computing systems

    DOEpatents

    Blumrich, Matthias A [Yorktown Heights, NY; Chen, Dong [Yorktown Heights, NY; Gara, Alan [Yorktown Heights, NY; Giampapa, Mark E [Yorktown Heights, NY; Heidelberger, Philip [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Steinmacher-Burow, Burkhard [Boeblingen, DE; Sugavanam, Krishnan [Yorktown Heights, NY

    2012-01-24

    A control logic device performs a local rollback in a parallel supercomputing system that includes at least one cache memory device. The control logic device determines a local rollback interval and runs at least one instruction in that interval. It evaluates whether an unrecoverable condition occurs while running the instructions during the local rollback interval, and checks whether an error occurs during the local rollback. The control logic device restarts the local rollback interval if an error occurs and no unrecoverable condition occurs during the interval.
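
    The decision flow of the claim can be paraphrased as a small control loop. Everything below is a hypothetical stand-in for the cache and control hardware: the callables model executing an instruction, detecting an error, detecting an unrecoverable condition, and restarting the interval.

      def run_with_local_rollback(interval, execute, error_occurred,
                                  unrecoverable, on_restart):
          while True:
              for instr in interval:       # run the rollback interval
                  execute(instr)
              if error_occurred():
                  if unrecoverable():
                      # outside the local scheme: escalate to a global
                      # recovery mechanism
                      raise RuntimeError("unrecoverable condition")
                  on_restart()             # replay the interval locally
                  continue
              return                       # interval completed cleanly

      # toy demo: one transient error is absorbed by a local restart
      log = []
      run_with_local_rollback(
          interval=["i0", "i1"],
          execute=log.append,
          error_occurred=lambda: len(log) == 2,   # fails once
          unrecoverable=lambda: False,
          on_restart=lambda: None)
      print(log)   # ['i0', 'i1', 'i0', 'i1']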
