Teaching Semantic Tableaux Method for Propositional Classical Logic with a CAS
ERIC Educational Resources Information Center
Aguilera-Venegas, Gabriel; Galán-García, José Luis; Galán-García, María Ángeles; Rodríguez-Cielos, Pedro
2015-01-01
Automated theorem proving (ATP) for Propositional Classical Logic checks the validity of a formula. It is a very well-known problem which is decidable but co-NP-complete. There are many algorithms for this problem. In this paper, an educationally oriented implementation of the Semantic Tableaux method is described. The program has…
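The tableau procedure the abstract refers to is easy to sketch. Below is a minimal, illustrative Python validity checker, not the educational CAS implementation the paper describes: it negates the formula and expands a tableau, closing a branch when it contains complementary literals.

```python
# Minimal semantic-tableaux validity checker (illustrative sketch only;
# not the educational CAS implementation described in the paper).
# Formulas: ('atom', name), ('not', f), ('and', a, b), ('or', a, b), ('imp', a, b)

def satisfiable(branch):
    """Expand a tableau branch (a list of formulas); True if it stays open."""
    for i, f in enumerate(branch):
        rest = branch[:i] + branch[i + 1:]
        if f[0] == 'not':
            g = f[1]
            if g[0] == 'atom':
                continue                              # negated atom: a literal
            if g[0] == 'not':                         # double negation
                return satisfiable(rest + [g[1]])
            if g[0] == 'and':                         # ¬(a∧b): branch on ¬a | ¬b
                return (satisfiable(rest + [('not', g[1])]) or
                        satisfiable(rest + [('not', g[2])]))
            if g[0] == 'or':                          # ¬(a∨b): add ¬a and ¬b
                return satisfiable(rest + [('not', g[1]), ('not', g[2])])
            if g[0] == 'imp':                         # ¬(a→b): add a and ¬b
                return satisfiable(rest + [g[1], ('not', g[2])])
        elif f[0] == 'and':                           # a∧b: add both conjuncts
            return satisfiable(rest + [f[1], f[2]])
        elif f[0] == 'or':                            # a∨b: branch
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
        elif f[0] == 'imp':                           # a→b: branch on ¬a | b
            return (satisfiable(rest + [('not', f[1])]) or
                    satisfiable(rest + [f[2]]))
    # Only literals remain: the branch closes iff it holds some p and ¬p.
    atoms = {f[1] for f in branch if f[0] == 'atom'}
    negated = {f[1][1] for f in branch if f[0] == 'not'}
    return not (atoms & negated)

def valid(formula):
    """A formula is valid iff its negation admits no open tableau branch."""
    return not satisfiable([('not', formula)])

p, q = ('atom', 'p'), ('atom', 'q')
print(valid(('imp', ('imp', ('imp', p, q), p), p)))   # Peirce's law: True
```

A formula is declared valid exactly when every branch of the tableau for its negation closes, which is why checking Peirce's law above returns True.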
The Temporal Logic of the Tower Chief System
NASA Technical Reports Server (NTRS)
Hazelton, Lyman R., Jr.
1990-01-01
The purpose is to describe the logic used in the reasoning scheme employed in the Tower Chief system, a runway configuration management system. First, a review of classical logic is given. Defensible logics, truth maintenance, default logic, temporally dependent propositions, and resource allocation and planning are discussed.
Quantum non-objectivity from performativity of quantum phenomena
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei; Schumann, Andrew
2014-12-01
We analyze the logical foundations of quantum mechanics (QM) by stressing non-objectivity of quantum observables, which is a consequence of the absence of logical atoms in QM. We argue that the crux of quantum non-objectivity is that, on the one hand, the formalism of QM constructed as a mathematical theory is self-consistent, but, on the other hand, quantum phenomena as results of experimenters’ performances are not self-consistent. This self-inconsistency is an effect of the language of QM differing greatly from the language of human performances. The former is the language of a mathematical theory that uses some Aristotelian and Russellian assumptions (e.g., the assumption that there are logical atoms). The latter language consists of performative propositions that are self-inconsistent only from the viewpoint of conventional mathematical theory, but they satisfy another logic that is non-Aristotelian. Hence, the representation of quantum reality in linguistic terms may be different: the difference between a mathematical theory and a logic of performative propositions. To solve quantum self-inconsistency, we apply the formalism of non-classical self-referent logics.
Which Notion of Implication Is the Right One? From Logical Considerations to a Didactic Perspective.
ERIC Educational Resources Information Center
Durand-Guerrier, Viviane
2003-01-01
Summarizes Tarski's semantic truth theory to clarify different aspects of implication. Extends the classical definition of implication as a relationship between propositions to a relationship between open sentences with at least one free variable. Analyzes two problematic situations and the presentation of some experimental results from research…
The Challenges of Scientific Literacy: From the Viewpoint of Second-Generation Cognitive Science
ERIC Educational Resources Information Center
Klein, Perry D.
2006-01-01
Recent trends in cognitive science have not made scientific literacy easier to attain, but they have made the practices through which educators meet its challenges more interpretable. Traditionally, cognitive scientists viewed knowledge as a set of propositions composed of classical concepts, thought as logical inference, and language as a literal…
Fuzzy branching temporal logic.
Moon, Seong-ick; Lee, Kwang H; Lee, Doheon
2004-04-01
Intelligent systems require a systematic way to represent and handle temporal information containing uncertainty. In particular, a logical framework is needed that can represent uncertain temporal information and its relationships with logical formulae. Fuzzy linear temporal logic (FLTL), a generalization of propositional linear temporal logic (PLTL) with fuzzy temporal events and fuzzy temporal states defined on a linear time model, was previously proposed for this purpose. However, many systems are best represented by branching time models in which each state can have more than one possible future path. In this paper, fuzzy branching temporal logic (FBTL) is proposed to address this problem. FBTL adopts and generalizes computation tree logic (CTL*), which is a classical branching temporal logic. The temporal model of FBTL is capable of representing fuzzy temporal events and fuzzy temporal states, and the order relation among them is represented as a directed graph. The utility of FBTL is demonstrated using a fuzzy job shop scheduling problem as an example.
People Like Logical Truth: Testing the Intuitive Detection of Logical Value in Basic Propositions.
Nakamura, Hiroko; Kawaguchi, Jun
2016-01-01
Recent studies on logical reasoning have suggested that people are intuitively aware of the logical validity of syllogisms or that they intuitively detect conflict between heuristic responses and logical norms via slight changes in their feelings. According to logical intuition studies, logically valid or heuristic logic no-conflict reasoning is fluently processed and induces positive feelings without conscious awareness. One criticism states that such effects of logicality disappear when confounding factors such as the content of syllogisms are controlled. The present study used abstract propositions and tested whether people intuitively detect logical value. Experiment 1 presented four logical propositions (conjunctive, biconditional, conditional, and material implications) regarding a target case and asked the participants to rate the extent to which they liked the statement. Experiment 2 tested the effects of matching bias, as well as intuitive logic, on the reasoners' feelings by manipulating whether the antecedent or consequent (or both) of the conditional was affirmed or negated. The results showed that both logicality and matching bias affected the reasoners' feelings, and people preferred logically true targets over logically false ones for all forms of propositions. These results suggest that people intuitively detect what is true from what is false during abstract reasoning. Additionally, a Bayesian mixed model meta-analysis of conditionals indicated that people's intuitive interpretation of the conditional "if p then q" fits better with the conditional probability, q given p.
Piaget's Logic of Meanings: Still Relevant Today
ERIC Educational Resources Information Center
Wavering, Michael James
2011-01-01
In his last book, "Toward a Logic of Meanings" (Piaget & Garcia, 1991), Jean Piaget describes how thought can be categorized into a form of propositional logic, a logic of meanings. The intent of this article is to offer this analysis by Piaget as a means to understand the language and teaching of science. Using binary propositions, conjunctions,…
Reasoning about logical propositions and success in science
NASA Astrophysics Data System (ADS)
Piburn, Michael D.
1990-12-01
Students display a number of misconceptions when asked to reason about logical propositions. Rather than being random, these misconceptions are stereotypic, and relate to age, ability, and success in science. The grades in science achieved by tenth-grade general science students from two parochial single-sex schools in Australia correlated with their scores on the Propositional Logic Test. The students' ability level was consistently related to the pattern of errors they committed on that measure. Mean scores were lowest on a subtest of ability to use the biconditional and implication, higher on the disjunction, and highest on the conjunction. Success in science was predicted most strongly by the disjunction and biconditional subtests. Knowledge of the way in which a person reasons about logical propositions provides additional insights into the transformations information is subjected to as it is integrated into mental schemata.
A Current Logical Framework: The Propositional Fragment
2003-01-01
Under the Curry-Howard isomorphism, M can also be read as a proof term, and A as a proposition of intuitionistic linear logic in its formulation as DILL… the obligation to ensure that the underlying logic (via the Curry-Howard isomorphism, if you like) is sensible. In particular, the principles of… Proceedings of the International Logic Programming Symposium (ILPS), pages 51-65, Portland, Oregon, December 1995. MIT Press. 6. G. Bellin and P. J
NASA Astrophysics Data System (ADS)
Sheehan, T.; Baker, B.; Degagne, R. S.
2015-12-01
With the abundance of data sources, analytical methods, and computer models, land managers are faced with the overwhelming task of making sense of a profusion of data of wildly different types. Luckily, fuzzy logic provides a method to work with different types of data using language-based propositions such as "the landscape is undisturbed," and a simple set of logic constructs. Just as many surveys allow different levels of agreement with a proposition, fuzzy logic allows values reflecting different levels of truth for a proposition. Truth levels fall within a continuum ranging from Fully True to Fully False. Hence a fuzzy logic model produces continuous results. The Environmental Evaluation Modeling System (EEMS) is a platform-independent, tree-based, fuzzy logic modeling framework. An EEMS model provides a transparent definition of an evaluation model and is commonly developed as a collaborative effort among managers, scientists, and GIS experts. Managers specify a set of evaluative propositions used to characterize the landscape. Scientists, working with managers, formulate functions that convert raw data values into truth values for the propositions and produce a logic tree to combine results into a single metric used to guide decisions. Managers, scientists, and GIS experts then work together to implement and iteratively tune the logic model and produce final results. We present examples of two successful EEMS projects that provided managers with map-based results suitable for guiding decisions: sensitivity and climate change exposure in Utah and the Colorado Plateau modeled for the Bureau of Land Management; and terrestrial ecological intactness in the Mojave and Sonoran region of southern California modeled for the Desert Renewable Energy Conservation Plan.
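The kind of fuzzy evaluation EEMS performs can be sketched as follows. All input names, thresholds, and ramp shapes below are hypothetical illustrations, not taken from an actual EEMS model; truth levels are shown on a Fully False (-1) to Fully True (+1) continuum.

```python
# Fuzzy-logic evaluation in the spirit of EEMS. The input names, thresholds,
# and ramp shapes below are hypothetical illustrations, not an actual EEMS model.

def ramp(x, false_at, true_at):
    """Map a raw value to a truth level in [-1, 1] (Fully False .. Fully True)."""
    t = (x - false_at) / (true_at - false_at)   # 0 at false_at, 1 at true_at
    t = max(0.0, min(1.0, t))                   # clamp to the continuum
    return 2.0 * t - 1.0                        # rescale to [-1, 1]

def fuzzy_and(*vals):
    """A conjunction is only as true as its weakest conjunct."""
    return min(vals)

def fuzzy_or(*vals):
    """A disjunction is as true as its strongest disjunct."""
    return max(vals)

# Proposition: "the landscape is undisturbed" (hypothetical inputs and thresholds)
road_density = 0.4    # km of road per km^2
canopy_cover = 70.0   # percent

low_roads   = ramp(road_density, false_at=2.0, true_at=0.0)   # truer as roads vanish
good_canopy = ramp(canopy_cover, false_at=20.0, true_at=80.0)
undisturbed = fuzzy_and(low_roads, good_canopy)               # combine up the tree
```

Tuning such a model amounts to adjusting the conversion functions and the logic tree until managers, scientists, and GIS experts agree the map reflects the landscape.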
ERIC Educational Resources Information Center
Treagust, David F.
1979-01-01
Comments on the study reported by Lawson, Karplus, and Adi (1978) which indicated that formal schemata and propositional logic are not part of the same structured unity of mental operations proposed by Piaget. (HM)
Application of linear logic to simulation
NASA Astrophysics Data System (ADS)
Clarke, Thomas L.
1998-08-01
Linear logic, since its introduction by Girard in 1987, has proven expressive and powerful. Linear logic has provided natural encodings of Turing machines, Petri nets and other computational models. Linear logic is also capable of naturally modeling resource-dependent aspects of reasoning. The distinguishing characteristic of linear logic is that it accounts for resources; two instances of the same variable are considered differently from a single instance. Linear logic thus must obey a form of the linear superposition principle. A proposition can be reasoned with only once, unless a special operator is applied. Informally, linear logic distinguishes two kinds of conjunction, two kinds of disjunction, and also introduces a modal storage operator that explicitly indicates propositions that can be reused. This paper discusses the application of linear logic to simulation. A wide variety of logics have been developed; in addition to classical logic, there are fuzzy logics, affine logics, quantum logics, etc. All of these have found application in simulations of one sort or another. The special characteristics of linear logic and its benefits for simulation will be discussed. Of particular interest is a connection that can be made between linear logic and simulated dynamics by using the concepts of Lie algebras and Lie groups. Lie groups provide the connection between the exponential modal storage operators of linear logic and the eigenfunctions of dynamic differential operators. Particularly suggestive are possible relations between complexity results for linear logic and non-computability results for dynamical systems.
DNA strand displacement system running logic programs.
Rodríguez-Patón, Alfonso; Sainz de Murieta, Iñaki; Sosík, Petr
2014-01-01
The paper presents a DNA-based computing model which is enzyme-free and autonomous, not requiring human intervention during the computation. The model is able to perform iterated resolution steps with logical formulae in conjunctive normal form. The implementation is based on the technique of DNA strand displacement, with each clause encoded in a separate DNA molecule. Propositions are encoded by assigning a strand to each proposition p, and its complementary strand to the proposition ¬p; clauses are encoded by combining different propositions in the same strand. The model makes it possible to run logic programs composed of Horn clauses by cascading resolution steps. The potential of the model is also demonstrated by its theoretical capability of solving SAT. The resulting SAT algorithm has a linear time complexity in the number of resolution steps, whereas its spatial complexity is exponential in the number of variables of the formula. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
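The resolution step that the DNA model carries out chemically can be sketched in software. This illustrates propositional resolution itself, not the strand-displacement encoding:

```python
# Software sketch of the propositional resolution step that the DNA model
# performs chemically (this illustrates resolution, not strand displacement).
# A literal is ('p', True) for p or ('p', False) for ¬p; a clause is a frozenset.

def resolve(c1, c2):
    """Return all resolvents of two clauses (cut each complementary pair)."""
    resolvents = []
    for (name, sign) in c1:
        if (name, not sign) in c2:                 # complementary literals found
            merged = (c1 - {(name, sign)}) | (c2 - {(name, not sign)})
            resolvents.append(frozenset(merged))
    return resolvents

# Horn program: fact p, rule p → q (encoded as the clause ¬p ∨ q).
fact = frozenset({('p', True)})
rule = frozenset({('p', False), ('q', True)})
print(resolve(fact, rule))                         # [frozenset({('q', True)})]
```

Cascading such steps, as the molecular model does, derives the consequences of a Horn program; deriving the empty clause would signal unsatisfiability.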
Specifying structural constraints of architectural patterns in the ARCHERY language
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanchez, Alejandro (HASLab INESC TEC and Universidade do Minho, Campus de Gualtar, 4710-057 Braga); Barbosa, Luis S.
ARCHERY is an architectural description language for modelling and reasoning about distributed, heterogeneous and dynamically reconfigurable systems in terms of architectural patterns. The language supports the specification of architectures and their reconfiguration. This paper introduces a language extension for precisely describing the structural design decisions that pattern instances must respect in their (re)configurations. The extension is a propositional modal logic with recursion and nominals referencing components, i.e., a hybrid µ-calculus. Its expressiveness allows specifying safety and liveness constraints, as well as paths and cycles over structures. Refinements of classic architectural patterns are specified.
Towards An Engineering Discipline of Computational Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mili, Ali; Sheldon, Frederick T; Jilani, Lamia Labed
2007-01-01
George Boole ushered in the era of modern logic by arguing that logical reasoning does not fall in the realm of philosophy, as it was considered up to his time, but in the realm of mathematics. As such, logical propositions and logical arguments are modeled using algebraic structures. Likewise, we submit that security attributes must be modeled as formal mathematical propositions that are subject to mathematical analysis. In this paper, we approach this problem by attempting to model security attributes in a refinement-like framework that has traditionally been used to represent reliability and safety claims. Keywords: Computable security attributes, survivability, integrity, dependability, reliability, safety, security, verification, testing, fault tolerance.
A New Approach to Teaching Mathematics
1994-02-01
We propose a new approach to teaching discrete math: First, teach logic as a powerful and versatile tool for discovering and communicating truths… using logic in other areas of study. Our experience in teaching discrete math at Cornell shows that such success is possible. Keywords: Propositional logic, predicate logic, discrete mathematics.
Instantons in Self-Organizing Logic Gates
NASA Astrophysics Data System (ADS)
Bearden, Sean R. B.; Manukian, Haik; Traversa, Fabio L.; Di Ventra, Massimiliano
2018-03-01
Self-organizing logic is a recently suggested framework that allows the solution of Boolean truth tables "in reverse"; i.e., it is able to satisfy the logical proposition of gates regardless of which terminal(s) the truth value is assigned to ("terminal-agnostic logic"). It can be realized if time nonlocality (memory) is present. A practical realization of self-organizing logic gates (SOLGs) can be done by combining circuit elements with and without memory. By employing one such realization, we show, numerically, that SOLGs exploit elementary instantons to reach equilibrium points. Instantons are classical trajectories of the nonlinear equations of motion describing SOLGs and connect topologically distinct critical points in the phase space. By linear analysis at those points, we show that these instantons connect the initial critical point of the dynamics, with at least one unstable direction, directly to the final fixed point. We also show that the memory content of these gates affects only the relaxation time to reach the logically consistent solution. Finally, we demonstrate, by solving the corresponding stochastic differential equations, that, since instantons connect critical points, noise and perturbations may change the instanton trajectory in the phase space but not the initial and final critical points. Therefore, even for extremely large noise levels, the gates self-organize to the correct solution. Our work provides a physical understanding of, and can serve as an inspiration for, models of bidirectional logic gates that are emerging as important tools in physics-inspired, unconventional computing.
Describing the What and Why of Students' Difficulties in Boolean Logic
ERIC Educational Resources Information Center
Herman, Geoffrey L.; Loui, Michael C.; Kaczmarczyk, Lisa; Zilles, Craig
2012-01-01
The ability to reason with formal logic is a foundational skill for computer scientists and computer engineers that scaffolds the abilities to design, debug, and optimize. By interviewing students about their understanding of propositional logic and their ability to translate from English specifications to Boolean expressions, we characterized…
The Wave Logic of Consciousness: A Hypothesis
NASA Astrophysics Data System (ADS)
Orlov, Yuri F.
1982-01-01
A physical model is proposed for volitional decision making. It is postulated that consciousness reduces doubt states of the brain into labels by a quantum-mechanical measurement act of free choice. Elementary doubt states illustrate analogical encodement of information having “insufficient resolution” from a classical viewpoint. Measures of certitude (inner conviction) and doubt are formulated. “Adequate propositions” for nonclassical statements, e.g., Hamlet's soliloquy, are constructed. A role is proposed for the superposition principle in imagination and creativity. Experimental predictions are offered for positive and negative interference of doubts. Necessary criteria are made explicit for doubting sense information. Wholeness of perception is illustrated using irreducible, unitary representations of n-valued logics. The interpreted formalism includes nonclassical features of doubt, e.g., scalar representations for imprecise propositions and state changes due to self-reflection. The “liar paradox” is resolved. An internal origin is suggested for spinor dichotomies, e.g., “true-false” and “good-bad,” analogous to particle production.
Syllogistic reasoning in fuzzy logic and its application to usuality and reasoning with dispositions
NASA Technical Reports Server (NTRS)
Zadeh, L. A.
1985-01-01
A fuzzy syllogism in fuzzy logic is defined to be an inference schema in which the major premise, the minor premise and the conclusion are propositions containing fuzzy quantifiers. A basic fuzzy syllogism in fuzzy logic is the intersection/product syllogism. Several other basic syllogisms are developed that may be employed as rules of combination of evidence in expert systems. Among these is the consequent conjunction syllogism. Furthermore, it is shown that syllogistic reasoning in fuzzy logic provides a basis for reasoning with dispositions; that is, with propositions that are preponderantly but not necessarily always true. It is also shown that the concept of dispositionality is closely related to the notion of usuality and serves as a basis for what might be called a theory of usuality - a theory which may eventually provide a computational framework for commonsense reasoning.
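The intersection/product syllogism mentioned above can be illustrated numerically. Here fuzzy quantifiers are simplified to crisp proportions, which drops the fuzziness but shows the shape of the inference:

```python
# Numeric sketch of the intersection/product syllogism. Fuzzy quantifiers are
# simplified here to crisp proportions, which drops the fuzziness but shows
# the inference: if Q1 of A's are B's, and Q2 of (A and B)'s are C's,
# then Q1*Q2 of A's are (B and C)'s.

def product_syllogism(q1, q2):
    """Quantifier of the conclusion, given the two premise quantifiers."""
    return q1 * q2

# "Most A's are B's" (0.9) and "most (A and B)'s are C's" (0.8)
# yield "about 0.72 of A's are (B and C)'s".
print(product_syllogism(0.9, 0.8))   # ≈ 0.72
```

In Zadeh's full treatment the quantifiers are fuzzy numbers rather than scalars, and the product is computed by the extension principle; the scalar case above is the degenerate instance.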
Answer Sets in a Fuzzy Equilibrium Logic
NASA Astrophysics Data System (ADS)
Schockaert, Steven; Janssen, Jeroen; Vermeir, Dirk; de Cock, Martine
Since its introduction, answer set programming has been generalized in many directions, to cater to the needs of real-world applications. As one of the most general “classical” approaches, answer sets of arbitrary propositional theories can be defined as models in the equilibrium logic of Pearce. Fuzzy answer set programming, on the other hand, extends answer set programming with the capability of modeling continuous systems. In this paper, we combine the expressiveness of both approaches, and define answer sets of arbitrary fuzzy propositional theories as models in a fuzzification of equilibrium logic. We show that the resulting notion of answer set is compatible with existing definitions, when the syntactic restrictions of the corresponding approaches are met. We furthermore locate the complexity of the main reasoning tasks at the second level of the polynomial hierarchy. Finally, as an illustration of its modeling power, we show how fuzzy equilibrium logic can be used to find strong Nash equilibria.
Continuous Time in Consistent Histories
NASA Astrophysics Data System (ADS)
Savvidou, Konstantina
1999-12-01
We discuss the case of histories labelled by a continuous time parameter in the History Projection Operator consistent-histories quantum theory. We describe how the appropriate representation of the history algebra may be chosen by requiring the existence of projection operators that represent propositions about time averages of the energy. We define the action operator for the consistent histories formalism, as the quantum analogue of the classical action functional, for the simple harmonic oscillator case. We show that the action operator is the generator of two types of time transformations that may be related to the two laws of time-evolution of the standard quantum theory: the `state-vector reduction' and the unitary time-evolution. We construct the corresponding classical histories and demonstrate their relevance to the quantum histories; we demonstrate how the requirement of the temporal logic structure of the theory is sufficient for the definition of classical histories. Furthermore, we show the relation of the action operator to the decoherence functional which describes the dynamics of the system. Finally, the discussion is extended to give a preliminary account of quantum field theory in this approach to the consistent histories formalism.
Machine Learning-based Intelligent Formal Reasoning and Proving System
NASA Astrophysics Data System (ADS)
Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia
2018-03-01
Reasoning systems can be used in many fields, and improving reasoning efficiency is central to their design. By combining a formal description of proofs with a rule-matching algorithm, and by introducing a machine learning algorithm, the intelligent formal reasoning and verification system achieves high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional reasoning results, thereby obtaining implicit knowledge from the knowledge base and providing a basic reasoning model for the construction of intelligent systems.
NASA Astrophysics Data System (ADS)
Nouwen, Rick; van Rooij, Robert; Sauerland, Uli; Schmitz, Hans-Christian
One could define vagueness as the existence of borderline cases and characterise the philosophical debate on vagueness as being about the nature of these. The prevalent theories of vagueness can be divided into three categories, paralleling three logical interpretations of borderline cases: (i) a borderline case is a case of a truth-value gap; it is neither true nor false; (ii) a borderline case is a case of a truth-value glut; it is both true and false; and (iii) a borderline case is a case where the truth-value is non-classical. The third of these is proposed in the fuzzy logic approach to vagueness. Three-valued approaches have only 1/2 as a value in addition to the standard values 1 and 0. These approaches can be interpreted either as allowing for gaps or gluts, depending on how the notion of satisfaction or truth is defined. If a sentence is taken to be true only if its value is 1, it allows for gaps, but if it is taken to be true already if its value is at least 1/2 it allows for gluts. The most popular theories advertising gluts and gaps, however, are supervaluationism and subvaluationism, both of which make use of the notion of precisifications, that is, ways of making things precise. Truth-value gaps in supervaluationism are due to the way truth simpliciter, or supertruth, is defined: A proposition is supertrue (superfalse) if it is true (false) at all precisifications. This means that a proposition can be neither true nor false in case there exist two precisifications, one of which makes it true and one of which makes it false. Conversely, in subvaluation theory, the same scenario would lead to a truth-value glut. That is, the proposition would be both true and false. This is because subvaluationism defines truth simpliciter as being true at some precisification.
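The gap and glut readings of a three-valued logic, and the super/subvaluationist use of precisifications, can be made concrete in a short sketch (illustrative only):

```python
# Sketch of the gap vs. glut readings of a three-valued logic, and of
# super/subvaluation over precisifications (illustrative only).
from fractions import Fraction

HALF = Fraction(1, 2)   # the non-classical middle value

# Strong Kleene connectives over {0, 1/2, 1}
def k_not(a): return 1 - a
def k_and(a, b): return min(a, b)
def k_or(a, b): return max(a, b)

def true_gap(v):    # gap reading: true only if the value is fully 1
    return v == 1

def true_glut(v):   # glut reading: true already at value >= 1/2
    return v >= HALF

# A borderline case gets value 1/2.
v = HALF
gap_neither = (not true_gap(v)) and (not true_gap(k_not(v)))   # neither true nor false
glut_both   = true_glut(v) and true_glut(k_not(v))             # both true and false

# Supervaluation: supertrue iff true at every precisification; subvaluation
# (dually) counts a proposition true if some precisification makes it true.
precisifications = [True, False]    # "borderline tall" sharpened two ways
supertrue = all(precisifications)   # False here: a truth-value gap
subtrue   = any(precisifications)   # True here: a truth-value glut
```

The same two-precisification scenario thus yields a gap under supervaluationism and a glut under subvaluationism, exactly as the passage describes.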
"Hence"--An Iconoclastic Study of Logic, Language and Argumentation.
ERIC Educational Resources Information Center
Van der Auwera, Johan
An analysis of the role of the word "hence" and its near-synonyms examines the relationship between logic as a science, as a natural language, and as argumentation. The analysis is done in the context of elementary propositional logic. The first section is a limited discussion of the standard logician's treatment relegating "hence" to the realm of…
Enhancing Retrieval with Hyperlinks: A General Model Based on Propositional Argumentation Systems.
ERIC Educational Resources Information Center
Picard, Justin; Savoy, Jacques
2003-01-01
Discusses the use of hyperlinks for improving information retrieval on the World Wide Web and proposes a general model for using hyperlinks based on Probabilistic Argumentation Systems. Topics include propositional logic, knowledge, and uncertainty; assumptions; using hyperlinks to modify document score and rank; and estimating the popularity of a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, N.; Koller, D.; Halpern, J.Y.
Conditional logics play an important role in recent attempts to investigate default reasoning. This paper investigates first-order conditional logic. We show that, as for first-order probabilistic logic, it is important not to confound statistical conditionals over the domain (such as "most birds fly"), and subjective conditionals over possible worlds (such as "I believe that Tweety is unlikely to fly"). We then address the issue of ascribing semantics to first-order conditional logic. As in the propositional case, there are many possible semantics. To study the problem in a coherent way, we use plausibility structures. These provide us with a general framework in which many of the standard approaches can be embedded. We show that while these standard approaches are all the same at the propositional level, they are significantly different in the context of a first-order language. We show that plausibilities provide the most natural extension of conditional logic to the first-order case: We provide a sound and complete axiomatization that contains only the KLM properties and standard axioms of first-order modal logic. We show that most of the other approaches have additional properties, which result in an inappropriate treatment of an infinitary version of the lottery paradox.
The Pythagorean Proposition, Classics in Mathematics Education Series.
ERIC Educational Resources Information Center
Loomis, Elisha Scott
This book is a reissue of the second edition which appeared in 1940. It has the distinction of being the first vintage mathematical work published in the NCTM series "Classics in Mathematics Education." The text includes a biography of Pythagoras and an account of historical data pertaining to his proposition. The remainder of the book shows 370…
Heat exchanger expert system logic
NASA Technical Reports Server (NTRS)
Cormier, R.
1988-01-01
The operation and fault diagnostics of a Deep Space Network heat exchanger are reduced to a rule base by applying propositional calculus to a set of logic statements. The value of this approach lies in the ease of converting the logic and subsequently implementing it on a computer as an expert system. The rule base was written in Process Intelligent Control software.
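A rule base of this kind reduces to forward chaining over propositional rules. The sketch below uses hypothetical sensor and fault names, not the actual Deep Space Network rule base:

```python
# Tiny propositional rule base evaluated by forward chaining. The sensor and
# fault names are hypothetical, not the actual Deep Space Network rule base.

rules = [
    ({'pump_on', 'low_flow'}, 'clogged_filter'),   # pump_on ∧ low_flow → clogged_filter
    ({'clogged_filter'}, 'service_required'),      # clogged_filter → service_required
]

def forward_chain(facts, rules):
    """Apply rules until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

diagnosis = forward_chain({'pump_on', 'low_flow'}, rules)
# derives clogged_filter, then service_required
```

Each rule is a material implication; chaining to a fixed point computes every proposition the observed facts entail, which is the essence of such diagnostic expert systems.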
Dynamic Order Algebras as an Axiomatization of Modal and Tense Logics
NASA Astrophysics Data System (ADS)
Chajda, Ivan; Paseka, Jan
2015-12-01
The aim of the paper is to introduce and describe tense operators in every propositional logic which is axiomatized by means of an algebra whose underlying structure is a bounded poset or even a lattice. We introduce the operators G, H, P and F without regard to which propositional connectives the logic includes. For this we use the axiomatization of universal quantifiers as a starting point and modify these axioms for our purposes. At first, we show that the operators can be recognized as modal operators and we study the pairs (P, G) as so-called dynamic order pairs. Further, we obtain constructions of these operators in the corresponding algebra provided a time frame is given. Moreover, we solve the problem of finding a time frame in the case when the tense operators are given. In particular, any tense algebra is representable in its Dedekind-MacNeille completion. Our approach is fully general: we do not rely on the logic under consideration, and hence it is applicable in all cases known up to now.
The poverty of embodied cognition.
Goldinger, Stephen D; Papesh, Megan H; Barnhart, Anthony S; Hansen, Whitney A; Hout, Michael C
2016-08-01
In recent years, there has been rapidly growing interest in embodied cognition, a multifaceted theoretical proposition that (1) cognitive processes are influenced by the body, (2) cognition exists in the service of action, (3) cognition is situated in the environment, and (4) cognition may occur without internal representations. Many proponents view embodied cognition as the next great paradigm shift for cognitive science. In this article, we critically examine the core ideas from embodied cognition, taking a "thought exercise" approach. We first note that the basic principles from embodiment theory are either unacceptably vague (e.g., the premise that perception is influenced by the body) or they offer nothing new (e.g., cognition evolved to optimize survival, emotions affect cognition, perception-action couplings are important). We next suggest that, for the vast majority of classic findings in cognitive science, embodied cognition offers no scientifically valuable insight. In most cases, the theory has no logical connections to the phenomena, other than some trivially true ideas. Beyond classic laboratory findings, embodiment theory is also unable to adequately address the basic experiences of cognitive life.
The Poverty of Embodied Cognition
Goldinger, Stephen D.; Papesh, Megan H.; Barnhart, Anthony S.; Hansen, Whitney A.; Hout, Michael C.
2016-01-01
In recent years, there has been rapidly growing interest in Embodied Cognition, a multifaceted theoretical proposition that (1) cognitive processes are influenced by the body, (2) cognition exists in the service of action, (3) cognition is situated in the environment, and (4) cognition may occur without internal representations. Many proponents view embodied cognition as the next great paradigm shift for cognitive science. In this article, we critically examine the core ideas from embodied cognition, taking a “thought exercise” approach. We first note that the basic principles from embodiment theory are either unacceptably vague (e.g., the premise that perception is influenced by the body) or they offer nothing new (e.g., cognition evolved to optimize survival, emotions affect cognition, perception-action couplings are important). We next suggest that, for the vast majority of classic findings in cognitive science, embodied cognition offers no scientifically valuable insight. In most cases, the theory has no logical connections to the phenomena, other than some trivially true ideas. Beyond classic laboratory findings, embodiment theory is also unable to adequately address the basic experiences of cognitive life. PMID:27282990
The Logic of Evaluative Argument. CSE Monograph Series in Evaluation, 7.
ERIC Educational Resources Information Center
House, Ernest R.
Evaluation is an act of persuasion directed to a specific audience concerning the solution of a problem. The process of evaluation is prescribed by the nature of knowledge--which is generally complex, always uncertain (in varying degrees), and not always propositional--and by the nature of logic, which is always selective. In the process of…
a New Architecture for Intelligent Systems with Logic Based Languages
NASA Astrophysics Data System (ADS)
Saini, K. K.; Saini, Sanju
2008-10-01
People communicate with each other in sentences that incorporate two kinds of information: propositions about some subject, and metalevel speech acts that specify how the propositional information is used—as an assertion, a command, a question, or a promise. By means of speech acts, a group of people who have different areas of expertise can cooperate and dynamically reconfigure their social interactions to perform tasks and solve problems that would be difficult or impossible for any single individual. This paper proposes a framework for intelligent systems that consist of a variety of specialized components together with logic-based languages that can express propositions and speech acts about those propositions. The result is a system with a dynamically changing architecture that can be reconfigured in various ways: by a human knowledge engineer who specifies a script of speech acts that determine how the components interact; by a planning component that generates the speech acts to redirect the other components; or by a committee of components, which might include human assistants, whose speech acts serve to redirect one another. The components communicate by sending messages to a Linda-like blackboard, in which components accept messages that are either directed to them or that they consider themselves competent to handle.
ERIC Educational Resources Information Center
Nasser, Ramzi; Carifio, James
The purpose of this study was to find out whether students perform differently on algebra word problems that have certain key context features and entail proportional reasoning, relative to their level of logical reasoning and their degree of field dependence/independence. Field-independent students tend to restructure and break stimuli into parts…
1982-01-01
Speed Reasoning in Propositional Logic," Proposal to AF Office of Scientific Research, July 1981. 3. Roussel, P., "PROLOG: manuel de référence et ...d'utilisation," Groupe d'Intelligence Artificielle, Université d'Aix-Marseille, Luminy, France, September 1975. 4. Clocksin, W.F. and C.S. Mellish...for reasoning in higher-order logics such as the first-order predicate calculus; the latter is required for applications in artificial intelligence
Using Edit Distance to Analyse Errors in a Natural Language to Logic Translation Corpus
ERIC Educational Resources Information Center
Barker-Plummer, Dave; Dale, Robert; Cox, Richard; Romanczuk, Alex
2012-01-01
We have assembled a large corpus of student submissions to an automatic grading system, where the subject matter involves the translation of natural language sentences into propositional logic. Of the 2.3 million translation instances in the corpus, 286,000 (approximately 12%) are categorized as being in error. We want to understand the nature of…
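The edit distance in the title is presumably the classic Levenshtein distance between a student's formula string and the reference translation; a minimal dynamic-programming sketch (the formula strings below are invented examples, not drawn from the corpus):

```python
def edit_distance(a, b):
    """Levenshtein distance between strings a and b via dynamic programming:
    minimum number of single-character insertions, deletions, substitutions."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i  # delete all of a's prefix
    for j in range(n + 1):
        d[0][j] = j  # insert all of b's prefix
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[m][n]

# A student answer differing from the reference by one connective:
print(edit_distance("A & (B | C)", "A & (B & C)"))  # 1
```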
Contradictory Reasoning Network: An EEG and fMRI Study
Thai, Ngoc Jade; Seri, Stefano; Rotshtein, Pia; Tecchio, Franca
2014-01-01
Contradiction is a cornerstone of human rationality, essential for everyday life and communication. We investigated electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) responses in separate recording sessions during contradictory judgments, using a logical structure based on categorical propositions of the Aristotelian Square of Opposition (ASoO). The use of ASoO propositions, while controlling for potential linguistic or semantic confounds, enabled us to observe the spatio-temporal unfolding of this contradictory reasoning. The processing started with the inversion of the logical operators, corresponding to right middle frontal gyrus (rMFG-BA11) activation, followed by identification of the contradictory statement, associated with activation in the right inferior frontal gyrus (rIFG-BA47). The right medial frontal gyrus (rMeFG, BA10) and anterior cingulate cortex (ACC, BA32) contributed to the later stages of the process. We observed a correlation between the delayed latency of the rBA11 response and the reaction time delay during inductive vs. deductive reasoning. This supports the notion that rBA11 is crucial for manipulating the logical operators. Slower processing time and stronger brain responses for inductive logic suggested that examples are easier to process than general principles and are more likely to simplify communication. PMID:24667491
Contradictory reasoning network: an EEG and FMRI study.
Porcaro, Camillo; Medaglia, Maria Teresa; Thai, Ngoc Jade; Seri, Stefano; Rotshtein, Pia; Tecchio, Franca
2014-01-01
Contradiction is a cornerstone of human rationality, essential for everyday life and communication. We investigated electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) responses in separate recording sessions during contradictory judgments, using a logical structure based on categorical propositions of the Aristotelian Square of Opposition (ASoO). The use of ASoO propositions, while controlling for potential linguistic or semantic confounds, enabled us to observe the spatio-temporal unfolding of this contradictory reasoning. The processing started with the inversion of the logical operators, corresponding to right middle frontal gyrus (rMFG-BA11) activation, followed by identification of the contradictory statement, associated with activation in the right inferior frontal gyrus (rIFG-BA47). The right medial frontal gyrus (rMeFG, BA10) and anterior cingulate cortex (ACC, BA32) contributed to the later stages of the process. We observed a correlation between the delayed latency of the rBA11 response and the reaction time delay during inductive vs. deductive reasoning. This supports the notion that rBA11 is crucial for manipulating the logical operators. Slower processing time and stronger brain responses for inductive logic suggested that examples are easier to process than general principles and are more likely to simplify communication.
An Arbitrary First Order Theory Can Be Represented by a Program: A Theorem
NASA Technical Reports Server (NTRS)
Hosheleva, Olga
1997-01-01
How can we represent knowledge inside a computer? For formalized knowledge, classical logic seems to be the most adequate tool. Classical logic is behind all formalisms of classical mathematics, and behind many formalisms used in Artificial Intelligence. There is only one serious problem with classical logic: due to the famous Gödel's theorem, classical logic is algorithmically undecidable; as a result, when knowledge is represented in the form of logical statements, it is very difficult to check whether, based on these statements, a given query is true or not. To make knowledge representations more algorithmic, the special field of logic programming was invented. An important portion of logic programming is algorithmically decidable. To cover knowledge that cannot be represented in this portion, several extensions of the decidable fragments have been proposed. In the spirit of logic programming, these extensions are usually introduced in such a way that even if a general algorithm is not available, good heuristic methods exist. It is important to check whether the already proposed extensions are sufficient, or whether further extensions are necessary. In the present paper, we show that one particular extension, namely logic programming with classical negation, introduced by M. Gelfond and V. Lifschitz, can represent (in some reasonable sense) an arbitrary first order logical theory.
From Indexed Lax Logic to Intuitionistic Logic
2008-01-07
translations extend a complete but unsound translation from lax logic to propositional logic proposed by Mendler et al [FM97], which maps ○A to (⌜A⌝ ⊃ C...a universally quantified parameter, mapping ○A to ∀x. (⌜A⌝ ⊃ C(x)) ⊃ C(x). The other possibility is to allow linearity and translate ○A to (⌜A⌝ ⊃ C...nonce. We define ⌜〈K〉A⌝ = ∀x. (⌜A⌝ ⊃ af(K,x)) ⊃ af(K,x). This resembles a CPS transformation of the lax modality. The formula ⌜A⌝ ⊃ af(K,x) is the "type
ERIC Educational Resources Information Center
Bourdieu, Pierre; Passeron, Jean-Claude
The system of relations between the educational system and the structure of relations between the social classes is examined in this book. First, an effort is made to organize into a system amenable to logical verification not only propositions which were constructed in and for the operations of this research or were seen to be logically required…
2011-01-01
Background Remote homology detection is a hard computational problem. Most approaches have trained computational models by using either full protein sequences or multiple sequence alignments (MSA), including all positions. However, when we deal with proteins in the "twilight zone" we can observe that only some segments of sequences (motifs) are conserved. We introduce a novel logical representation that allows us to represent physico-chemical properties of sequences, conserved amino acid positions and conserved physico-chemical positions in the MSA. From this, Inductive Logic Programming (ILP) finds the most frequent patterns (motifs) and uses them to train propositional models, such as decision trees and support vector machines (SVM). Results We use the SCOP database to perform our experiments by evaluating protein recognition within the same superfamily. Our results show that our methodology, when using SVM, performs significantly better than some of the state-of-the-art methods, and comparably to others. Moreover, our method provides a comprehensible set of logical rules that can help to understand what determines a protein function. Conclusions The strategy of selecting only the most frequent patterns is effective for remote homology detection. This is possible through a suitable first-order logical representation of homologous properties, and through a set of frequent patterns, found by an ILP system, that summarizes essential features of protein functions. PMID:21429187
Bernardes, Juliana S; Carbone, Alessandra; Zaverucha, Gerson
2011-03-23
Remote homology detection is a hard computational problem. Most approaches have trained computational models by using either full protein sequences or multiple sequence alignments (MSA), including all positions. However, when we deal with proteins in the "twilight zone" we can observe that only some segments of sequences (motifs) are conserved. We introduce a novel logical representation that allows us to represent physico-chemical properties of sequences, conserved amino acid positions and conserved physico-chemical positions in the MSA. From this, Inductive Logic Programming (ILP) finds the most frequent patterns (motifs) and uses them to train propositional models, such as decision trees and support vector machines (SVM). We use the SCOP database to perform our experiments by evaluating protein recognition within the same superfamily. Our results show that our methodology, when using SVM, performs significantly better than some of the state-of-the-art methods, and comparably to others. Moreover, our method provides a comprehensible set of logical rules that can help to understand what determines a protein function. The strategy of selecting only the most frequent patterns is effective for remote homology detection. This is possible through a suitable first-order logical representation of homologous properties, and through a set of frequent patterns, found by an ILP system, that summarizes essential features of protein functions.
Buttigieg, Sandra C; Hoof, Joost van
2017-07-03
Lehoux et al provide a highly valid contribution in conceptualizing value in value propositions for new health technologies and developing an analytic framework that illustrates the interplay between health innovation supply-side logic (the logic of emergence) and demand-side logic (embedding in the healthcare system). This commentary brings forth several considerations on this article. First, a detailed stakeholder analysis provides the necessary premonition of potential hurdles in the development, implementation and dissemination of a new technology. This can be achieved by categorizing potential stakeholder groups on the basis of the potential impact of future technology. Secondly, the conceptualization of value in value propositions of new technologies should not only embrace business/economic and clinical values but also ethical, professional and cultural values, as well as factoring in the notion of usability and acceptance of new technology. As a final note, the commentary emphasises the point that technology should facilitate delivery of care without negatively affecting doctor-patient communications, physical examination skills, and development of clinical knowledge. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
NASA Astrophysics Data System (ADS)
de Carvalho, Fábio Romeu; Abe, Jair Minoro
2010-11-01
Two recent non-classical logics have been used for decision making: fuzzy logic and paraconsistent annotated evidential logic Eτ. In this paper we present a simplified version of the fuzzy decision method and compare it with the paraconsistent one. Paraconsistent annotated evidential logic Eτ, introduced by Da Costa, Vago and Subrahmanian (1991), is capable of handling uncertain and contradictory data without becoming trivial. It has been used in many applications such as information technology, robotics, artificial intelligence, production engineering, decision making, etc. Intuitively, a formula of logic Eτ has the form p(a, b), in which a and b belong to the real interval [0, 1] and represent, respectively, the degree of favorable evidence (or degree of belief) and the degree of contrary evidence (or degree of disbelief) found in p. The set of all pairs (a, b), called annotations, when plotted, forms the Cartesian unit square (CUS). This set, endowed with an order relation similar to that of the real numbers, forms a lattice, called the lattice of annotations. Fuzzy logic was introduced by Zadeh (1965). It attempts to systematize the study of knowledge, focusing on fuzzy knowledge (whose meaning is not known) and distinguishing it from imprecise knowledge (whose meaning is known, but whose exact value is not). This logic is similar to the paraconsistent annotated one, except that it attributes a single numeric value (not two) to each proposition (so it can be called a one-valued logic). This number expresses the intensity (the degree) with which the proposition is true. Let X be a set and A a subset of X, identified by a function f. For each element x ∈ X, we have y = f(x) ∈ [0, 1]. The number y is called the degree of membership of x in A. Decision-making theories based on these logics have proven powerful in many respects compared with more traditional methods, such as those based on statistics.
In this paper we present a first study of a simplified version of a decision-making method based on fuzzy logic (SVMFD) and a comparison with the Paraconsistent Decision Method (PDM) based on paraconsistent annotated evidential logic Eτ, already presented and summarized in this paper. An example illustrating the two methods is presented, as well as a comparison between them.
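The contrast between the two annotation styles can be sketched as follows. The derived measures for logic Eτ (certainty = a − b, contradiction = a + b − 1) follow common conventions in the paraconsistent-annotation literature and are an assumption of this sketch, not taken from the abstract:

```python
# Sketch contrasting the two annotation styles described above.
# The Etau-derived measures below (certainty, contradiction) are assumed
# conventions, not quoted from this paper.

def fuzzy_membership(x, f):
    """Fuzzy logic: a single value y = f(x) in [0, 1] per proposition."""
    return f(x)

def etau_annotation(a, b):
    """Logic Etau: a pair (a, b) of favorable and contrary evidence in [0, 1]."""
    assert 0.0 <= a <= 1.0 and 0.0 <= b <= 1.0
    certainty = a - b          # in [-1, 1]: how much belief outweighs disbelief
    contradiction = a + b - 1  # in [-1, 1]: > 0 signals conflicting evidence
    return certainty, contradiction

print(fuzzy_membership(0.3, lambda x: x ** 2))  # ~0.09
print(etau_annotation(0.8, 0.7))  # high contradiction: ~(0.1, 0.5)
```

The point of the pair: (0.8, 0.7) is strong evidence both for and against p, a state the single fuzzy value cannot express without becoming trivial.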
A logic-based method for integer programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hooker, J.; Natraj, N.R.
1994-12-31
We propose a logic-based approach to integer programming that replaces traditional branch-and-cut techniques with logical analogs. Integer variables are regarded as atomic propositions. The constraints give rise to logical formulas that are analogous to separating cuts. No continuous relaxation is used. Rather, the cuts are selected so that they can be easily solved as a discrete relaxation. (In fact, defining a relaxation and generating cuts are best seen as the same problem.) We experiment with relaxations that have a k-tree structure and can be solved by nonserial dynamic programming. We also present logic-based analogs of facet-defining cuts, Chvátal rank, etc. We conclude with some preliminary computational results.
Classical Limit and Quantum Logic
NASA Astrophysics Data System (ADS)
Losada, Marcelo; Fortin, Sebastian; Holik, Federico
2018-02-01
The analysis of the classical limit of quantum mechanics usually focuses on the state of the system. The general idea is to explain the disappearance of the interference terms of quantum states appealing to the decoherence process induced by the environment. However, in these approaches it is not explained how the structure of quantum properties becomes classical. In this paper, we consider the classical limit from a different perspective. We consider the set of properties of a quantum system and we study the quantum-to-classical transition of its logical structure. The aim is to open the door to a new study based on dynamical logics, that is, logics that change over time. In particular, we appeal to the notion of hybrid logics to describe semiclassical systems. Moreover, we consider systems with many characteristic decoherence times, whose sublattices of properties become distributive at different times.
Modelling default and likelihood reasoning as probabilistic
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.
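The "satisfiability tests in the underlying propositional logic" that the abstract counts can be illustrated with a brute-force checker over truth assignments (exponential in the number of variables, which is fine for small formulas; the formula below is an invented example):

```python
from itertools import product

def satisfiable(formula, variables):
    """Brute-force propositional satisfiability test: try every assignment.
    `formula` is a function from an assignment dict to bool."""
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(assignment):
            return True
    return False

# (p -> q) & p & ~q is unsatisfiable -- no row of the truth table works:
f = lambda a: ((not a["p"]) or a["q"]) and a["p"] and not a["q"]
print(satisfiable(f, ["p", "q"]))  # False
```

A logic like QDP that reduces its consistency checks to "at most a quadratic number" of such calls inherits this routine's cost per call, which is why restricting to tractable propositional fragments matters.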
[Processes of logical thought in a case of cerebral vascular lesion].
Blanco Menéndez, R; Aguado Balsas, A M
Reasoning and logical thought processes have traditionally been attributed to frontal lobe function or, on the other hand, have been considered diffuse functions of the brain. However, there is now sufficient evidence for the possibility of finding dissociations in thought processes, depending on the logical structure of the experimental tasks and involving different areas of the brain, frontal and post-rolandic. Our aim was to study possible dissociations between thought structures corresponding to categorical and relational logic, on the one hand, and propositional logic on the other. The case of a brain-injured patient with vascular etiology, localized in left frontoparietal cortex, is presented. A specific battery of reasoning tests was administered. A differential performance on some experimental reasoning tasks was found, depending on such logical conceptual structures. The possibility of establishing dissociations among certain logical thought and intellectual functions depending on the localization of the brain lesion (frontal versus temporal) is discussed.
Ignorance is a bliss: Mathematical structure of many-box models
NASA Astrophysics Data System (ADS)
Tylec, Tomasz I.; Kuś, Marek
2018-03-01
We show that the propositional system of a many-box model is always a set-representable effect algebra. In the particular cases of 2-box and 1-box models, it is an orthomodular poset and an orthomodular lattice, respectively. We discuss the relation of the obtained results with the so-called Local Orthogonality principle. We argue that the non-classical properties of box models are the result of a dual enrichment of the set of states caused by the impoverishment of the set of propositions. On the other hand, quantum mechanical models always have more propositions as well as more states than the classical ones. Consequently, we show that box models cannot be considered generalizations of quantum mechanical models, and that seeking additional principles that could allow us to "recover quantum correlations" in box models is, at least from the fundamental point of view, pointless.
Logic integer programming models for signaling networks.
Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert
2009-05-01
We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research, relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems the logic models reduce to a polynomial-time solvable satisfiability algorithm. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
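A typical way such a propositional signaling rule becomes an integer program is the standard linearization of a conjunction over 0/1 variables; the rule "A and B activate C" below is an invented example for illustration, not one of the paper's networks:

```python
# Hypothetical sketch: the propositional rule  A & B -> C  over 0/1
# variables encoded as the linear constraint  c >= a + b - 1,
# a standard linearization used in logic-based integer programming.

def satisfies_rule(a, b, c):
    """All variables are 0/1; True if the state respects A & B -> C."""
    return c >= a + b - 1

# Enumerate the feasible network states under this single constraint:
states = [(a, b, c)
          for a in (0, 1) for b in (0, 1) for c in (0, 1)
          if satisfies_rule(a, b, c)]
print((1, 1, 0) in states)  # False: A and B active but C off violates the rule
print((1, 1, 1) in states)  # True
```

In a real model each rule of the network contributes one such constraint, and an integer-programming solver searches the joint 0/1 state space instead of this explicit enumeration.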
Modelling default and likelihood reasoning as probabilistic reasoning
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.
Metaphor and Knowing: Analysis, Synthesis, Rationale.
ERIC Educational Resources Information Center
Rico, Gabriele Lusser
Evidence is presented to indicate that human knowing involves both a propositional mode stressing discourse, sequence, and logic and an appositional mode characterized by metaphoric constructs, holistic relationships, and the capacity to process many variables simultaneously. Separate sections discuss our culture's heavy emphasis on propositional…
Neural networks and logical reasoning systems: a translation table.
Martins, J; Mendes, R V
2001-04-01
A correspondence is established between the basic elements of logical reasoning systems (knowledge bases, rules, inference and queries) and the structure and dynamical evolution laws of neural networks. The correspondence is pictured as a translation dictionary which may allow one to go back and forth between symbolic and network formulations, a desirable step in learning-oriented systems and multicomputer networks. In the framework of Horn clause logics, it is found that atomic propositions with n arguments correspond to nodes with nth-order synapses, rules to synaptic intensity constraints, forward chaining to synaptic dynamics, and queries either to simple node activation or to a query tensor dynamics.
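The forward-chaining-to-synaptic-dynamics correspondence can be sketched with threshold units of unit weight: a rule head activates once the summed input from its body atoms reaches the body size. A minimal illustration (the rule set is invented, and the update loop stands in for the network's dynamical evolution):

```python
# Sketch of forward chaining as threshold-unit dynamics: each Horn rule
# head is a unit with unit-weight synapses from its body atoms and a
# firing threshold equal to the body size.

def forward_chain(rules, facts):
    """rules: list of (body_atoms, head) pairs; facts: initially active nodes.
    Returns the set of nodes active at the fixed point."""
    active = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            # "Synaptic" update: summed input from body atoms >= threshold.
            if head not in active and sum(a in active for a in body) >= len(body):
                active.add(head)
                changed = True
    return active

rules = [({"p", "q"}, "r"),  # p & q -> r
         ({"r"}, "s")]       # r -> s
print(sorted(forward_chain(rules, {"p", "q"})))  # ['p', 'q', 'r', 's']
```

A query in this picture is just a check on whether a given node is active at the fixed point.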
Introspective Multistrategy Learning: Constructing a Learning Strategy under Reasoning Failure
1996-02-01
a questionnaire and then correlated with subsequent memory performance in experiments. For a thorough review of this line of research, see Hultsch…
A Commentary on Mill’s Logic. Book I. Of Names and Propositions.
1983-10-01
truth. ['extension and intension' in Flew (1979)]. The presumption is that manness necessarily implies rationality, but is only contingently...guilty and innocent; these are contraries rather than contradictories, since there are things, such as numbers, that are neither guilty nor innocent
Re-Composing Space: Composition's Rhetorical Geography
ERIC Educational Resources Information Center
Binkley, Roberta; Smith, Marissa
2006-01-01
In the spaces where the teaching of first-year writing occurs in the North American university and community college, Composition Studies is still a relatively young discipline, and remains focused on process, thesis sentence, argument, and propositional, and linear logic as primary goals. The rhetorical practices that underlie the discipline of…
Rethinking Educational Purpose: The Socialist Challenge
ERIC Educational Resources Information Center
Malott, Curry
2012-01-01
In this essay Malott makes a case for a Marxist reading of education's role in expanding and reproducing capitalist societies. In the process he challenges the proposition that cognitive capitalism has fundamentally transformed the way in which capitalism operates. That is, rather than being guided by an internal capitalist logic, proponents of…
TRAINING RESEARCH UTILIZING MAN-COMPUTER INTERACTIONS, PROMISE AND REALITY.
ERIC Educational Resources Information Center
MCCLELLAND, WILLIAM A.
THE PAPER WAS PRESENTED AS PART OF THE AVIONICS PANEL PROGRAM ON NATURAL AND ARTIFICIAL LOGIC PROCESSORS, SPONSORED BY THE ADVISORY GROUP FOR AERONAUTICAL RESEARCH AND DEVELOPMENT, NATO. SEVERAL CONCEPTUAL PROPOSITIONS IN REGARD TO MAN AND THE COMPUTER ARE OFFERED. THE NATURE OF TRAINING RESEARCH IS EXAMINED. THERE IS ALSO A BRIEF CATEGORIZATION…
A Method of Synthesizing Large Bodies of Knowledge in the Social Sciences.
ERIC Educational Resources Information Center
Thiemann, Francis C.
Employing concepts of formal symbolic logic, the philosophy of science, computer technology, and the work of Hans Zetterberg, a format is suggested for synthesizing and increasing use of the rapidly expanding knowledge of the social sciences. Steps in the process include formulating basic propositions, utilizing computers to establish sets, and…
Annotating the Focus of Negation in Terms of Questions Under Discussion
2012-07-13
more than just a propositional logic operator. The central claims of the paper are that negation conveys implicit positivity more than half of the...Rooth, 1996; Roberts, 1996; Kadmon, 2001), and only indirectly leads to positivity. 2 Delimiting Focus of Negation 2.1 What Focus of Negation is
The Everett-Wheeler interpretation and the open future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudbery, Anthony
2011-03-28
I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.
Skelcher, Chris; Smith, Steven Rathgeb
2015-06-01
We propose a novel approach to theorizing hybridity in public and nonprofit organizations. The concept of hybridity is widely used to describe organizational responses to changes in governance, but the literature seldom explains how hybrids arise or what forms they take. Transaction cost and organizational design literatures offer some solutions, but lack a theory of agency. We use the institutional logics approach to theorize hybrids as entities that face a plurality of normative frames. Logics provide symbolic and material elements that structure organizational legitimacy and actor identities. Contradictions between institutional logics offer space for them to be elaborated and creatively reconstructed by situated agents. We propose five types of organizational hybridity - segmented, segregated, assimilated, blended, and blocked. Each type is theoretically derived from empirically observed variations in organizational responses to institutional plurality. We develop propositions to show how our approach to hybridity adds value to academic and policy-maker audiences.
Shim, Jae-Mahn
2017-01-01
Drawing on the theory of social action in organizational and institutional sociology, this paper examines the behavioral consequences of plural logics of action. It addresses the question based on the empirical case of plural medical systems that are composed of both biomedicine and alternative medicine. Applying mixed methods of a cross-national panel data analysis and a content analysis of medical journal articles, it finds that plural systems affect health outcomes negatively when tensions between biomedicine and alternative medicine are unaddressed. In contrast, plural systems produce tangible health benefits when biomedicine and alternative medicine are coordinated through government policies or by health care organizations/professionals. This paper proposes plurality coordination as an important mechanism that modifies the behavioral consequences of plural logics. This proposition contributes to providing theoretical answers to the sociological puzzle that plural logics of action produce inconsistent behavioral consequences. PMID:29253867
Training propositional reasoning.
Klauer, K C; Meiser, T; Naumer, B
2000-08-01
Two experiments compared the effects of four training conditions on propositional reasoning. A syntactic training demonstrated formal derivations, in an abstract semantic training the standard truth-table definitions of logical connectives were explained, and a domain-specific semantic training provided thematic contexts for the premises of the reasoning task. In a control training, an inductive reasoning task was practised. In line with the account by mental models, both kinds of semantic training were significantly more effective than the control and the syntactic training, whereas there were no significant differences between the control and the syntactic training, nor between the two kinds of semantic training. Experiment 2 replicated this pattern of effects using a different set of syntactic and domain-specific training conditions.
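The "standard truth-table definitions of logical connectives" used in the abstract semantic training can be enumerated directly; a minimal sketch:

```python
# Truth-table definitions of the standard propositional connectives,
# as explained in an abstract semantic training (minimal sketch).
from itertools import product

def implies(p: bool, q: bool) -> bool:
    return (not p) or q        # material conditional

connectives = {
    "AND": lambda p, q: p and q,
    "OR":  lambda p, q: p or q,
    "IF":  implies,
    "IFF": lambda p, q: p == q,
}

for name, f in connectives.items():
    table = {(p, q): f(p, q) for p, q in product([True, False], repeat=2)}
    print(name, table)
```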
Relevance, Derogation and Permission
NASA Astrophysics Data System (ADS)
Stolpe, Audun
We show that a recently developed theory of positive permission based on the notion of derogation is hampered by a triviality result that indicates a problem with the underlying full-meet contraction operation. We suggest a solution that presupposes a particular normal form for codes of norms, adapted from the theory of relevance through propositional letter sharing. We then establish a correspondence between contractions on sets of norms in input/output logic (derogations) and AGM-style contractions on sets of formulae, and use it as a bridge to migrate results on propositional relevance from the latter to the former idiom. Changing the concept accordingly, we show that positive permission now incorporates a relevance requirement that wards off triviality.
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2014-03-01
A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that the vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis, the unity of formal logic and of rational dialectics; (b) it does not contain correct definitions of "movement," "direction," and "vector"; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of "physical vector," and, therefore, it has no natural-scientific meaning; (d) operations on "physical vectors" and the vector calculus propositions relating to "physical vectors" are contrary to formal logic.
Bird's-eye view on noise-based logic.
Kish, Laszlo B; Granqvist, Claes G; Horvath, Tamas; Klappenecker, Andreas; Wen, He; Bezrukov, Sergey M
2014-01-01
Noise-based logic is a practically deterministic logic scheme inspired by the randomness of neural spikes and uses a system of uncorrelated stochastic processes and their superposition to represent the logic state. We briefly discuss various questions such as (i) What does practical determinism mean? (ii) Is noise-based logic a Turing machine? (iii) Is there hope to beat (the dreams of) quantum computation by a classical physical noise-based processor, and what are the minimum hardware requirements for that? Finally, (iv) we address the problem of random number generators and show that the common belief that quantum number generators are superior to classical (thermal) noise-based generators is nothing but a myth.
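A toy illustration of the core representational idea, identifying a logic value by correlating a signal against uncorrelated reference noises. This is a simplified sketch, not the authors' hardware scheme; the names and parameters are invented for illustration:

```python
# Toy noise-based-logic sketch: each logic value is carried by an
# independent random "reference" noise, and the state on a wire is
# identified by correlating the signal with each reference.
import random

random.seed(0)
N = 20000
ref_low  = [random.gauss(0, 1) for _ in range(N)]   # reference noise for bit 0
ref_high = [random.gauss(0, 1) for _ in range(N)]   # reference noise for bit 1

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b)) / len(a)

signal = ref_high   # the wire carries the noise representing bit 1

# The matching reference correlates strongly; the other is near zero.
bit = 1 if correlate(signal, ref_high) > correlate(signal, ref_low) else 0
print(bit)  # → 1
```

Superpositions of several reference noises can encode multiple bits on one wire, which is where the scheme's expressive power comes from.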
Interpreting Quantum Logic as a Pragmatic Structure
NASA Astrophysics Data System (ADS)
Garola, Claudio
2017-12-01
Many scholars maintain that the language of quantum mechanics introduces a quantum notion of truth which is formalized by (standard, sharp) quantum logic and is incompatible with the classical (Tarskian) notion of truth. We show that quantum logic can be identified (up to an equivalence relation) with a fragment of a pragmatic language LGP of assertive formulas, which are justified or unjustified rather than true or false. Quantum logic can then be interpreted as an algebraic structure that formalizes properties of the notion of empirical justification according to quantum mechanics rather than properties of a quantum notion of truth. This conclusion agrees with a general integrationist perspective that interprets nonstandard logics as theories of metalinguistic notions different from truth, thus avoiding incompatibility with classical notions and preserving the globality of logic.
Quantum Structure in Cognition and the Foundations of Human Reasoning
NASA Astrophysics Data System (ADS)
Aerts, Diederik; Sozzo, Sandro; Veloz, Tomas
2015-12-01
Traditional cognitive science rests on a foundation of classical logic and probability theory. This foundation has been seriously challenged by several findings in experimental psychology on human decision making. Meanwhile, the formalism of quantum theory has provided an efficient resource for modeling these classically problematical situations. In this paper, we start from our successful quantum-theoretic approach to the modeling of concept combinations to formulate a unifying explanatory hypothesis. In it, human reasoning is the superposition of two processes: a conceptual reasoning, whose nature is emergence of new conceptuality, and a logical reasoning, founded on an algebraic calculus of the logical type. In most cognitive processes, however, the former reasoning prevails over the latter. In this perspective, the observed deviations from classical logical reasoning should not be interpreted as biases but, rather, as natural expressions of emergence in its deepest form.
The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory
NASA Astrophysics Data System (ADS)
Frey, Kimberly
The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. 
Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.
NASA Astrophysics Data System (ADS)
Rohrlich, Fritz
2011-12-01
Classical and the quantum mechanical sciences are in essential need of mathematics. Only thus can the laws of nature be formulated quantitatively permitting quantitative predictions. Mathematics also facilitates extrapolations. But classical and quantum sciences differ in essential ways: they follow different laws of logic, Aristotelian and non-Aristotelian logics, respectively. These are explicated.
Abstract quantum computing machines and quantum computational logics
NASA Astrophysics Data System (ADS)
Chiara, Maria Luisa Dalla; Giuntini, Roberto; Sergioli, Giuseppe; Leporini, Roberto
2016-06-01
Classical and quantum parallelism are deeply different, although it is sometimes claimed that quantum Turing machines are nothing but special examples of classical probabilistic machines. We introduce the concepts of deterministic state machine, classical probabilistic state machine and quantum state machine. On this basis, we discuss the question: To what extent can quantum state machines be simulated by classical probabilistic state machines? Each state machine is devoted to a single task determined by its program. Real computers, however, behave differently, being able to solve different kinds of problems. This capacity can be modeled, in the quantum case, by the mathematical notion of abstract quantum computing machine, whose different programs determine different quantum state machines. The computations of abstract quantum computing machines can be linguistically described by the formulas of a particular form of quantum logic, termed quantum computational logic.
An Application of Fuzzy Logic Control to a Classical Military Tracking Problem
1994-05-19
Trident Scholar Project Report No. 222, "An Application of Fuzzy Logic Control to a Classical Military Tracking Problem," United States Naval Academy (USNA-1531-2). References cited include: Fuzzy Sets and Systems, vol. 4, 1980, pp. 13-30; Berenji, Hamid R., and Pratap Khedkar, "Learning and Tuning Fuzzy Logic Controllers Through..."
Soe, We-Hyo; Manzano, Carlos; Renaud, Nicolas; de Mendoza, Paula; De Sarkar, Abir; Ample, Francisco; Hliwa, Mohamed; Echavarren, Antonio M; Chandrasekhar, Natarajan; Joachim, Christian
2011-02-22
Quantum states of a trinaphthylene molecule were manipulated by putting its naphthyl branches in contact with single Au atoms. One Au atom carries 1-bit of classical information input that is converted into quantum information throughout the molecule. The Au-trinaphthylene electronic interactions give rise to measurable energy shifts of the molecular electronic states demonstrating a NOR logic gate functionality. The NOR truth table of the single molecule logic gate was characterized by means of scanning tunnelling spectroscopy.
D'Ariano, Giacomo Mauro
2018-07-13
Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory, or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I then conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
Shimp, Charles P
2004-06-30
Research on categorization has changed over time, and some of these changes resemble how Wittgenstein's views changed from his Tractatus Logico-Philosophicus to his Philosophical Investigations. Wittgenstein initially focused on unambiguous, abstract, parsimonious, logical propositions and rules, and on independent, static, "atomic facts." This approach subsequently influenced the development of logical positivism and thereby may have indirectly influenced method and theory in research on categorization: much animal research on categorization has focused on learning simple, static, logical rules unambiguously interrelating small numbers of independent features. He later rejected logical simplicity and rigor and focused instead on Gestalt ideas about figure-ground reversals and context, the ambiguity of family resemblance, and the function of details of everyday language. Contemporary contextualism has been influenced by this latter position, some features of which appear in contemporary empirical research on categorization. These developmental changes are illustrated by research on avian local and global levels of visual perceptual analysis, categorization of rectangles and moving objects, and artificial grammar learning. Implications are described for peer review of quantitative theory in which ambiguity, logical rigor, simplicity, or dynamics are designed to play important roles.
Paraconsistent Reasoning for OWL 2
NASA Astrophysics Data System (ADS)
Ma, Yue; Hitzler, Pascal
A four-valued description logic has been proposed to reason with description logic based inconsistent knowledge bases. This approach has a distinct advantage that it can be implemented by invoking classical reasoners to keep the same complexity as under the classical semantics. However, this approach has so far only been studied for the basic description logic ALC. In this paper, we further study how to extend the four-valued semantics to the more expressive description logic SROIQ, which underlies the forthcoming revision of the Web Ontology Language, OWL 2, and also investigate how it fares when adapted to tractable description logics including EL++, DL-Lite, and Horn-DLs. We define the four-valued semantics along the same lines as for ALC and show that we can retain most of the desired properties.
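The flavor of a four-valued (Belnap-style) semantics, with values true, false, both, and neither, can be sketched as follows. This illustrates the general idea only, not the paper's exact ALC/SROIQ definitions:

```python
# Minimal sketch of four-valued (Belnap-style) negation and conjunction.
# Values: T (true), F (false), B (both = contradictory), N (neither).
# Each value is a pair (told-true, told-false).
TRUTH = {"T": (True, False), "F": (False, True),
         "B": (True, True),  "N": (False, False)}

def decode(tt, tf):
    return {(True, False): "T", (False, True): "F",
            (True, True): "B", (False, False): "N"}[(tt, tf)]

def neg(a):
    tt, tf = TRUTH[a]
    return decode(tf, tt)             # swap evidence for and against

def conj(a, b):
    (at, af), (bt, bf) = TRUTH[a], TRUTH[b]
    return decode(at and bt, af or bf)

print(conj("T", "B"))  # → B: conjoining with a contradiction stays contradictory
print(neg("N"))        # → N: no evidence either way survives negation
```

Because a contradiction (B) does not collapse every formula to true, reasoning stays non-trivial over inconsistent knowledge bases.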
Human Action Recognition in Surveillance Videos using Abductive Reasoning on Linear Temporal Logic
2012-08-29
help of optical flows (Lucas and Kanade, 1981). 3.2 Atomic Propositions: isAt(ti, Oj, Lk) means object Oj is at location Lk at time ti... An object cannot be simultaneously at two locations in the same frame. This can be represented mathematically as: isAt(ti, Oj, Lk) ∧ isAt(ti, Oj, Lm) → Lk = Lm.
Adlassnig, Klaus-Peter; Fehre, Karsten; Rappelsberger, Andrea
2015-01-01
This study's objective is to develop and use a scalable genuine technology platform for clinical decision support based on Arden Syntax, which was extended by fuzzy set theory and fuzzy logic. Arden Syntax is a widely recognized formal language for representing clinical and scientific knowledge in an executable format, and is maintained by Health Level Seven (HL7) International and approved by the American National Standards Institute (ANSI). Fuzzy set theory and logic permit the representation of knowledge and automated reasoning under linguistic and propositional uncertainty. These forms of uncertainty are a common feature of patients' medical data, the body of medical knowledge, and deductive clinical reasoning.
2013-01-01
Background Increasingly, health workforces are undergoing high-level ‘re-engineering’ to help them better meet the needs of the population, workforce and service delivery. Queensland Health implemented a large scale 5-year workforce redesign program across more than 13 health-care disciplines. This study synthesized the findings from this program to identify and codify mechanisms associated with successful workforce redesign to help inform other large workforce projects. Methods This study used Inductive Logic Reasoning (ILR), a process that uses logic models as the primary functional tool to develop theories of change, which are subsequently validated through proposition testing. Initial theories of change were developed from a systematic review of the literature and synthesized using a logic model. These theories of change were then developed into propositions and subsequently tested empirically against documentary, interview, and survey data from 55 projects in the workforce redesign program. Results Three overarching principles were identified that optimized successful workforce redesign: (1) drivers for change need to be close to practice; (2) contexts need to be supportive both at the local levels and legislatively; and (3) mechanisms should include appropriate engagement, resources to facilitate change management, governance, and support structures. Attendance to these factors was uniformly associated with success of individual projects. Conclusions ILR is a transparent and reproducible method for developing and testing theories of workforce change. Despite the heterogeneity of projects, professions, and approaches used, a consistent set of overarching principles underpinned success of workforce change interventions. These concepts have been operationalized into a workforce change checklist. PMID:24330616
Satisfiability of logic programming based on radial basis function neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamadneh, Nawaf; Sathasivam, Saratha; Tilahun, Surafel Luleseged
2014-07-10
In this paper, we propose a new technique to test the satisfiability of propositional logic programming and the quantified Boolean formula problem in radial basis function neural networks. For this purpose, we built radial basis function neural networks to represent propositional logic which has exactly three variables in each clause. We used the prey-predator algorithm to calculate the output weights of the neural networks, while the K-means clustering algorithm is used to determine the hidden parameters (the centers and the widths). The mean of the sum squared error function is used to measure the activity of the two algorithms. We applied the developed technique with recurrent radial basis function neural networks to represent the quantified Boolean formulas. The new technique can be applied to solve many applications such as electronic circuits and NP-complete problems.
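For reference, the problem the networks are trained on, satisfiability of clauses with exactly three literals, can be checked by brute force. This sketch shows the target semantics, not the RBF-network technique itself:

```python
# Brute-force satisfiability check for 3-literal clauses (3-CNF).
# Literal k > 0 means variable x_k; k < 0 means its negation.
from itertools import product

def satisfiable(clauses, n_vars):
    for assignment in product([False, True], repeat=n_vars):
        if all(any(assignment[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            return True
    return False

# (x1 or not x2 or x3) and (not x1 or x2 or not x3)
print(satisfiable([(1, -2, 3), (-1, 2, -3)], 3))  # → True
```

The exponential enumeration here is exactly what motivates heuristic solvers such as the neural-network approach in the paper.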
SAT Encoding of Unification in EL
NASA Astrophysics Data System (ADS)
Baader, Franz; Morawska, Barbara
Unification in Description Logics has been proposed as a novel inference service that can, for example, be used to detect redundancies in ontologies. In a recent paper, we have shown that unification in EL is NP-complete, and thus of a complexity that is considerably lower than in other Description Logics of comparably restricted expressive power. In this paper, we introduce a new NP-algorithm for solving unification problems in EL, which is based on a reduction to satisfiability in propositional logic (SAT). The advantage of this new algorithm is, on the one hand, that it allows us to employ highly optimized state-of-the-art SAT solvers when implementing an EL-unification algorithm. On the other hand, this reduction provides us with a proof of the fact that EL-unification is in NP that is much simpler than the one given in our previous paper on EL-unification.
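The "reduce to SAT, then call a solver" pattern that the EL-unification algorithm relies on can be illustrated with a tiny DPLL-style solver; this is a generic sketch, not the paper's encoding:

```python
# Tiny DPLL-style SAT solver that a reduction (such as an EL-unification
# encoding) could target. Literal k > 0 means x_k; k < 0 means not x_k.
def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue                      # clause already satisfied
        rest = [l for l in clause if abs(l) not in assignment]
        if not rest:
            return None                   # clause falsified under assignment
        simplified.append(rest)
    if not simplified:
        return assignment                 # every clause satisfied
    var = abs(simplified[0][0])
    for value in (True, False):           # branch on the first unassigned variable
        result = dpll(simplified, {**assignment, var: value})
        if result is not None:
            return result
    return None

model = dpll([[1, 2], [-1, 2], [-2, 3]])
print(model is not None)  # → True (the formula is satisfiable)
```

In practice the reduction's payoff is that highly optimized SAT solvers replace this naive branching, which is the advantage the abstract highlights.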
Goodson, Patricia; Pruitt, B E; Suther, Sandy; Wilson, Kelly; Buhi, Eric
2006-04-01
Authors examined the logic (or the implicit theory) underlying 16 abstinence-only-until-marriage programs in Texas (50% of all programs funded under the federal welfare reform legislation during 2001 and 2002). Defined as a set of propositions regarding the relationship between program activities and their intended outcomes, program staff's implicit theories were summarized and compared to (a) data from studies on adolescent sexual behavior, (b) a theory-based model of youth abstinent behavior, and (c) preliminary findings from the national evaluation of Title V programs. Authors interviewed 62 program directors and instructors and employed selected principles of grounded theory to analyze interview data. Findings indicated that abstinence education staff could clearly articulate the logic guiding program activity choices. Comparisons between interview data and a theory-based model of adolescent sexual behavior revealed striking similarities. Implications of these findings for conceptualizing and evaluating abstinence-only-until-marriage (or similar) programs are examined.
A Classification of Designated Logic Systems
1988-02-01
References cited include: Introduction to Logic (New York: Macmillan Publishing, 1972) and Grimaldi, Ralph P., Discrete and Combinatorial Mathematics (Reading: Addison-Wesley)... a sound basis for understanding non-classical logic systems. I would like to thank the Air Force Institute of Technology for funding this research. Illustrations: Figure 1, Two Classifications of Designated Logic Systems; Figure 2, Two Partitions of Two-valued Logic Systems; Figure 3, Two...
Some practical approaches to a course on paraconsistent logic for engineers
NASA Astrophysics Data System (ADS)
Lambert-Torres, Germano; de Moraes, Carlos Henrique Valerio; Coutinho, Maurilio Pereira; Martins, Helga Gonzaga; Borges da Silva, Luiz Eduardo
2017-11-01
This paper describes a non-classical logic course primarily indicated for graduate students in electrical engineering and energy engineering. The content of this course is based on the vision that it is not enough for a student to indefinitely accumulate knowledge; it is necessary to explore all the occasions to update, deepen, and enrich that knowledge, adapting it to a complex world. Therefore, this course is not tied to theoretical formalities and tries at each moment to provide a practical view of non-classical logic. In the real world, inconsistencies are important and cannot be ignored, because contradictory information brings relevant facts, sometimes modifying the entire result of the analysis. As a consequence, non-classical logics, such as annotated paraconsistent logic (APL), are efficiently framed in the approach of complex situations of the real world. In APL, the concepts of unknown, partial, ambiguous, and inconsistent knowledge are handled so as not to trivialise any system under analysis. This course presents theoretical and applicable aspects of APL, which are successfully used in decision-making structures. The course is divided into modules: Basic, 2vAPL, 3vAPL, 4vAPL, and Final Project.
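In common presentations of annotated paraconsistent logic, a proposition is annotated with favorable evidence mu and unfavorable evidence lambda, from which certainty and contradiction degrees are derived. A minimal sketch under that common formulation, not necessarily the course's exact definitions:

```python
# Annotated paraconsistent logic (APL) sketch: a proposition carries
# favorable evidence mu and unfavorable evidence lam, both in [0, 1].
def certainty(mu, lam):      # in [-1, 1]: +1 true, -1 false, 0 undefined
    return mu - lam

def contradiction(mu, lam):  # in [-1, 1]: +1 inconsistent, -1 indeterminate
    return mu + lam - 1.0

mu, lam = 0.75, 0.75         # strong evidence both for and against
print(certainty(mu, lam))      # → 0.0 (no net certainty)
print(contradiction(mu, lam))  # → 0.5 (substantially inconsistent)
```

Keeping the contradiction degree explicit is what lets an APL-based decision structure use conflicting inputs instead of discarding them.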
1991-02-01
Contents include: 2.2 Hybrid Rule/Fact Schemas; 3 The Limitations of Rule-Based Knowledge... or hybrid rule/fact schemas. From report ERL-0520-RR: 2.1 Propositional Logic. The simplest form of production rules is based upon... requirements which may lead to poor system performance. 2.2 Hybrid Rule/Fact Schemas. Hybrid rule/fact relationships (also known as predicate calculus) have...
On the classic and modern theories of matching.
McDowell, J J
2005-07-01
Classic matching theory, which is based on Herrnstein's (1961) original matching equation and includes the well-known quantitative law of effect, is almost certainly false. The theory is logically inconsistent with known experimental findings, and experiments have shown that its central constant-k assumption is not tenable. Modern matching theory, which is based on the power function version of the original matching equation, remains tenable, although it has not been discussed or studied extensively. The modern theory is logically consistent with known experimental findings, it predicts the fact and details of the violation of the classic theory's constant-k assumption, and it accurately describes at least some data that are inconsistent with the classic theory.
Topological and Orthomodular Modeling of Context in Behavioral Science
NASA Astrophysics Data System (ADS)
Narens, Louis
2017-02-01
Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory except that the definition of probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory in a variety of ways. John von Neumann and others have commented on the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.
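In the classical (Boolean) case, the generalized union clause reduces to inclusion-exclusion. A quick finite check of the identity P(A ∪ B) = P(A) + P(B) - P(A ∩ B):

```python
# Finite check of inclusion-exclusion, the classical special case of the
# generalized union clause discussed in the abstract.
from fractions import Fraction

omega = set(range(6))                       # sample space: a fair die
def P(event):
    return Fraction(len(event), len(omega))

A, B = {0, 1, 2}, {2, 3}                    # non-disjoint events

assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))  # → 2/3
```

The intuitionistic generalization is needed precisely because an event space may lack the disjoint decompositions this identity quietly assumes.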
ERIC Educational Resources Information Center
Losada, David E.; Barreiro, Alvaro
2003-01-01
Proposes an approach to incorporate term similarity and inverse document frequency into a logical model of information retrieval. Highlights include document representation and matching; incorporating term similarity into the measure of distance; new algorithms for implementation; inverse document frequency; and logical versus classical models of…
Quantum-classical interface based on single flux quantum digital logic
NASA Astrophysics Data System (ADS)
McDermott, R.; Vavilov, M. G.; Plourde, B. L. T.; Wilhelm, F. K.; Liebermann, P. J.; Mukhanov, O. A.; Ohki, T. A.
2018-04-01
We describe an approach to the integrated control and measurement of a large-scale superconducting multiqubit array comprising up to 10^8 physical qubits using a proximal coprocessor based on the Single Flux Quantum (SFQ) digital logic family. Coherent control is realized by irradiating the qubits directly with classical bitstreams derived from optimal control theory. Qubit measurement is performed by a Josephson photon counter, which provides access to the classical result of projective quantum measurement at the millikelvin stage. We analyze the power budget and physical footprint of the SFQ coprocessor and discuss challenges and opportunities associated with this approach.
The Kritzel System for handwriting interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qian, G.
We present a new system for recognizing on-line cursive handwriting. The system, which is called the Kritzel System, has four features. First, the system characterizes handwriting as a sequence of feature vectors. Second, the system adapts to a particular writing style itself through a learning process. Third, the reasoning of the system is formulated in propositional logic with likelihoods. Fourth, the system can be readily linked with other English processing systems for lexical and contextual checking.
Toward a Contingency Theory of Decision Making.
ERIC Educational Resources Information Center
Tarter, C. John; Hoy, Wayne K.
1998-01-01
There is no single best decision-making approach. This article reviews and compares six contemporary models (classical, administrative, incremental, mixed-scanning, garbage-can, and political) and develops a framework and 10 propositions to match strategies with circumstances. A contingency approach suggests that administrators use satisficing (a…
Quantum probabilistic logic programming
NASA Astrophysics Data System (ADS)
Balu, Radhakrishnan
2015-05-01
We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples that combine statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.
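The amplitude semantics described above can be pictured with a toy sketch (the atoms, amplitudes, and formula below are illustrative assumptions, not the paper's language): H-interpretations over two ground atoms act as basis states, and a well-formed formula's probability is the squared-amplitude mass of the interpretations that satisfy it.

```python
import itertools

# Toy sketch: probability amplitudes over the four Herbrand interpretations
# of two ground atoms p and q. A formula's probability is the sum of
# |amplitude|^2 over the interpretations that satisfy it.
atoms = ("p", "q")
interpretations = list(itertools.product([False, True], repeat=len(atoms)))
amplitudes = [0.5, 0.5, 0.5, 0.5]  # uniform superposition; illustrative values

def prob(formula):
    """formula maps (p, q) truth values to a bool; returns its probability."""
    return sum(a * a for a, world in zip(amplitudes, interpretations)
               if formula(*world))

p_or_q = prob(lambda p, q: p or q)  # 3 of 4 interpretations satisfy p or q
```

Under this encoding the probabilities of all well-formed formulae are automatically consistent with one another, since they are induced by a single normalized amplitude vector.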
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miyadera, Takayuki; Imai, Hideki; Graduate School of Science and Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551
This paper discusses the no-cloning theorem in a logico-algebraic approach. In this approach, an orthoalgebra is considered as a general structure for propositions in a physical theory. We proved that an orthoalgebra admits a cloning operation if and only if it is a Boolean algebra. That is, only classical theory admits the cloning of states. If unsharp propositions are to be included in the theory, then the notion of an effect algebra is considered. We proved that an atomic Archimedean effect algebra admitting a cloning operation is a Boolean algebra. This paper also presents a partial result indicating a relation between cloning on effect algebras and hidden variables.
Relational similarity-based model of data part 1: foundations and query systems
NASA Astrophysics Data System (ADS)
Belohlavek, Radim; Vychodil, Vilem
2017-10-01
We present a general rank-aware model of data which supports handling of similarity in relational databases. The model is based on the assumption that in many cases it is desirable to replace equalities on values in data tables by similarity relations expressing degrees to which the values are similar. In this context, we study various phenomena which emerge in the model, including similarity-based queries and similarity-based data dependencies. The central notion in our model is that of a ranked data table over domains with similarities, which is our counterpart to the notion of a relation on a relation scheme from the classical relational model. Compared to other approaches which cover related problems, we do not propose a similarity-based or ranking module on top of the classical relational model. Instead, we generalize the very core of the model by replacing the classical, two-valued logic upon which the classical model is built by a more general logic involving a scale of truth degrees that, in addition to the classical truth degrees 0 and 1, contains intermediate truth degrees. While the classical truth degrees 0 and 1 represent nonequality and equality of values, and subsequently mismatch and match of queries, the intermediate truth degrees in the new model represent similarity of values and partial match of queries. Moreover, the truth functions of many-valued logical connectives in the new model serve to aggregate degrees of similarity. The presented approach is conceptually clean, logically sound, and retains most properties of the classical model while enabling us to employ new types of queries and data dependencies. Most importantly, similarity is not handled in an ad hoc way or by putting a "similarity module" atop the classical model in our approach. Rather, it is consistently viewed as a notion that generalizes and replaces equality in the very core of the relational model.
We present fundamentals of the formal model and two equivalent query systems which are analogues of the classical relational algebra and domain relational calculus with range declarations. In the sequel to this paper, we deal with similarity-based dependencies.
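A minimal sketch of a similarity-based query in this spirit (the table, the similarity function, and its scale are illustrative assumptions, not the authors' definitions): each row receives a truth degree in [0, 1] expressing how similar its value is to the queried one, generalizing the 0/1 match of the classical model.

```python
def age_similarity(x, y, scale=10.0):
    """Degree in [0, 1] to which two ages are similar; linear drop-off,
    reaching 0 once the ages differ by `scale` years (illustrative choice)."""
    return max(0.0, 1.0 - abs(x - y) / scale)

# A tiny "ranked data table": (name, age) rows; all values are illustrative.
rows = [("alice", 30), ("bob", 33), ("carol", 45)]

def similar_age_query(target_age):
    """Rank rows by the degree to which their age matches target_age."""
    ranked = [(age_similarity(age, target_age), name) for name, age in rows]
    return sorted(ranked, reverse=True)

result = similar_age_query(31)  # best match first; degree 0 means no match
```

A classical equality query would return nothing for age 31; the ranked version still orders the near-misses by how well they match.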
NASA Astrophysics Data System (ADS)
Rapoport, Diego L.
2011-01-01
In this transdisciplinary article which stems from philosophical considerations (that depart from phenomenology—after Merleau-Ponty, Heidegger and Rosen—and Hegelian dialectics), we develop a conception based on topological (the Moebius surface and the Klein bottle) and geometrical considerations (based on torsion and non-orientability of manifolds), and multivalued logics which we develop into a unified world conception that surmounts the Cartesian cut and Aristotelian logic. The role of torsion appears in a self-referential construction of space and time, which will be further related to the commutator of the True and False operators of matrix logic, still with a quantum superposed state related to a Moebius surface, and as the physical field at the basis of Spencer-Brown's primitive distinction in the protologic of the calculus of distinction. In this setting, paradox, self-reference, depth, time and space, higher-order non-dual logic, perception, spin and a time operator, the Klein bottle, hypernumbers due to Musès which include non-trivial square roots of ±1 and in particular non-trivial nilpotents, quantum field operators, the transformation of cognition to spin for two-state quantum systems, are found to be keenly interwoven in a world conception compatible with the philosophical approach taken for basis of this article. The Klein bottle is found not only to be the topological in-formation for self-reference and paradox whose logical counterpart in the calculus of indications are the paradoxical imaginary time waves, but also a classical-quantum transformer (Hadamard's gate in quantum computation) which is indispensable to be able to obtain a complete multivalued logical system, and still to generate the matrix extension of classical connective Boolean logic. 
We further find that the multivalued logic that stems from considering the paradoxical equation in the calculus of distinctions, and in particular, the imaginary solutions to this equation, generates the matrix logic which supersedes the classical logic of connectives and which has for particular subtheories fuzzy and quantum logics. Thus, from a primitive distinction in the vacuum plane and the axioms of the calculus of distinction, we can derive by incorporating paradox, the world conception succinctly described above.
[Theoretical and conceptual contribution to evaluative research in health surveillance context].
Arreaza, Antônio Luis Vicente; de Moraes, José Cássio
2010-08-01
Initially this article revisits some of the conceptual and operational elements of evaluative research, bringing together the fields of knowledge and action in public health practices. These concepts are framed within a broader conception of quality. The article then outlines a theoretical model design for the proposed implementation of health surveillance actions. An image-objective definition of the organization and integration of health policies and practices, based on hierarchical and local logic, is also presented. Finally, the developments and challenges surrounding theory in the health evaluation field become the focus of our reflection, in order to enable the production of knowledge and approaches for constructing logic models that reveal the complexity of interventionist objects as well as the transforming nature of social practices.
Siqueira-Batista, Rodrigo; Gomes, Andréia Patrícia; Albuquerque, Verônica Santos; Cavalcanti, Felipe de Oliveira Lopes; Cotta, Rosângela Minardi Mitre
2013-01-01
The transformations that have revolutionized the labor market in contemporary society make it necessary to think of new alternatives for training health care professionals, thereby establishing a new approach to the health problems of individuals and collectives. Based on these considerations, this paper sets out to discuss training in health--based on the concept of competence--with a focus on education for the Brazilian Unified Health System (SUS), using attempts to analyze and propose an alternative to the system entrenched in the logic of late capitalism as a theoretical benchmark. It is thus a reflection on the subject, correlating theory and praxis, in constant and relentless movement of construction, deconstruction and (re)construction of propositions.
New fundamental evidence of non-classical structure in the combination of natural concepts.
Aerts, D; Sozzo, S; Veloz, T
2016-01-13
We recently performed cognitive experiments on conjunctions and negations of two concepts with the aim of investigating the combination problem of concepts. Our experiments confirmed the deviations (conceptual vagueness, underextension, overextension etc.) from the rules of classical (fuzzy) logic and probability theory observed by several scholars in concept theory, while our data were successfully modelled in a quantum-theoretic framework developed by ourselves. In this paper, we isolate a new, very stable and systematic pattern of violation of classicality that occurs in concept combinations. In addition, the strength and regularity of this non-classical effect leads us to believe that it occurs at a more fundamental level than the deviations observed up to now. It is our opinion that we have identified a deep non-classical mechanism determining not only how concepts are combined but, rather, how they are formed. We show that this effect can be faithfully modelled in a two-sector Fock space structure, and that it can be exactly explained by assuming that human thought is the superposition of two processes, a 'logical reasoning', guided by 'logic', and a 'conceptual reasoning', guided by 'emergence', and that the latter generally prevails over the former. All these findings provide new fundamental support to our quantum-theoretic approach to human cognition. © 2015 The Author(s).
Quantum design rules for single molecule logic gates.
Renaud, N; Hliwa, M; Joachim, C
2011-08-28
Recent publications have demonstrated how to implement a NOR logic gate with a single molecule using its interaction with two surface atoms as logical inputs [W. Soe et al., ACS Nano, 2011, 5, 1436]. We demonstrate here how this NOR logic gate belongs to the general family of quantum logic gates where the Boolean truth table results from a full control of the quantum trajectory of the electron transfer process through the molecule by very local and classical inputs practiced on the molecule. A new molecule OR gate is proposed for the logical inputs to be also single metal atoms, one per logical input.
An interval logic for higher-level temporal reasoning
NASA Technical Reports Server (NTRS)
Schwartz, R. L.; Melliar-Smith, P. M.; Vogt, F. H.; Plaisted, D. A.
1983-01-01
Prior work explored temporal logics, based on classical modal logics, as a framework for specifying and reasoning about concurrent programs, distributed systems, and communications protocols, and reported on efforts using temporal reasoning primitives to express very high level abstract requirements that a program or system is to satisfy. Based on experience with those primitives, this report describes an Interval Logic that is more suitable for expressing such higher level temporal properties. The report provides a formal semantics for the Interval Logic, and several examples of its use. A description of decision procedures for the logic is also included.
A probability space for quantum models
NASA Astrophysics Data System (ADS)
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment made by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
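A minimal sketch of the maximum entropy assignment for a discrete set of outcomes under a single average-value constraint (the energy levels and target average are illustrative): the maximum entropy solution has the exponential form p_i proportional to exp(-lam * E_i), and the multiplier lam can be located numerically.

```python
import math

def maxent_distribution(energies, e_avg, lo=-50.0, hi=50.0):
    """Maximum-entropy probabilities p_i ~ exp(-lam * E_i) subject to
    sum_i p_i * E_i == e_avg; lam is found by bisection, using the fact
    that the constrained average decreases monotonically in lam."""
    def mean_energy(lam):
        weights = [math.exp(-lam * e) for e in energies]
        z = sum(weights)  # normalization (partition function)
        return sum(w * e for w, e in zip(weights, energies)) / z

    for _ in range(200):  # halve the bracket until lam is pinned down
        mid = (lo + hi) / 2.0
        if mean_energy(mid) > e_avg:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    weights = [math.exp(-lam * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

# Three equally spaced levels with the average exactly in the middle:
# no constraint is binding, so the result should be uniform.
probs = maxent_distribution([0.0, 1.0, 2.0], 1.0)
```

Lowering the target average below the midpoint makes the multiplier positive and tilts the distribution toward the low-energy outcomes, the Boltzmann-like behavior mentioned in the abstract.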
Use of Knowledge Base Systems (EMDS) in Strategic and Tactical Forest Planning
NASA Astrophysics Data System (ADS)
Jensen, M. E.; Reynolds, K.; Stockmann, K.
2008-12-01
The USDA Forest Service 2008 Planning Rule requires Forest plans to provide a strategic vision for maintaining the sustainability of ecological, economic, and social systems across USFS lands through the identification of desired conditions and objectives. In this paper we show how knowledge-based systems can be efficiently used to evaluate disparate natural resource information to assess desired conditions and related objectives in Forest planning. We use the Ecosystem Management Decision Support (EMDS) system (http://www.institute.redlands.edu/emds/), which facilitates development of both logic-based models for evaluating ecosystem sustainability (desired conditions) and decision models to identify priority areas for integrated landscape restoration (objectives). The study area for our analysis spans 1,057 subwatersheds within western Montana and northern Idaho. Results of our study suggest that knowledge-based systems such as EMDS are well suited to both strategic and tactical planning and that the following points merit consideration in future National Forest (and other land management) planning efforts: 1) Logic models provide a consistent, transparent, and reproducible method for evaluating broad propositions about ecosystem sustainability such as: are watershed integrity, ecosystem and species diversity, social opportunities, and economic integrity in good shape across a planning area? The ability to evaluate such propositions in a formal logic framework also allows users the opportunity to evaluate statistical changes in outcomes over time, which could be very useful for regional and national reporting purposes and for addressing litigation; 2) The use of logic and decision models in strategic and tactical Forest planning provides a repository for expert knowledge (corporate memory) that is critical to the evaluation and management of ecosystem sustainability over time. 
This is especially true for the USFS and other federal resource agencies, which are likely to experience rapid turnover in tenured resource specialist positions within the next five years due to retirements; 3) Use of logic model output in decision models is an efficient method for synthesizing the typically large amounts of information needed to support integrated landscape restoration. Moreover, use of logic and decision models to design customized scenarios for integrated landscape restoration, as we have demonstrated with EMDS, offers substantial improvements to traditional GIS-based procedures such as suitability analysis. To our knowledge, this study represents the first attempt to link evaluations of desired conditions for ecosystem sustainability in strategic planning to tactical planning regarding the location of subwatersheds that best meet the objectives of integrated landscape restoration. The basic knowledge-based approach implemented in EMDS, with its logic (NetWeaver) and decision (Criterion Decision Plus) engines, is well suited both to multi-scale strategic planning and to multi-resource tactical planning.
Compositional Verification with Abstraction, Learning, and SAT Solving
2015-05-01
The system supports propositional logic, linear arithmetic, and bit-vectors (currently via bit-blasting). The front-end is based on an existing tool called UFO [8], which converts C programs to the Horn-SMT format; the encoding uses only the theory of Linear Rational Arithmetic. All experiments were carried out on an Intel Core2 Quad.
Nonadditive entropy maximization is inconsistent with Bayesian updating
NASA Astrophysics Data System (ADS)
Pressé, Steve
2014-11-01
The maximum entropy method—used to infer probabilistic models from data—is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.
NASA Astrophysics Data System (ADS)
Šujaková, Monika; Golejová, Simona; Sakál, Peter
2017-09-01
In the contribution the authors deal with the design and use of a sustainable marketing communication strategy of an ideal industrial enterprise in the Slovak Republic. The concept of an ideal enterprise is designed to increase the enterprise's sustainable competitiveness through the formation of a corporate image. In the framework of the research, the practical application of the draft concept was realized through a semi-structured interview in the form of propositional logic.
Telerobotic control of a mobile coordinated robotic server. M.S. Thesis Annual Technical Report
NASA Technical Reports Server (NTRS)
Lee, Gordon
1993-01-01
The annual report on telerobotic control of a mobile coordinated robotic server is presented. The goal of this effort is to develop advanced control methods for flexible space manipulator systems. As such, an adaptive fuzzy logic controller was developed in which neither a model structure nor parameter constraints are required for compensation. The work builds upon previous work on fuzzy logic controllers. Fuzzy logic controllers have been growing in importance in the field of automatic feedback control. Hardware controllers using fuzzy logic have become available as an alternative to the traditional PID controllers. Software has also been introduced to aid in the development of fuzzy logic rule-bases. The advantages of using fuzzy logic controllers include the ability to merge the experience and intuition of expert operators into the rule-base, and the fact that a model of the system is not required to construct the controller. A drawback of the classical fuzzy logic controller, however, is the many parameters that need to be tuned off-line prior to application in the closed loop. In this report, an adaptive fuzzy logic controller is developed requiring no system model or model structure. The rule-base is defined to approximate a state-feedback controller while a second fuzzy logic algorithm varies, on-line, the parameters of the defining controller. Results indicate the approach is viable for on-line adaptive control of systems when the model is too complex or uncertain for application of other more classical control techniques.
Verifying the Modal Logic Cube Is an Easy Task (For Higher-Order Automated Reasoners)
NASA Astrophysics Data System (ADS)
Benzmüller, Christoph
Prominent logics, including quantified multimodal logics, can be elegantly embedded in simple type theory (classical higher-order logic). Furthermore, off-the-shelf reasoning systems for simple type theory exist that can be uniformly employed for reasoning within and about embedded logics. In this paper we focus on reasoning about modal logics and exploit our framework for the automated verification of inclusion and equivalence relations between them. Related work has applied first-order automated theorem provers for the task. Our solution achieves significant improvements, most notably, with respect to elegance and simplicity of the problem encodings as well as with respect to automation performance.
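The kind of modal reasoning being automated can be pictured with a tiny explicit Kripke-model evaluator (the worlds, accessibility relation, and valuation below are illustrative; the paper itself proceeds by embedding modal logics into higher-order logic rather than by direct model checking):

```python
# A tiny Kripke model: three worlds, an accessibility relation, and a
# valuation saying where atom "a" holds. All values are illustrative.
worlds = {0, 1, 2}
access = {0: {1, 2}, 1: {1}, 2: set()}
valuation = {"a": {1, 2}}

def holds(w, formula):
    """Evaluate a formula (nested tuples) at world w."""
    op = formula[0]
    if op == "atom":
        return w in valuation[formula[1]]
    if op == "not":
        return not holds(w, formula[1])
    if op == "and":
        return holds(w, formula[1]) and holds(w, formula[2])
    if op == "box":    # necessity: true in all accessible worlds
        return all(holds(v, formula[1]) for v in access[w])
    if op == "dia":    # possibility: true in some accessible world
        return any(holds(v, formula[1]) for v in access[w])
    raise ValueError(op)

box_a_at_0 = holds(0, ("box", ("atom", "a")))  # worlds 1 and 2 both satisfy a
```

Inclusion relations between modal logics correspond to conditions on the accessibility relation (reflexivity, transitivity, and so on), which is exactly what the embedding approach lets a higher-order prover reason about.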
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lellouche, P.
The Atlantic Alliance was disturbed when the Strategic Defense Initiative (SDI) was conceived and presented primarily as an American unilateral initiative, with no political or strategic consultation with the allies. It was also disturbed by the confused and contradictory objectives of SDI; i.e., its rejection of the logic of deterrence as a dangerous and unethical proposition at the same time that it reinforces the logic of mutual assured destruction (MAD). Some of the basic ambiguity is related to the technology, which remains untested. SDI's purpose is not to defend people or missiles, but to destroy weapons. The author argues that SDI threatens the survival of Europe's high tech industries and could lead to a massive transfer of NATO-related defense expenditures away from European defense and toward a defensive shield of dubious value for Europeans.
NASA Astrophysics Data System (ADS)
Derrouazin, A.; Aillerie, M.; Mekkakia-Maaza, N.; Charles, J. P.
2016-07-01
Several studies exist on the management of diverse hybrid energy systems, and many techniques have been proposed for robustness, savings and environmental purposes. In this work we aim to make a comparative study between two supervision and control techniques, fuzzy and classical logic, to manage a hybrid energy system applied to typical housing fed by solar and wind power, with a rack of batteries for storage. The system is assisted by the electric grid during energy drop moments. A hydrogen production device is integrated into the system to retrieve surplus energy production from the renewable sources for household purposes, intending the maximum exploitation of these sources over the years. The models have been achieved, and the generated command signals for the electronic switches of both proposed techniques are presented and discussed in this paper.
[Influence of music on a decision of mathematical logic tasks].
Pavlygina, R A; Karamysheva, N N; Sakharov, D S; Davydov, V I
2012-01-01
Accompanying the solution of mathematical logic tasks with music of different styles and intensities influenced the time of solution. Classical music at 35 and 65 dB and rock music at 65 and 85 dB decreased the solution time. More powerful classical music (85 dB) did not have this effect. Solving without musical accompaniment led to an increase in coherence values, especially in the beta1, beta2 and gamma frequency ranges of the EEG of the occipital cortex. The intrahemispheric and interhemispheric coherences of the frontal EEG increased, and EEG asymmetry (in the number of coherent connections in the left and right hemispheres) arose when task solving was accompanied by music. Classical music at 35 and 65 dB caused left-side asymmetry in the EEG. More powerful classical or rock music led to a prevalence of coherent connections in the right hemisphere.
Comparative study of multimodal biometric recognition by fusion of iris and fingerprint.
Benaliouche, Houda; Touahria, Mohamed
2014-01-01
This research investigates the comparative performance of three different approaches for multimodal recognition of combined iris and fingerprints: the classical sum rule, the weighted sum rule, and a fuzzy logic method. The scores from the different biometric traits of iris and fingerprint are fused at the matching score and the decision levels. The score combination approach is applied after normalization of both scores using the min-max rule. Our experimental results suggest that the fuzzy logic method for combining the matching scores at the decision level is the best, followed by the classical weighted sum rule and the classical sum rule, in that order. The performance evaluation of each method is reported in terms of matching time, error rates, and accuracy after exhaustive tests on the public CASIA-Iris databases V1 and V2 and the FVC 2004 fingerprint database. Experimental results prior to fusion and after fusion are presented, followed by their comparison with related works in the current literature. Fusion by fuzzy logic decision mimics human reasoning in a soft and simple way and gives enhanced results.
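The two classical fusion baselines the study compares against can be sketched as follows (the weights are illustrative placeholders, not the values tuned in the paper):

```python
def min_max_normalize(scores):
    """Map a list of raw matcher scores onto [0, 1] via the min-max rule."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def sum_rule(iris_score, finger_score):
    """Classical sum rule on normalized scores."""
    return iris_score + finger_score

def weighted_sum_rule(iris_score, finger_score, w_iris=0.6, w_finger=0.4):
    """Weighted sum rule; the weights here are illustrative, not tuned."""
    return w_iris * iris_score + w_finger * finger_score

normalized = min_max_normalize([10.0, 20.0, 30.0])
fused = weighted_sum_rule(1.0, 0.5)
```

Min-max normalization is what makes scores from heterogeneous matchers (iris vs. fingerprint) commensurable before any rule combines them.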
Brain Stretchers Book 4--Advanced.
ERIC Educational Resources Information Center
Anderson, Carolyn
This book provides puzzles, games, and mathematical activities for students in elementary grades. Number concepts and arithmetic are common topics. These classic math, logic, and word-problem activities encourage students to become flexible, creative thinkers while teaching them to draw valid conclusions based on logic and evidence. Each activity…
The empirical study of norms is just what we are missing
Achourioti, Theodora; Fugard, Andrew J. B.; Stenning, Keith
2014-01-01
This paper argues that the goals people have when reasoning determine their own norms of reasoning. A radical descriptivism which avoids norms never worked for any science; nor can it work for the psychology of reasoning. Norms as we understand them are illustrated with examples from categorical syllogistic reasoning and the “new paradigm” of subjective probabilities. We argue that many formal systems are required for psychology: classical logic, non-monotonic logics, probability logics, relevance logic, and others. One of the hardest challenges is working out what goals reasoners have and choosing and tailoring the appropriate logics to model the norms those goals imply.
Learning fuzzy logic control system
NASA Technical Reports Server (NTRS)
Lung, Leung Kam
1994-01-01
The performance of the Learning Fuzzy Logic Control System (LFLCS), developed in this thesis, has been evaluated. The Learning Fuzzy Logic Controller (LFLC) learns to control the motor by learning the set of teaching values that are generated by a classical PI controller. It is assumed that the classical PI controller is tuned to minimize the error of a position control system of the D.C. motor. The Learning Fuzzy Logic Controller developed in this thesis is a multi-input single-output network. Training of the Learning Fuzzy Logic Controller is implemented off-line. Upon completion of the training process (using Supervised Learning and Unsupervised Learning), the LFLC replaces the classical PI controller. In this thesis, a closed-loop position control system of a D.C. motor using the LFLC is implemented. The primary focus is on the learning capabilities of the Learning Fuzzy Logic Controller. The learning includes symbolic representation of the Input Linguistic Nodes set and the Output Linguistic Nodes set. In addition, we investigate the knowledge-based representation for the network. As part of the design process, we implement a digital computer simulation of the LFLCS. The computer simulation program is written in the C language and runs on the DOS platform. The LFLCS designed in this thesis has been developed on an IBM-compatible 486-DX2 66 computer. First, the performance of the Learning Fuzzy Logic Controller is evaluated by comparing the angular shaft position of the D.C. motor controlled by a conventional PI controller with that controlled by the LFLC. Second, the symbolic representation of the LFLC and the knowledge-based representation for the network are investigated by observing the parameters of the Fuzzy Logic membership functions and the links at each layer of the LFLC.
While there are some limitations to this approach, the simulation results show that the LFLC is able to control the angular shaft position of the D.C. motor. Furthermore, the LFLC has better performance in rise time, settling time and steady-state error than the conventional PI controller.
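A single-input fuzzy controller of the general kind discussed above can be sketched with triangular membership functions and centroid defuzzification (the breakpoints and rule outputs below are illustrative, not the thesis's learned values):

```python
def tri(x, a, b, c):
    """Triangular membership: rises from a, peaks at b, falls to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error):
    """One fuzzy rule per linguistic label; the output is the centroid of
    the rule outputs weighted by membership degree. Values are illustrative."""
    rules = [  # (membership degree of the error, rule output)
        (tri(error, -2.0, -1.0, 0.0), 1.0),   # error negative -> push up
        (tri(error, -1.0,  0.0, 1.0), 0.0),   # error near zero -> hold
        (tri(error,  0.0,  1.0, 2.0), -1.0),  # error positive -> push down
    ]
    num = sum(mu * u for mu, u in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0
```

A learning scheme like the LFLC's would adjust the membership breakpoints and rule outputs from the PI controller's teaching values instead of fixing them by hand as done here.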
DOE Office of Scientific and Technical Information (OSTI.GOV)
Derrouazin, A., E-mail: derrsid@gmail.com; Université de Lorraine, LMOPS, EA 4423, 57070 Metz; CentraleSupélec, LMOPS, 57070 Metz
Several studies exist on the management of diverse hybrid energy systems, and many techniques have been proposed for robustness, savings and environmental purposes. In this work we aim to make a comparative study between two supervision and control techniques, fuzzy and classical logic, to manage a hybrid energy system applied to typical housing fed by solar and wind power, with a rack of batteries for storage. The system is assisted by the electric grid during energy drop moments. A hydrogen production device is integrated into the system to retrieve surplus energy production from the renewable sources for household purposes, intending the maximum exploitation of these sources over the years. The models have been achieved, and the generated command signals for the electronic switches of both proposed techniques are presented and discussed in this paper.
Slater, Michael D
2006-01-01
While increasingly widespread use of behavior change theory is an advance for communication campaigns and their evaluation, such theories provide a necessary but not sufficient condition for theory-based communication interventions. Such interventions and their evaluations need to incorporate theoretical thinking about plausible mechanisms of message effect on health-related attitudes and behavior. Otherwise, strategic errors in message design and dissemination, and misspecified campaign logic models, insensitive to campaign effects, are likely to result. Implications of the elaboration likelihood model, attitude accessibility, attitude to the ad theory, exemplification, and framing are explored, and implications for campaign strategy and evaluation designs are briefly discussed. Initial propositions are advanced regarding a theory of campaign affect generalization derived from attitude to ad theory, and regarding a theory of reframing targeted health behaviors in those difficult contexts in which intended audiences are resistant to the advocated behavior or message.
So what exactly is nursing knowledge?
Clarke, L
2011-06-01
This paper aims to present a discussion of intrinsic nursing knowledge. The paper stems from the author's study of knowledge claims enshrined in nursing journal articles, books and conference speeches. It is argued that claims by academic nurses have largely depended on principles drawn from continental, not Analytic (British-American), philosophy. Thus, claims are credible only insofar as they defer propositional logic. This is problematic inasmuch as nursing is a practice-based activity usually carried out in medical settings. Transpersonal nursing models are particularly open to criticism for their unworldly character, as are concepts based on shallow usages of physics or mathematics. I argue that sensible measurements of the 'real world' are possible--without endorsing positivism--and that nursing requires little recourse to logically unsustainable claims. The paper concludes with an analysis of a recent review of nursing knowledge, an analysis that indicates the circularity attending many discussions of the topic.
An inference engine for embedded diagnostic systems
NASA Technical Reports Server (NTRS)
Fox, Barry R.; Brewster, Larry T.
1987-01-01
The implementation of an inference engine for embedded diagnostic systems is described. The system consists of two distinct parts. The first is an off-line compiler which accepts a propositional logical statement of the relationship between facts and conclusions and produces the data structures required by the on-line inference engine. The second part consists of the inference engine and interface routines which accept assertions of fact and return the conclusions which necessarily follow. Given a set of assertions, it will generate exactly the conclusions which logically follow. At the same time, it will detect any inconsistencies which may propagate from an inconsistent set of assertions or a poorly formulated set of rules. The memory requirements are fixed and the worst-case execution times are bounded at compile time. The data structures and inference algorithms are very simple and well understood, and both are described in detail. The system has been implemented in Lisp, Pascal, and Modula-2.
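The abstract does not give the engine's actual algorithm or data structures; as a hedged illustration of the general idea it describes (asserted facts propagating through propositional rules until exactly the entailed conclusions are produced, with contradiction detection), a minimal forward-chaining sketch in Python might look like the following. The rule set and atom names are hypothetical.

```python
# Minimal forward-chaining engine for propositional Horn-style rules.
# Each rule maps a frozenset of premise atoms to one conclusion atom;
# an atom of the form "~p" is read as the negation of "p".

def infer(rules, facts):
    """Return every conclusion that follows from the asserted facts,
    flagging an inconsistency if both p and ~p end up derived."""
    derived = set(facts)
    changed = True
    while changed:                      # iterate to a fixed point
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    inconsistent = any(("~" + p) in derived for p in derived)
    return derived, inconsistent

# Hypothetical diagnostic rules and assertions:
rules = [
    (frozenset({"valve_stuck"}), "low_flow"),
    (frozenset({"low_flow", "pump_on"}), "overheat"),
]
conclusions, bad = infer(rules, {"valve_stuck", "pump_on"})
```

Because the rule base is fixed, the fixed-point loop runs at most once per rule firing, which is consistent with the abstract's point that worst-case execution time can be bounded ahead of time.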
ERIC Educational Resources Information Center
McArdle, John J.; Grimm, Kevin J.; Hamagami, Fumiaki; Bowles, Ryan P.; Meredith, William
2009-01-01
The authors use multiple-sample longitudinal data from different test batteries to examine propositions about changes in constructs over the life span. The data come from 3 classic studies on intellectual abilities in which, in combination, 441 persons were repeatedly measured as many as 16 times over 70 years. They measured cognitive constructs…
Jung, Nadine; Wranke, Christina; Hamburger, Kai; Knauff, Markus
2014-01-01
Recent experimental studies show that emotions can have a significant effect on the way we think, decide, and solve problems. This paper presents a series of four experiments on how emotions affect logical reasoning. In two experiments, different groups of participants first had to pass a manipulated intelligence test. Their emotional state was altered by giving them feedback that they had performed excellently, poorly, or about average. Then they completed a set of logical inference problems (with "if p, then q" statements), either in a Wason selection task paradigm or problems from the logical propositional calculus. Problem content also had either a positive, negative or neutral emotional value. Results showed a clear effect of emotions on reasoning performance. Participants in negative mood performed worse than participants in positive mood, but both groups were outperformed by the neutral-mood reasoners. Problem content also had an effect on reasoning performance. In a second set of experiments, participants with exam or spider phobia solved logical problems with contents related to their anxiety disorder (spiders or exams). Spider-phobic participants' performance was lowered by the spider content, while exam-anxious participants were not affected by the exam-related problem content. Overall, unlike some previous studies, no evidence was found that performance is improved when emotion and content are congruent. These results have consequences for cognitive reasoning research and also for cognitively oriented psychotherapy and the treatment of disorders like depression and anxiety.
Confusion in the Classroom: Does "Logos" Mean Logic?
ERIC Educational Resources Information Center
Little, Joseph
1999-01-01
Traces the term "logos" from the ninth to the fourth century B.C. to distinguish between its general meaning and its technical definition within classical rhetoric. Shows that Aristotle's "pistis" of "logos" refers, not to an appeal to logic, but to the argument or speech itself, which reinstates all three proofs of…
Consciousness and Quantum Physics: Empirical Research on the Subjective Reduction of the Statevector
NASA Astrophysics Data System (ADS)
Bierman, Dick J.; Whitmarsh, Stephen
There are two major theoretical perspectives on the relation between quantum physics and consciousness. The first is the proposal by Hameroff and Penrose [16] that consciousness arises from the collapse of the statevector describing nonconscious brain states. The second is the proposition that consciousness acts as the ultimate measurement device, i.e., a measurement is defined as the collapse of the statevector describing the external physical system, due to interaction with a conscious observer. The latter (dualistic) proposition has resulted in the thought experiment with Schrödinger's cat and is generally considered extremely unlikely. However, that proposition is, under certain assumptions, open to empirical verification. This was originally done by Hall et al. [15]. A refined experiment to test the "subjective reduction" interpretation of the measurement problem in quantum physics was reported by Bierman [3]. In the latter experiment, auditory evoked potentials (AEPs) of subjects observing (previously unobserved) radioactive decay were recorded. These were compared with AEPs from events that were already observed and thus supposedly already collapsed into a singular state. Significant differences in the brain signals of the observer were found. In this chapter we report a further replication that improves upon the previous experiments by adding a nonquantum event as control. Differential effects of preobservation were expected not to appear in this classical condition, since the quantum character of the event is presumed crucial. No differential effects were found in either condition, however. Marginal differences were found between the quantum and classical conditions. Possible explanations for the inability to replicate the previous findings are given, as well as suggestions for further research.
High-order noise filtering in nontrivial quantum logic gates.
Green, Todd; Uys, Hermann; Biercuk, Michael J
2012-07-13
Treating the effects of a time-dependent classical dephasing environment during quantum logic operations poses a theoretical challenge, as the application of noncommuting control operations gives rise to both dephasing and depolarization errors that must be accounted for in order to understand total average error rates. We develop a treatment based on effective Hamiltonian theory that allows us to efficiently model the effect of classical noise on nontrivial single-bit quantum logic operations composed of arbitrary control sequences. We present a general method to calculate the ensemble-averaged entanglement fidelity to arbitrary order in terms of noise filter functions, and provide explicit expressions to fourth order in the noise strength. In the weak noise limit we derive explicit filter functions for a broad class of piecewise-constant control sequences, and use them to study the performance of dynamically corrected gates, yielding good agreement with brute-force numerics.
NASA Astrophysics Data System (ADS)
Martín–Moruno, Prado; Visser, Matt
2017-11-01
The (generalized) Rainich conditions are algebraic conditions which are polynomial in the (mixed-component) stress-energy tensor. As such they are logically distinct from the usual classical energy conditions (NEC, WEC, SEC, DEC), and logically distinct from the usual Hawking-Ellis (Segré-Plebański) classification of stress-energy tensors (type I, type II, type III, type IV). There will of course be significant inter-connections between these classification schemes, which we explore in the current article. Overall, we shall argue that it is best to view the (generalized) Rainich conditions as a refinement of the classical energy conditions and the usual Hawking-Ellis classification.
Integrating machine learning and physician knowledge to improve the accuracy of breast biopsy.
Dutra, I; Nassif, H; Page, D; Shavlik, J; Strigel, R M; Wu, Y; Elezaby, M E; Burnside, E
2011-01-01
In this work we show that combining physician rules and machine-learned rules may improve the performance of a classifier that predicts whether a breast cancer is missed on percutaneous, image-guided breast core needle biopsy (subsequently referred to as "breast core biopsy"). Specifically, we show how advice in the form of logical rules, derived by sub-specialty (i.e., fellowship-trained) breast radiologists (subsequently referred to as "our physicians"), can guide the search in an inductive logic programming system and improve the performance of a learned classifier. Our dataset of 890 consecutive benign breast core biopsy results, along with corresponding mammographic findings, contains 94 cases that were deemed non-definitive by a multidisciplinary panel of physicians, of which 15 were upgraded to malignant disease at surgery. Our goal is to predict upgrade prospectively and avoid surgery in women who do not have breast cancer. Our results, some of which trended toward significance, show evidence that inductive logic programming may produce better results for this task than traditional propositional algorithms with default parameters. Moreover, we show that adding knowledge from our physicians into the learning process may improve the performance of a classifier trained only on data.
The Construction of Impossibility: A Logic-Based Analysis of Conjuring Tricks
Smith, Wally; Dignum, Frank; Sonenberg, Liz
2016-01-01
Psychologists and cognitive scientists have long drawn insights and evidence from stage magic about human perceptual and attentional errors. We present a complementary analysis of conjuring tricks that seeks to understand the experience of impossibility that they produce. Our account is first motivated by insights about the constructional aspects of conjuring drawn from magicians' instructional texts. A view is then presented of the logical nature of impossibility as an unresolvable contradiction between a perception-supported belief about a situation and a memory-supported expectation. We argue that this condition of impossibility is constructed not simply through misperceptions and misattentions, but rather it is an outcome of a trick's whole structure of events. This structure is conceptualized as two parallel event sequences: an effect sequence that the spectator is intended to believe; and a method sequence that the magician understands as happening. We illustrate the value of this approach through an analysis of a simple close-up trick, Martin Gardner's Turnabout. A formalism called propositional dynamic logic is used to describe some of its logical aspects. This elucidates the nature and importance of the relationship between a trick's effect sequence and its method sequence, characterized by the careful arrangement of four evidence relationships: similarity, perceptual equivalence, structural equivalence, and congruence. The analysis further identifies two characteristics of magical apparatus that enable the construction of apparent impossibility: substitutable elements and stable occlusion. PMID:27378959
Some Practical Approaches to a Course on Paraconsistent Logic for Engineers
ERIC Educational Resources Information Center
Lambert-Torres, Germano; de Moraes, Carlos Henrique Valerio; Coutinho, Maurilio Pereira; Martins, Helga Gonzaga; Borges da Silva, Luiz Eduardo
2017-01-01
This paper describes a non-classical logic course primarily indicated for graduate students in electrical engineering and energy engineering. The content of this course is based on the vision that it is not enough for a student to indefinitely accumulate knowledge; it is necessary to explore all the occasions to update, deepen, and enrich that…
2015-05-21
Keywords: complex adaptive systems, emergence, lateral thinking, bias, communication, buy-in, discourses, ecologies
A discussion on the origin of quantum probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holik, Federico, E-mail: olentiev2@gmail.com; Departamento de Matemática - Ciclo Básico Común, Universidad de Buenos Aires - Pabellón III, Ciudad Universitaria, Buenos Aires; Sáenz, Manuel
We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case).
Highlights:
• Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities.
• We apply Cox's method to the lattice of subspaces of the Hilbert space.
• We obtain a derivation of quantum probabilities which includes mixed states.
• The method presented in this work is susceptible to generalization.
• It includes quantum mechanics and classical mechanics as particular cases.
SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph
2015-01-01
This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to add redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
A Survey of Logic Formalisms to Support Mishap Analysis
NASA Technical Reports Server (NTRS)
Johnson, Chris; Holloway, C. M.
2003-01-01
Mishap investigations provide important information about adverse events and near-miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety-critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Such proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems; such mechanisms cannot accurately capture the wealth of inductive, deductive, and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might be used to support mishap analysis.
Temporal-logic analysis of microglial phenotypic conversion with exposure to amyloid-β.
Anastasio, Thomas J
2015-02-01
Alzheimer Disease (AD) remains a leading killer with no adequate treatment. Ongoing research increasingly implicates the brain's immune system as a critical contributor to AD pathogenesis, but the complexity of the immune contribution poses a barrier to understanding. Here I use temporal logic to analyze a computational specification of the immune component of AD. Temporal logic is an extension of logic to propositions expressed in terms of time. It has traditionally been used to analyze computational specifications of complex engineered systems, but applications to complex biological systems are now appearing. The inflammatory component of AD involves the responses of microglia to the peptide amyloid-β (Aβ), which is an inflammatory stimulus and a likely causative AD agent. Temporal-logic analysis of the model provides explanations for the puzzling findings that Aβ induces an anti-inflammatory as well as a pro-inflammatory response, and that Aβ is phagocytized by microglia in young but not in old animals. To potentially explain the first puzzle, the model suggests that interferon-γ acts as an "autocrine bridge" over which an Aβ-induced increase in pro-inflammatory cytokines leads to an increase in anti-inflammatory mediators also. To potentially explain the second puzzle, the model identifies a potential instability in signaling via insulin-like growth factor 1 that could explain the failure of old microglia to phagocytize Aβ. The model predicts that augmentation of insulin-like growth factor 1 signaling, and activation of protein kinase C in particular, could move old microglia from a neurotoxic back toward a more neuroprotective and phagocytic phenotype.
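The paper's actual specification is far richer than any toy, but the flavor of checking a temporal-logic property over a finite execution trace can be sketched as follows. The state variables, the trace, and the property are hypothetical stand-ins for the model's cytokine dynamics; the checked formula is a bounded version of the "autocrine bridge" claim, G(pro-rise → F anti-rise).

```python
# Toy bounded temporal-logic check over a finite trace of model states.
# Each state is a dict of signal levels; propositions are predicates.

def eventually(trace, prop):
    """F prop: prop holds in some state of the (suffix) trace."""
    return any(prop(s) for s in trace)

def leads_to(trace, p, q):
    """G(p -> F q): every state satisfying p is followed
    (at that state or later) by a state satisfying q."""
    return all(eventually(trace[i:], q)
               for i, s in enumerate(trace) if p(s))

# Hypothetical trace: an Abeta stimulus raises the pro-inflammatory
# signal, and an interferon-gamma "bridge" later raises the
# anti-inflammatory signal as well.
trace = [
    {"pro": 0, "anti": 0},
    {"pro": 1, "anti": 0},
    {"pro": 1, "anti": 1},
]
ok = leads_to(trace,
              lambda s: s["pro"] > 0,    # p: pro-inflammatory rise
              lambda s: s["anti"] > 0)   # q: anti-inflammatory rise
```

Real temporal-logic analyses of such models use model checkers over all reachable behaviors rather than a single trace; the sketch only shows what the property itself asserts.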
Précis of bayesian rationality: The probabilistic approach to human reasoning.
Oaksford, Mike; Chater, Nick
2009-02-01
According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.
Mechanics as the Logical Point of Entry for the Enculturation into Scientific Thinking
ERIC Educational Resources Information Center
Carson, Robert; Rowlands, Stuart
2005-01-01
Force in modern classical mechanics is unique, both in terms of its logical character and the conceptual difficulties it causes. Force is well defined by a set of axioms that not only structures mechanics but science in general. Force is also the dominant theme in the "misconceptions" literature and many philosophers and physicists alike have…
Missing Data: Discovering the Private Logic of Adult-Wary Youth
ERIC Educational Resources Information Center
Seita, John
2010-01-01
In his classic book, "The Problem Child," Alfred Adler (1930) noted that if educators do not understand the "private logic" and goals of a young person, their interventions may do more harm than good. But it is not a natural process to empathize with persons who fight their well-intended efforts to help. Adults and young people are often pitted as…
Classical Pragmatism on Mind and Rationality
ERIC Educational Resources Information Center
Maattanen, Pentti
2005-01-01
One of the major changes in twentieth century philosophy was the so-called linguistic turn, in which natural and formal languages became central subjects of study. This meant that theories of meaning became mostly about linguistic meaning, thinking was now analyzed in terms of symbol manipulation, and rules of classical logic formed the nucleus of…
[Fuzzy logic in urology. How to reason in inaccurate terms].
Vírseda Chamorro, Miguel; Salinas Casado, Jesus; Vázquez Alba, David
2004-05-01
Occidental thinking is basically binary, based on opposites, and classical logic constitutes a systematization of this thinking. The methods of the pure sciences such as physics are based on systematic measurement, analysis and synthesis; in this way, nature is described by deterministic differential equations. Medical knowledge does not adjust well to the deterministic equations of physics, so probabilistic methods are employed instead. However, this approach is not free of problems, both theoretical and practical, and it is often not even possible to know with certainty the probabilities of most events. On the other hand, the application of binary logic to medicine in general, and to urology in particular, meets serious difficulties, such as the imprecise character of the definition of most diseases and the uncertainty associated with most medical acts. These are responsible for the fact that many medical recommendations are made in literary language that is inaccurate, inconsistent and incoherent. Fuzzy logic is a way of reasoning coherently with inaccurate concepts. It was proposed by Lotfi Zadeh in 1965 and is based on two principles: the theory of fuzzy sets and the use of fuzzy rules. A fuzzy set is one whose elements have a degree of membership between 0 and 1. Each fuzzy set is associated with an imprecise property or linguistic variable. Fuzzy rules apply the principles of classical logic adapted to fuzzy sets, taking the degree of membership of each element in the reference fuzzy set as the truth value. Fuzzy logic makes it possible to formulate coherent urological recommendations (e.g., in which patients is PSA testing indicated? what should be done in the face of an elevated PSA?) and to perform diagnoses adapted to the uncertainty of diagnostic tests (e.g., data obtained from pressure-flow studies in females).
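The two principles the abstract names, fuzzy sets and fuzzy rules, can be sketched concretely. The following is an illustrative example only, with made-up membership cut-offs that are not clinical guidance: a rule "IF PSA is elevated AND age is advanced THEN biopsy is indicated", using piecewise-linear memberships and min as the fuzzy AND.

```python
# Fuzzy set membership and one fuzzy rule (illustrative thresholds,
# not clinical guidance).

def ramp(x, lo, hi):
    """Piecewise-linear membership: 0 below lo, 1 above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def biopsy_indication(psa, age):
    """Degree of truth of the rule's conclusion in [0, 1]."""
    elevated_psa = ramp(psa, 4.0, 10.0)    # ng/mL, assumed cut-offs
    advanced_age = ramp(age, 50.0, 70.0)   # years, assumed cut-offs
    return min(elevated_psa, advanced_age)  # min models fuzzy AND

degree = biopsy_indication(psa=7.0, age=65.0)
```

The point of the construction is exactly the one the article makes: instead of a brittle yes/no threshold, the recommendation comes with a graded degree of indication that reflects the imprecision of the underlying concepts.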
Creating Clinical Fuzzy Automata with Fuzzy Arden Syntax.
de Bruin, Jeroen S; Steltzer, Heinz; Rappelsberger, Andrea; Adlassnig, Klaus-Peter
2017-01-01
Formal constructs for fuzzy sets and fuzzy logic are incorporated into Arden Syntax version 2.9 (Fuzzy Arden Syntax). With fuzzy sets, the relationships between measured or observed data and linguistic terms are expressed as degrees of compatibility that model the unsharpness of the boundaries of linguistic terms. Propositional uncertainty due to incomplete knowledge of relationships between clinical linguistic concepts is modeled with fuzzy logic. Fuzzy Arden Syntax also supports the construction of fuzzy state monitors. The latter are defined as monitors that employ fuzzy automata to observe gradual transitions between different stages of disease. As a use case, we re-implemented FuzzyARDS, a previously published clinical monitoring system for patients suffering from acute respiratory distress syndrome (ARDS). Using the re-implementation as an example, we show how key concepts of fuzzy automata, i.e., fuzzy states and parallel fuzzy state transitions, can be implemented in Fuzzy Arden Syntax. The results showed that fuzzy state monitors can be implemented in a straightforward manner.
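The monitors described above are written in Fuzzy Arden Syntax; as a language-agnostic sketch of the underlying idea (fuzzy states holding degrees in [0, 1], with parallel, gradual transitions rather than crisp switches), one might write the following. Stage names, transition strengths, and input degrees are all chosen for illustration, not taken from FuzzyARDS.

```python
# Minimal fuzzy state monitor: each stage holds an activation degree,
# and one step propagates activation along every transition in
# parallel using min for conjunction and max for aggregation.

def step(state, transitions, input_degree):
    """state: {stage: degree}; transitions: {(src, dst): strength};
    input_degree: how strongly the current observation supports
    progression. Returns the updated state."""
    new_state = dict(state)
    for (src, dst), strength in transitions.items():
        flow = min(state[src], strength, input_degree)
        new_state[dst] = max(new_state[dst], flow)
    return new_state

state = {"stable": 1.0, "at_risk": 0.0, "ards": 0.0}
transitions = {("stable", "at_risk"): 0.8, ("at_risk", "ards"): 0.6}
state = step(state, transitions, input_degree=0.7)  # worsening signal
state = step(state, transitions, input_degree=0.7)  # signal persists
```

After two steps the patient is simultaneously somewhat "at risk" and somewhat in "ards", which is precisely the gradual stage transition that a crisp automaton cannot express.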
Learning and Reasoning in Unknown Domains
NASA Astrophysics Data System (ADS)
Strannegård, Claes; Nizamani, Abdul Rahim; Juel, Jonas; Persson, Ulf
2016-12-01
In the story Alice in Wonderland, Alice fell down a rabbit hole and suddenly found herself in a strange world called Wonderland. Alice gradually developed knowledge about Wonderland by observing, learning, and reasoning. In this paper we present the system Alice In Wonderland that operates analogously. As a theoretical basis of the system, we define several basic concepts of logic in a generalized setting, including the notions of domain, proof, consistency, soundness, completeness, decidability, and compositionality. We also prove some basic theorems about those generalized notions. Then we model Wonderland as an arbitrary symbolic domain and Alice as a cognitive architecture that learns autonomously by observing random streams of facts from Wonderland. Alice is able to reason by means of computations that use bounded cognitive resources. Moreover, Alice develops her belief set by continuously forming, testing, and revising hypotheses. The system can learn a wide class of symbolic domains and challenge average human problem solvers in such domains as propositional logic and elementary arithmetic.
Community science, philosophy of science, and the practice of research.
Tebes, Jacob Kraemer
2005-06-01
Embedded in community science are implicit theories on the nature of reality (ontology), the justification of knowledge claims (epistemology), and how knowledge is constructed (methodology). These implicit theories influence the conceptualization and practice of research, and open up or constrain its possibilities. The purpose of this paper is to make some of these theories explicit, trace their intellectual history, and propose a shift in the way research in the social and behavioral sciences, and community science in particular, is conceptualized and practiced. After describing the influence and decline of logical empiricism, the underlying philosophical framework for science for the past century, I summarize contemporary views in the philosophy of science that are alternatives to logical empiricism, including contextualism, normative naturalism, and scientific realism, and propose that a modified version of contextualism, known as perspectivism, affords the philosophical framework for an emerging community science. I then discuss the implications of perspectivism for community science in the form of four propositions to guide the practice of research.
NASA Astrophysics Data System (ADS)
Basiladze, S. G.
2017-05-01
The paper describes a general physical theory of signals, the carriers of information, which supplements Shannon's abstract classical theory and is applicable in much broader fields, including nuclear physics. It is shown that in the absence of classical noise its place should be taken by the physical threshold of signal perception, for objects of both the macrocosm and the microcosm. The signal perception threshold allows the presence of subthreshold (virtual) signal states. For these states, the Boolean algebra of logic (A = 0/1) is transformed into an "algebraic logic" of probabilities (0 ≤ a ≤ 1). The similarity and difference of virtual states of macro- and microsignals are elucidated. "Real" and "quantum" information for computers is considered briefly. The maximum information transmission rate is estimated based on physical constants.
Sainz de Murieta, Iñaki; Rodríguez-Patón, Alfonso
2012-08-01
Despite the many designs of devices operating by DNA strand displacement, surprisingly none is explicitly devoted to the implementation of logical deductions. The present article introduces a new model of biosensor device that uses nucleic acid strands to encode simple rules such as "IF DNA_strand(1) is present THEN disease(A)" or "IF DNA_strand(1) AND DNA_strand(2) are present THEN disease(B)". Taking advantage of the strand displacement operation, our model makes these simple rules interact with input signals (either DNA or any type of RNA) to generate an output signal (in the form of nucleotide strands). This output signal represents a diagnosis, which can be measured using FRET techniques, cascaded as the input of another logical deduction with different rules, or even be a drug that is administered in response to a set of symptoms. The encoding introduces an implicit error-cancellation mechanism, which increases the system's scalability, enabling longer inference cascades with a bounded and controllable signal-to-noise relation. It also allows the same rule to be used in forward or backward inference, providing the option of validly outputting negated propositions (e.g., "diagnosis A excluded"). The models presented in this paper can be used to implement smart logical DNA devices that perform genetic diagnosis in vitro.
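Abstracting away the strand-displacement biochemistry, the logical core of this rule-based diagnosis, forward inference over the set of strands present in a sample, can be sketched as follows. The strand and disease names are hypothetical, mirroring the IF-THEN examples quoted in the abstract.

```python
# Logical skeleton of the diagnosis rules: a rule fires when all of
# its input strands are present, emitting its output signal.

RULES = [
    ({"DNA_strand_1"}, "disease_A"),                  # IF 1 THEN A
    ({"DNA_strand_1", "DNA_strand_2"}, "disease_B"),  # IF 1 AND 2 THEN B
]

def diagnose(present_strands):
    """Forward inference: output signals of all satisfied rules."""
    return {output for inputs, output in RULES
            if inputs <= present_strands}

outputs = diagnose({"DNA_strand_1", "DNA_strand_2"})
```

In the actual device each fired rule releases an output strand, so outputs can cascade as inputs to further rules; the set returned here plays the role of those released strands.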
NASA Astrophysics Data System (ADS)
Wang, Tianyi; Gong, Feng; Lu, Anjiang; Zhang, Damin; Zhang, Zhengping
2017-12-01
In this paper, we propose a scheme that integrates quantum key distribution and private classical communication via continuous variables. The integrated scheme employs both quadratures of a weak coherent state, with encrypted bits encoded on the signs and Gaussian random numbers encoded on the values of the quadratures. The integration enables quantum and classical data to share the same physical and logical channel. Simulation results based on practical system parameters demonstrate that both classical communication and quantum communication can be implemented over distances of tens of kilometers, thus providing a potential solution for simultaneous transmission of quantum communication and classical communication.
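The sign/value encoding can be illustrated with a simplified classical simulation; this ignores shot noise, channel loss, and the actual continuous-variable measurement, so it is only a sketch of the encoding idea, not the protocol itself:

```python
import random

def encode_quadrature(bit, sigma=1.0):
    """Encode one encrypted classical bit on the sign of a quadrature
    whose magnitude is a Gaussian random number (used for the QKD part)."""
    g = abs(random.gauss(0.0, sigma))
    return g if bit == 1 else -g

def decode_bit(q):
    """The classical bit is recovered from the sign alone."""
    return 1 if q >= 0 else 0

bits = [1, 0, 1, 1, 0]
quadratures = [encode_quadrature(b) for b in bits]
# The Gaussian magnitudes remain available as key material, while the
# signs carry the private classical data:
assert [decode_bit(q) for q in quadratures] == bits
```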
The Quantum Logical Challenge: Peter Mittelstaedt's Contributions to Logic and Philosophy of Science
NASA Astrophysics Data System (ADS)
Beltrametti, E.; Dalla Chiara, M. L.; Giuntini, R.
2017-12-01
Peter Mittelstaedt's contributions to quantum logic and to the foundational problems of quantum theory have significantly realized the most authentic spirit of the International Quantum Structures Association: original research on hard technical problems, which are often "entangled" with the emergence of important changes in our general world-conceptions. During a time when both the logical and the physical community often showed a skeptical attitude towards Birkhoff and von Neumann's quantum logic, Mittelstaedt brought to light the deeply innovating features of a quantum logical thinking that allows us to overcome some strong and unrealistic assumptions of classical logical arguments. Later on, his intense research on the unsharp approach to quantum theory and to the measurement problem stimulated increasing interest in unsharp forms of quantum logic, creating a fruitful interaction between the work of quantum logicians and of many-valued logicians. Mittelstaedt's general views about quantum logic and quantum theory seem to be inspired by a conjecture that is today more and more confirmed: there is something universal in the quantum theoretic formalism that goes beyond the limits of microphysics, giving rise to interesting applications to a number of different fields.
Modal Interpretation of Quantum Mechanics and Classical Physical Theories
NASA Astrophysics Data System (ADS)
Ingarden, R. S.
In 1990, Bas C. van Fraassen defined the modal interpretation of quantum mechanics as the consideration of it as ``a pure theory of the possible, with testable, empirical implications for what actually happens". This is a narrow, traditional understanding of modality, only in the sense of the concept of possibility (usually denoted in logic by C. I. Lewis's symbol ◊) and the concept of necessity □ defined by means of ◊. In modern logic, however, modality is understood in a much wider sense as any intensional functor (i.e. non-extensional, or determined not only by the truth value of a sentence). In recent (independent of van Fraassen) publications of the author (1997), an attempt was made to apply this wider understanding of modality to the interpretation of classical and quantum physics. In the present lecture, these problems are discussed against the background of a brief review of the logical approach to quantum mechanics over the recent seven decades. In this discussion, the new concepts of sub-modality and super-modality of many orders are used.
Hosseini Shokouh, Seyed Hossein; Raza, Syed Raza Ali; Lee, Hee Sung; Im, Seongil
2014-08-21
On a single ZnO nanowire (NW), we fabricated an inverter-type device comprising a Schottky diode (SD) and field-effect transistor (FET), aiming at 1-dimensional (1D) electronic circuits with low power consumption. The SD and adjacent FET worked respectively as the load and driver, so that voltage signals could be easily extracted as the output. In addition, NW FET with a transparent conducting oxide as top gate turned out to be very photosensitive, although ZnO NW SD was blind to visible light. Based on this, we could achieve an array of photo-inverter cells on one NW. Our non-classical inverter is regarded as quite practical for both logic and photo-sensing due to its performance as well as simple device configuration.
Spiteri, Jasmine M A; Mallia, Carl J; Scerri, Glenn J; Magri, David C
2017-12-06
A novel fluorescent molecular logic gate with a 'fluorophore-spacer1-receptor1-spacer2-receptor2' format is demonstrated in 1:1 (v/v) methanol/water. The molecule consists of an anthracene fluorophore, and tertiary alkyl amine and N-(2-methoxyphenyl)aza-15-crown-5 ether receptors. In the presence of threshold concentrations of H+ and Na+, the molecule switches 'on' as an AND logic gate with a fluorescence quantum yield of 0.21, with proton and sodium binding constants of log βH+ = 9.0 and log βNa+ = 3.2, respectively. At higher proton levels, protonation also occurs at the anilinic nitrogen atom with a log βH+ = 4.2, which allows for Na+, H+-enabled OR (OR + AND circuit) and H+-driven ternary logic functions. The reported molecule is compared and contrasted with classic anthracene-based Na+ and H+ logic gates. We propose that such logic-based molecules could be useful tools for probing the vicinity of Na+, H+ antiporters in biological systems.
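The thresholded AND behaviour can be sketched from the reported binding constants, assuming simple 1:1 binding isotherms and an illustrative 50% occupancy threshold (neither the isotherm form nor the threshold value is taken from the paper):

```python
# Binding constants reported in the abstract: log beta_H+ = 9.0, log beta_Na+ = 3.2.
LOG_BETA_H = 9.0
LOG_BETA_NA = 3.2

def fraction_bound(conc, log_beta):
    """Fraction of receptor occupied for an assumed 1:1 binding equilibrium."""
    beta = 10 ** log_beta
    return beta * conc / (1 + beta * conc)

def and_gate_emission(h_conc, na_conc, phi=0.21, threshold=0.5):
    """Fluorescence switches 'on' (quantum yield 0.21) only when BOTH
    receptors are mostly occupied -- AND logic. The 0.5 occupancy
    threshold is an illustrative assumption."""
    on = (fraction_bound(h_conc, LOG_BETA_H) > threshold and
          fraction_bound(na_conc, LOG_BETA_NA) > threshold)
    return phi if on else 0.0

# Both inputs high: gate on; either input alone: gate off.
print(and_gate_emission(1e-7, 1e-2))   # 0.21
print(and_gate_emission(1e-12, 1e-2))  # 0.0
```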
Quantum theory as the most robust description of reproducible experiments
NASA Astrophysics Data System (ADS)
De Raedt, Hans; Katsnelson, Mikhail I.; Michielsen, Kristel
2014-08-01
It is shown that the basic equations of quantum theory can be obtained from a straightforward application of logical inference to experiments for which there is uncertainty about individual events and for which the frequencies of the observed events are robust with respect to small changes in the conditions under which the experiments are carried out. "There is no quantum world. There is only an abstract physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature" [45]. "Physics is to be regarded not so much as the study of something a priori given, but rather as the development of methods of ordering and surveying human experience. In this respect our task must be to account for such experience in a manner independent of individual subjective judgment and therefore objective in the sense that it can be unambiguously communicated in ordinary human language" [46]. "The physical content of quantum mechanics is exhausted by its power to formulate statistical laws governing observations under conditions specified in plain language" [46]. The first two sentences of the first quote may be read as a suggestion to dispose of, in Mermin's words [47], the "bad habit" to take mathematical abstractions as the reality of the events (in the everyday sense of the word) that we experience through our senses. Although widely circulated, these sentences are reported by Petersen [45] and there is doubt that Bohr actually used this wording [48]. The last two sentences of the first quote and the second quote suggest that we should try to describe human experiences (confined to the realm of scientific inquiry) in a manner and language which is unambiguous and independent of the individual subjective judgment. 
Of course, the latter should not be construed to imply that the observed phenomena are independent of the choices made by the individual(s) in performing the scientific experiment [49]. The third quote suggests that quantum theory is a powerful language to describe a certain class of statistical experiments but remains vague about the properties of the class. Similar views were expressed by other fathers of quantum mechanics, e.g., Max Born and Wolfgang Pauli [50]. They can be summarized as "Quantum theory describes our knowledge of the atomic phenomena rather than the atomic phenomena themselves". Our aim is, in a sense, to replace the philosophical components of these statements by well-defined mathematical concepts and to carefully study their relevance for physical phenomena. Specifically, by applying the general formalism of logical inference to a well-defined class of statistical experiments, the present paper shows that quantum theory is indeed the kind of language envisaged by Bohr. Theories such as Newtonian mechanics, Maxwell's electrodynamics, and Einstein's (general) relativity are deductive in character. Starting from a few axioms, abstracted from experimental observations and additional assumptions about the irrelevance of a large number of factors for the description of the phenomena of interest, deductive reasoning is used to prove or disprove unambiguous statements, propositions, about the mathematical objects which appear in the theory. The method of deductive reasoning conforms to the Boolean algebra of propositions. The deductive, reductionist methodology has the appealing feature that one can be sure that the propositions are either right or wrong, and, disregarding the possibility that some of the premises on which the deduction is built may not apply, there is no doubt that the conclusions are correct. 
Clearly, these theories successfully describe a wide range of physical phenomena in a manner and language which is unambiguous and independent of the individual. At the same time, the construction of a physical theory, and a scientific theory in general, from "first principles" is, for sure, not something self-evident, and not even safe. Our basic knowledge always starts from the middle, that is, from the world of macroscopic objects. According to Bohr, the quantum theoretical description crucially depends on the existence of macroscopic objects which can be used as measuring devices. For an extensive analysis of the quantum measurement process from a dynamical point of view see Ref. [51]. Most importantly, the description of the macroscopic level is robust, that is, essentially independent of the underlying "more fundamental" picture [2]. As will be seen later, formalizing the notion of "robustness" is key to deriving the basic equations of quantum theory from the general framework of logical inference. Key assumptions of the deductive approach are that the mathematical description is a complete description of the experiment under consideration and that there is no uncertainty about the conditions under which the experiment is carried out. If the theory does not fully account for all the relevant aspects of the phenomenon that we wish to describe, the general rules by which we deduce whether a proposition is true or false can no longer be used. However, in these circumstances, we can still resort to logical inference [37-41] to find useful answers to unambiguous questions. Of course, in general it will no longer be possible to say whether a proposition is true or false, hence there will always remain a residue of doubt. 
However, as will be shown, the description obtained through logical inference may also be unambiguous and independent of the individual. In the present paper, we demonstrate that the basic equations of quantum theory directly follow from logical inference applied to experiments in which there is uncertainty about individual events, under the stringent condition that certain properties of the collection of events are reproducible, meaning that they are robust with respect to small changes in the conditions under which the experiments are carried out.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donker, H.C., E-mail: h.donker@science.ru.nl; Katsnelson, M.I.; De Raedt, H.
2016-09-15
The logical inference approach to quantum theory, proposed earlier in De Raedt et al. (2014), is considered in a relativistic setting. It is shown that the Klein–Gordon equation for a massive, charged, and spinless particle derives from the combination of the requirements that the space–time data collected by probing the particle is obtained from the most robust experiment and that, on average, the classical relativistic equation of motion of a particle holds. - Highlights: • Logical inference applied to relativistic, massive, charged, and spinless particle experiments leads to the Klein–Gordon equation. • The relativistic Hamilton–Jacobi equation is scrutinized by employing a field description for the four-velocity. • Logical inference allows analysis of experiments with uncertainty in detection events and experimental conditions.
Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity
Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny
2015-01-01
Recent experimental breakthroughs have finally allowed the implementation of in-vitro reaction kinetics (so-called enzyme-based logic) which code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can work for this scope: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single and double ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity). PMID:25976626
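The idea of a noisy, statistical-mechanical AND gate can be sketched with a cooperative (Hill-type) activation probability; this is an illustration of the stochastic-logic reading, not the paper's Monod-Wyman-Changeux treatment, and the parameter values are arbitrary:

```python
def hill(x, k=1.0, n=2.0):
    """Sigmoidal (cooperative) activation probability in Hill form."""
    return x ** n / (k ** n + x ** n)

def stochastic_and(c1, c2):
    """Probability that a two-ligand gate is active: a noisy AND.
    The product form assumes independent binding of the two ligands."""
    return hill(c1) * hill(c2)

# Saturating both inputs drives the gate towards Boolean AND behaviour,
# while a low input keeps the activation probability near zero:
print(stochastic_and(10.0, 10.0))  # close to 1
print(stochastic_and(10.0, 0.1))   # close to 0
```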
Causation and Validation of Nursing Diagnoses: A Middle Range Theory.
de Oliveira Lopes, Marcos Venícios; da Silva, Viviane Martins; Herdman, T Heather
2017-01-01
To describe a predictive middle range theory (MRT) that provides a process for validation and incorporation of nursing diagnoses in clinical practice. Literature review. The MRT includes definitions, a pictorial scheme, propositions, causal relationships, and translation to nursing practice. The MRT can be a useful alternative for education, research, and translation of this knowledge into practice. This MRT can assist clinicians in understanding clinical reasoning, based on temporal logic and spectral interaction among elements of nursing classifications. In turn, this understanding will improve the use and accuracy of nursing diagnosis, which is a critical component of the nursing process that forms a basis for nursing practice standards worldwide. © 2015 NANDA International, Inc.
NASA Astrophysics Data System (ADS)
Sakál, Peter; Hrdinová, Gabriela
2016-06-01
This article presents a conceptual design methodology for developing a strategy of sustainable corporate social responsibility (SCSR) in the context of the HCS model 3E, formulated by the co-author within the stated grants and dissertation. On the basis of propositional logic, a procedure is proposed for incorporating SCSR into the corporate strategy of sustainable development and the integrated management system (IMS) of an industrial enterprise. The aim of this article is to propose a concept for the development and implementation of an SCSR strategy in the context of the HCS model 3E.
[Decision of mathematical logical tasks in sensory enriched environment (classical music)].
Pavlygina, R A; Karamysheva, N N; Tutushkina, M V; Sakharov, D S; Davydov, V I
2012-01-01
The time needed to solve mathematical logical tasks (MLT) decreased under classical musical accompaniment (at 35 and 65 dB). Music at 85 dB did not influence the solving of MLT. Solving without musical accompaniment led to increased coherence-function values in the beta1, beta2, and gamma frequency ranges of the EEG over occipital areas, predominantly in the left hemisphere, while coherence of potentials decreased in the EEG of the frontal cortex. Music that shortened decision time enhanced the left-sided EEG asymmetry. The intrahemispheric and interhemispheric coherences of the frontal cortex increased when MLT were solved to musical accompaniment. Musical accompaniment at 85 dB produced a right-sided EEG asymmetry and formed a focus of coherent connections in the EEG of the temporal area of the right hemisphere.
ERIC Educational Resources Information Center
Shamsuddin, Salahuddin Mohd.; Ahmad, Siti Sara Binti Hj.
2017-01-01
Classical Arabic originated in the family of Semitic languages as a result of mixing among the languages of the people who lived in the Arabian Peninsula. Nobody knows the exact time of its emergence. Some knowledge comes from stone monuments and oral histories, which indicate that some distinct languages existed in the south and north of the…
Ultralow-voltage design of graphene PN junction quantum reflective switch transistor
NASA Astrophysics Data System (ADS)
Sohier, Thibault; Yu, Bin
2011-05-01
We propose the concept of a graphene-based quantum reflective switch (QRS) for low-power logic applications. With the unique electronic properties of graphene, a tilted PN junction is used to implement a logic switch function with a 10³ ON/OFF ratio. Carriers are reflected by an electrostatically induced potential step with strong incidence-angle dependence due to the widening of the classically forbidden energies. An optimized design of the device for ultralow-voltage operation has been conducted. The device is constantly ON, with a turning-off gate voltage around 180 mV using thin HfO2 as the gate dielectric. The results suggest a class of logic switch devices operating with micropower dissipation.
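The angle-selective reflection that the tilted junction exploits can be sketched with the standard smooth-junction transmission formula for graphene PN junctions, T(θ) = exp(−π·k_F·d·sin²θ); the k_F·d value below is an arbitrary illustration, not a parameter from the paper:

```python
import math

def transmission(theta, kF_d=10.0):
    """Angle-dependent transmission through a smooth graphene PN junction
    (T = exp(-pi * kF*d * sin^2(theta))); kF_d is the product of the Fermi
    wavevector and the junction width, chosen here for illustration."""
    return math.exp(-math.pi * kF_d * math.sin(theta) ** 2)

# Normal incidence passes (Klein tunneling); oblique carriers are strongly
# reflected, which is what tilting the junction exploits to switch off.
print(transmission(0.0))               # 1.0
print(transmission(math.radians(45)))  # strongly suppressed
```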
PLQP & Company: Decidable Logics for Quantum Algorithms
NASA Astrophysics Data System (ADS)
Baltag, Alexandru; Bergfeld, Jort; Kishida, Kohei; Sack, Joshua; Smets, Sonja; Zhong, Shengyang
2014-10-01
We introduce a probabilistic modal (dynamic-epistemic) quantum logic PLQP for reasoning about quantum algorithms. We illustrate its expressivity by using it to encode the correctness of the well-known quantum search algorithm, as well as of a quantum protocol known to solve one of the paradigmatic tasks from classical distributed computing (the leader election problem). We also provide a general method (extending an idea employed in the decidability proof in Dunn et al. (J. Symb. Log. 70:353-359, 2005)) for proving the decidability of a range of quantum logics, interpreted on finite-dimensional Hilbert spaces. We give general conditions for the applicability of this method, and in particular we apply it to prove the decidability of PLQP.
Unimolecular Logic Gate with Classical Input by Single Gold Atoms.
Skidin, Dmitry; Faizy, Omid; Krüger, Justus; Eisenhut, Frank; Jancarik, Andrej; Nguyen, Khanh-Hung; Cuniberti, Gianaurelio; Gourdon, Andre; Moresco, Francesca; Joachim, Christian
2018-02-27
By a combination of solution and on-surface chemistry, we synthesized an asymmetric starphene molecule with two long anthracenyl input branches and a short naphthyl output branch on the Au(111) surface. Starting from this molecule, we could demonstrate the working principle of a single molecule NAND logic gate by selectively contacting single gold atoms by atomic manipulation to the longer branches of the molecule. The logical input "1" ("0") is defined by the interaction (noninteraction) of a gold atom with one of the input branches. The output is measured by scanning tunneling spectroscopy following the shift in energy of the electronic tunneling resonances at the end of the short branch of the molecule.
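The gate's logic can be summarized as a truth table, with input "1" meaning a gold atom is contacted to an input branch; this sketches only the logical behaviour, not the spectroscopic readout at the naphthyl branch:

```python
# Single-molecule NAND gate: input "1" means a gold atom is manipulated
# into contact with that anthracenyl input branch.
def nand(a, b):
    """Output is 0 only when both input branches carry a gold atom."""
    return 0 if (a == 1 and b == 1) else 1

for a in (0, 1):
    for b in (0, 1):
        print(f"Au on branch 1: {a}, Au on branch 2: {b} -> output {nand(a, b)}")
```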
Optimization of topological quantum algorithms using Lattice Surgery is hard
NASA Astrophysics Data System (ADS)
Herr, Daniel; Nori, Franco; Devitt, Simon
The traditional method for computation in the surface code or the Raussendorf model is the creation of holes or ''defects'' within the encoded lattice of qubits, which are manipulated via topological braiding to enact logic gates. However, this is not the only way to achieve universal, fault-tolerant computation. In this work we turn attention to the Lattice Surgery representation, which realizes encoded logic operations without destroying the intrinsic 2D nearest-neighbor interactions sufficient for braid-based logic, and achieves universality without using defects for encoding information. In both braided and lattice surgery logic there are open questions regarding the compilation and resource optimization of quantum circuits. Optimization in braid-based logic is proving difficult to define, and the classical complexity associated with this problem has yet to be determined. In the context of lattice surgery based logic, we can introduce an optimality condition, which corresponds to a circuit with the lowest physical-qubit requirements, and prove that the complexity of optimizing the geometric (lattice surgery) representation of a quantum circuit is NP-hard.
The nature of advanced reasoning and science instruction
NASA Astrophysics Data System (ADS)
Lawson, Anton E.
Although the development of reasoning is recognized as an important goal of science instruction, its nature remains somewhat of a mystery. This article discusses two key questions: Does formal thought constitute a structured whole? And what role does propositional logic play in advanced reasoning? Aspects of a model of advanced reasoning are presented in which hypothesis generation and testing are viewed as central processes in intellectual development. It is argued that a number of important advanced reasoning schemata are linked by these processes and should be made a part of science instruction designed to improve students' reasoning abilities. Concerning students' development and use of formal reasoning, Linn (1982) calls for research into practical issues such as the roles of task-specific knowledge and individual differences in performance, roles not emphasized by Piaget in his theory and research. From a science teacher's point of view, this is good advice. Accordingly, this article will expand upon some of the issues raised by Linn in a discussion of the nature of advanced reasoning which attempts to reconcile the apparent contradiction between students' differential use of advanced reasoning schemata in varying contexts and the notion of a general stage of formal thought. Two key questions will be discussed: Does formal thought constitute a structured whole? And what role does propositional logic play in advanced reasoning? The underlying assumption of the present discussion is that, among other things, science instruction should concern itself with the improvement of students' reasoning abilities (cf. Arons, 1976; Arons & Karplus, 1976; Bady, 1979; Bauman, 1976; Educational Policies Commission, 1966; Herron, 1978; Karplus, 1979; Kohlberg & Mayer, 1972; Moshman & Thompson, 1981; Lawson, 1979; Levine & Linn, 1977; Pallrand, 1977; Renner & Lawson, 1973; Sayre & Ball, 1975; Schneider & Renner, 1980; Wollman, 1978). 
The questions are of interest because to date they lack clear answers, yet clear answers are necessary if we hope to design effective instruction in reasoning.
21 CFR 866.5320 - Properdin factor B immuno-logical test system.
Code of Federal Regulations, 2012 CFR
2012-04-01
... involvement of the alternative to the classical pathway of activation of complement (a group of plasma... the skin). Other diseases in which the alternate pathway of complement activation has been implicated...
Managing expectations: cognitive authority and experienced control in complex healthcare processes.
Hunt, Katherine J; May, Carl R
2017-07-05
Balancing the normative expectations of others (accountabilities) against the personal and distributed resources available to meet them (capacity) is a ubiquitous feature of social relations in many settings. This is an important problem in the management of long-term conditions, because of widespread problems of non-adherence to treatment regimens. Using long-term conditions as an example, we set out a middle-range theory of this balancing work. A middle-range theory was constructed in four stages. First, a qualitative elicitation study of men with heart failure was used to develop general propositions about patient and caregiver experience, and about the ways that the organisation and delivery of care affected this. Second, these propositions were developed and confirmed through a systematic review of the qualitative research literature. Third, theoretical propositions and constructs were built, refined and presented as a logic model associated with two main theoretical propositions. Finally, a construct validation exercise was undertaken, in which construct definitions informed reanalysis of a set of systematic reviews of studies of patient and caregiver experiences of heart failure that had been included in an earlier meta-review. Cognitive Authority Theory identifies, characterises and explains negotiation processes in which people manage their relations with the expectations of normative systems - like those encountered in the management of long-term conditions. Here, their cognitive authority is the product of an assessment of competence, trustworthiness and credibility made about a person by other participants in a healthcare process; and their experienced control is a function of the degree to which they successfully manage the external process-specific limiting factors that make it difficult to otherwise perform in their role. 
Cognitive Authority Theory assists in explaining how participants in complex social processes manage important relational aspects of inequalities in power and expertise. It can play an important part in understanding the dynamics of participation in healthcare processes. It suggests ways in which these burdens may lead to relationally induced non-adherence to treatment regimens and self-care programmes, and points to targets where intervention may reduce these adverse outcomes.
Retro-causation, Minimum Contradictions and Non-locality
NASA Astrophysics Data System (ADS)
Kafatos, Menas; Nassikas, Athanassios A.
2011-11-01
Retro-causation has been experimentally verified by Bem and proposed by Kafatos in the form of space-time non-locality in the quantum framework. Every theory includes, beyond its specific axioms, the principles of logical communication (logical language), through which it is defined. This communication obeys Aristotelian logic (Classical Logic), the Leibniz Sufficient Reason Principle, and a hidden axiom, which basically states that there is an anterior-posterior relationship everywhere in communication. By means of a theorem discussed here, it can be proved that the communication mentioned implies contradictory statements, which can only be transcended through silence, i.e. the absence of any statements. Moreover, the breaking of silence is meaningful through the claim for minimum contradictions, which implies the existence of both a logical and an illogical dimension; contradictions refer to causality, implying its opposite, namely retro-causation, and to the anterior-posterior axiom, implying space-time non-locality. The purpose of this paper is to outline a framework accounting for retro-causation, from both purely theoretical and reality-based points of view.
Pattern formation, logistics, and maximum path probability
NASA Astrophysics Data System (ADS)
Kirkaldy, J. S.
1985-05-01
The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. 
Maximum path probability or second-order differentiability of the entropy in isolation are sufficiently strong interpretations of the second law of thermodynamics to define the approach to and the nature of patterned stable steady states. For many pattern-forming systems these principles define quantifiable stable states as maxima or minima (or both) in the dissipation. An elementary statistical-mechanical proof is offered. To turn the argument full circle, the transformations of the partitions and classes which are predicated upon such minimax entropic paths can through digital modeling be directly identified with the syntactic and inferential elements of deductive logic. It follows therefore that all self-organizing or pattern-forming systems which possess stable steady states approach these states according to the imperatives of formal logic, the optimum pattern with its rich endowment of equivalence relations representing the central theorem of the associated calculus. Logic is thus "the stuff of the universe," and biological evolution with its culmination in the human brain is the most significant example of all the irreversible pattern-forming processes. We thus conclude with a few remarks on the relevance of the contribution to the theory of evolution and to research on artificial intelligence.
Kasturirangan, Rajesh
2008-01-01
Philosophers as well as lay people often think of beliefs as psychological states with dubious epistemic properties. Beliefs are conceptualized as unregulated conceptual structures, for the most part hypothetical and often fanciful or deluded. Thinking and reasoning, on the other hand, are seen as rational activities regulated by rules and governed by norms. Computational modeling of the mind has focused on rule-governed behavior, ultimately trying to reduce it to rules of logic. What if thinking is less like reasoning and more like believing? I argue that the classical model of thought as rational is mistaken and that thinking is fundamentally constituted by believing. This new approach forces us to re-evaluate classical epistemic concepts like "truth", "justification", etc. Furthermore, if thinking is believing, then it is not clear how thoughts can be modeled computationally. We need new mathematical ideas to model thought, ideas that are quite different from traditional logic-based mathematical structures.
'Swapna' in the Indian classics: Mythology or science?
Tendulkar, Sonali S; Dwivedi, R R
2010-04-01
There are many concepts in Ayurveda, as well as in the ancient sciences, that remain untouched or unexplored. One such concept is that of the Swapna (dreams). Being an abstract phenomenon, it is difficult to explain and understand; probably for this reason, the descriptions of Swapna in the Indian classics are supported by mythology, to make them acceptable. Variations in these explanations are seen according to the objective of the school of thought; that is, whereas in the ancient texts dreams are used to delve into knowledge of the Atman and are related to spirituality, their description in the Ayurvedic texts revolves around the Sharira and Manas. Although all these explanations seem shrouded in uncertainty and mythology, there definitely seems to be a logical and rational science behind these quotations. They only need research, investigation, and explanation on the basis of logic, and a laboratory.
Knauff, Markus; Budeck, Claudia; Wolf, Ann G; Hamburger, Kai
2010-10-18
Explanations for the current worldwide financial crisis are primarily provided by economists and politicians. In the present work, however, we focus on the psychological-cognitive factors that most likely affect the thinking of people on the economic stage and thus might also have had an effect on the progression of the crisis. One of these factors might be the effect of prior beliefs on reasoning and decision-making. So far, this question has been explored only to a limited extent. We report two experiments on the logical reasoning competences of nineteen stock-brokers with long-standing vocational experience at the stock market. The premises of the reasoning problems concerned stock trading, and the experiments varied whether or not their conclusions (a conclusion being the proposition reached after considering the premises) agreed with the brokers' prior beliefs. Half of the problems had a conclusion that was highly plausible for stock-brokers, while the other half had a highly implausible conclusion. The data show a strong belief bias: stock-brokers were strongly biased by their prior knowledge. The lowest performance was found for inferences in which the problems caused a conflict between logical validity and the experts' beliefs. In these cases, the stock-brokers tended to make logically invalid inferences rather than give up their existing beliefs. Our findings support the thesis that cognitive factors have an effect on decision-making in the financial market. In the present study, stock-brokers were guided more by past experience and existing beliefs than by logical thinking and rational decision-making; they had difficulty disengaging from deeply anchored thinking patterns. However, we believe it is wrong to blame the brokers for these "malfunctions", because such hard-wired cognitive principles are difficult to suppress even when the person is aware of them.
NASA Technical Reports Server (NTRS)
Trotter, J. D.
1982-01-01
The Mosaic Transistor Array is an extension of the STAR system developed by NASA, with dedicated field cells designed specifically for semicustom microprocessor applications. The Sandia radiation-hard bulk CMOS process is used in order to satisfy the requirements of space flight. A design philosophy is developed which exploits the strengths and recognizes the weaknesses of the Sandia process. A style of circuitry is developed which incorporates the low power and high drive capability of CMOS. In addition, the density achieved is better than that of classic CMOS, although not as good as that of NMOS. The basic logic functions for a data path are designed with a compatible interface to the STAR grid system. In this manner, either random logic or PLA-type structures can be used for the control logic.
Assessing Pavlov's impact on the American conditioning enterprise.
Coleman, S R
1988-01-01
In this article, the visibility of Pavlov and of Watson in American psychology is compared, and the periods of their respective influence are specified with greater precision than is afforded by merely impressionistic methods. The author also critically examines the possibility that the early history of the American classical-conditioning enterprise involved a succession of two phases: a Watsonian/speculative phase and a Pavlovian/empirical phase. In conclusion, the author assesses the possibility that the publication of Pavlov's Conditioned Reflexes (1927) "stimulated" scholarly work on Pavlovian conditioning, and finds this proposition lacking empirical support.
Approximate Reasoning: Past, Present, Future
1990-06-27
This note presents a personal view of the state of the art in the representation and manipulation of imprecise and uncertain information by automated ... processing systems. To contrast their objectives and characteristics with the sound deductive procedures of classical logic, methodologies developed
Zeng, Qiang; Li, Tao; Song, Xinbing; Zhang, Xiangdong
2016-04-18
We propose and experimentally demonstrate an optimized setup to implement the quantum controlled-NOT operation using polarization and orbital angular momentum qubits. This device is more adaptive to inputs with various polarizations and can work in both the classical and the quantum single-photon regimes. The logic operations performed by such a setup not only possess high stability and a polarization-free character but can also be easily extended to deal with multi-qubit input states. As an example, an experimental implementation of the generalized three-qubit Toffoli gate is presented.
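The logical action of the gates named in this abstract can be stated as a truth table. The following sketch (ours, not the authors' optical implementation) shows the basis-state behavior of the controlled-NOT and Toffoli gates; it models computational-basis inputs classically and ignores superposition.

```python
# Truth-table sketch of the CNOT and Toffoli gates. This models only
# computational-basis states; the optical setup in the abstract also
# handles superpositions, which this illustration does not capture.

def cnot(control: int, target: int) -> tuple[int, int]:
    """CNOT flips the target qubit iff the control qubit is 1."""
    return control, target ^ control

def toffoli(c1: int, c2: int, target: int) -> tuple[int, int, int]:
    """The three-qubit Toffoli gate flips the target iff both controls are 1."""
    return c1, c2, target ^ (c1 & c2)
```

Both gates are self-inverse on basis states, so applying either twice restores the input.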
NASA Astrophysics Data System (ADS)
Resconi, Germano; Klir, George J.; Pessa, Eliano
Recognizing that syntactic and semantic structures of classical logic are not sufficient to understand the meaning of quantum phenomena, we propose in this paper a new interpretation of quantum mechanics based on evidence theory. The connection between these two theories is obtained through a new language, quantum set theory, built on a suggestion by J. Bell. Further, we give a modal logic interpretation of quantum mechanics and quantum set theory by using Kripke's semantics of modal logic based on the concept of possible worlds. This is grounded on previous work of a number of researchers (Resconi, Klir, Harmanec) who showed how to represent evidence theory and other uncertainty theories in terms of modal logic. Moreover, we also propose a reformulation of the many-worlds interpretation of quantum mechanics in terms of Kripke's semantics. We thus show how three different theories — quantum mechanics, evidence theory, and modal logic — are interrelated. This opens, on one hand, the way to new applications of quantum mechanics within domains different from the traditional ones, and, on the other hand, the possibility of building new generalizations of quantum mechanics itself.
Sarkar, Sahotra
2015-10-01
This paper attempts a critical reappraisal of Nagel's (1961, 1970) model of reduction taking into account both traditional criticisms and recent defenses. This model treats reduction as a type of explanation in which a reduced theory is explained by a reducing theory after their relevant representational items have been suitably connected. In accordance with the deductive-nomological model, the explanation is supposed to consist of a logical deduction. Nagel was a pluralist about both the logical form of the connections between the reduced and reducing theories (which could be conditionals or biconditionals) and their epistemological status (as analytic connections, conventions, or synthetic claims). This paper defends Nagel's pluralism on both counts and, in the process, argues that the multiple realizability objection to reductionism is misplaced. It also argues that the Nagel model correctly characterizes reduction as a type of explanation. However, it notes that logical deduction must be replaced by a broader class of inferential techniques that allow for different types of approximation. Whereas Nagel (1970), in contrast to his earlier position (1961), recognized the relevance of approximation, he did not realize its full import for the model. Throughout the paper two case studies are used to illustrate the arguments: the putative reduction of classical thermodynamics to the kinetic theory of matter and that of classical genetics to molecular biology. Copyright © 2015. Published by Elsevier Ltd.
Optimization of lattice surgery is NP-hard
NASA Astrophysics Data System (ADS)
Herr, Daniel; Nori, Franco; Devitt, Simon J.
2017-09-01
The traditional method for computation in either the surface code or in the Raussendorf model is the creation of holes or "defects" within the encoded lattice of qubits that are manipulated via topological braiding to enact logic gates. However, this is not the only way to achieve universal, fault-tolerant computation. In this work, we focus on the lattice surgery representation, which realizes transversal logic operations without destroying the intrinsic 2D nearest-neighbor properties of the braid-based surface code and achieves universality without defects and braid-based logic. For both techniques there are open questions regarding the compilation and resource optimization of quantum circuits. Optimization in braid-based logic is proving to be difficult and the classical complexity associated with this problem has yet to be determined. In the context of lattice-surgery-based logic, we can introduce an optimality condition, which corresponds to a circuit with the lowest resource requirements in terms of physical qubits and computational time, and prove that the complexity of optimizing a quantum circuit in the lattice surgery model is NP-hard.
Detection of epistatic effects with logic regression and a classical linear regression model.
Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata
2014-02-01
To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes that cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of the genotypes of several QTLs, the Cockerham approach is often unable to identify them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though the logic regression approach requires a larger number of models to be considered (and hence more stringent multiple-testing correction), the efficient representation of higher-order logic interactions in logic regression models leads to a significant increase in power to detect such interactions compared to the Cockerham approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and a real data analysis.
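A hypothetical toy example makes the abstract's point concrete: when a phenotype is governed by an XOR of two loci, each locus alone shows no additive (marginal) effect, so an additive screen misses it, while a single logic term recovers it exactly. All data and names below are invented for illustration.

```python
# Invented two-locus example: disease occurs iff exactly one risk allele
# is carried (an XOR), so additive marginal effects vanish while a
# logic-regression-style Boolean predictor fits perfectly.

genotypes = [(0, 0), (0, 1), (1, 0), (1, 1)]  # balanced design
phenotype = {g: g[0] ^ g[1] for g in genotypes}

def marginal_effect(locus: int) -> float:
    """Difference in mean phenotype between carriers and non-carriers."""
    carriers = [phenotype[g] for g in genotypes if g[locus] == 1]
    others = [phenotype[g] for g in genotypes if g[locus] == 0]
    return sum(carriers) / len(carriers) - sum(others) / len(others)

def logic_term(g: tuple[int, int]) -> int:
    """Logic tree L = (A AND NOT B) OR (NOT A AND B)."""
    return int((g[0] and not g[1]) or (not g[0] and g[1]))
```

Here both marginal effects are exactly zero, yet the logic term reproduces the phenotype on every genotype, which is the kind of interaction the abstract says an additive model cannot represent.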
Fuzzy logic and causal reasoning with an 'n' of 1 for diagnosis and treatment of the stroke patient.
Helgason, Cathy M; Jobe, Thomas H
2004-03-01
The current scientific model for clinical decision-making is founded on binary or Aristotelian logic, classical set theory, and probability-based statistics. Evidence-based medicine has been established as the basis for clinical recommendations. There is a problem with this scientific model when the physician must diagnose and treat the individual patient. The problem is a paradox: the scientific model of evidence-based medicine is based upon hypotheses aimed at the group, and therefore its conclusions can be extrapolated to the individual patient only to a degree. This extrapolation depends on the expertise of the physician. A scientific model based on multivalued fuzzy logic allows this expertise to be represented numerically and resolves the clinical paradox of evidence-based medicine.
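The contrast between binary and multivalued logic drawn above can be sketched in a few lines, with invented clinical numbers: a crisp threshold classifies a patient as in or out, while a fuzzy membership function assigns a degree, and a fuzzy AND (taken here as min, one common choice) combines two graded findings.

```python
# Illustrative only; the 120-160 mmHg ramp and the min-based AND are
# our invented choices, not the authors' diagnostic model.

def crisp_hypertensive(systolic: float) -> bool:
    """Aristotelian logic: in or out, nothing in between."""
    return systolic >= 140.0

def fuzzy_hypertensive(systolic: float) -> float:
    """Degree of membership in 'hypertensive', ramping from 120 to 160 mmHg."""
    return min(1.0, max(0.0, (systolic - 120.0) / 40.0))

def fuzzy_and(a: float, b: float) -> float:
    """A common fuzzy conjunction: the minimum of the two degrees."""
    return min(a, b)
```

A patient at 139.9 mmHg is crisply "not hypertensive" yet has membership degree very close to 0.5, which is the graded judgment the abstract argues a clinician actually makes.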
Some New Sets of Sequences of Fuzzy Numbers with Respect to the Partial Metric
Ozluk, Muharrem
2015-01-01
In this paper, we essentially deal with Köthe-Toeplitz duals of fuzzy level sets defined using a partial metric. Since the utilization of Zadeh's extension principle is quite difficult in practice, we prefer the idea of level sets in order to construct some classical notions. In this paper, we present the sets of bounded, convergent, and null series and the set of sequences of bounded variation of fuzzy level sets, based on the partial metric. We examine the relationships between these sets and their classical forms and give some properties including definitions, propositions, and various kinds of partial metric spaces of fuzzy level sets. Furthermore, we study some of their properties like completeness and duality. Finally, we obtain the Köthe-Toeplitz duals of fuzzy level sets with respect to the partial metric based on a partial ordering. PMID:25695102
Davis, S
1996-03-01
This paper points to a convergence of formal and rhetorical features in ancient Chinese cosmobiological theory, within which is developed a view of the inner life of human emotions. Inasmuch as there is an extensive classical tradition considering the emotions in conjunction with music, one can justify a structural analysis of medical texts treating disorder in emotional life, since emotions, musical interpretation and structural analysis all deal with systems interrelated in a transformational space largely independent of objective reference and propositional coordination. Following a section of ethnolinguistic sketches to provide grounds in some phenomenological worlds recognized by Chinese people, there is a textual analysis of a classical medical source for the treatment of emotional distress. Through close examination of the compositional schema of this text, it can be demonstrated that the standard categories of correlative cosmology are arrayed within a more comprehensive structural order.
Kompa, K L; Levine, R D
2001-01-16
We propose a scheme for molecule-based information processing by combining well-studied spectroscopic techniques and recent results from chemical dynamics. Specifically, we discuss how optical transitions in single molecules can be used to rapidly perform classical (Boolean) logic operations. In the proposed way, a restricted number of states in a single molecule can act as a logic gate equivalent to at least two switches. It is argued that the four-level scheme can also be used to produce gain, because it allows an inversion and not only a switching ability. The proposed scheme is quantum mechanical in that it takes advantage of the discrete nature of the energy levels, but here we discuss the temporal evolution using the populations only. On a longer time range we suggest that the same scheme could be extended to perform quantum logic, and a tentative suggestion, based on an available experiment, is discussed. We believe that the pumping can provide a partial proof of principle, although this and similar experiments have not thus far been interpreted in our terms.
Quantum Enhanced Inference in Markov Logic Networks
NASA Astrophysics Data System (ADS)
Wittek, Peter; Gogolin, Christian
2017-04-01
Markov logic networks (MLNs) reconcile two opposing schools in machine learning and artificial intelligence: causal networks, which account for uncertainty extremely well, and first-order logic, which allows for formal deduction. An MLN is essentially a first-order logic template to generate Markov networks. Inference in MLNs is probabilistic and it is often performed by approximate methods such as Markov chain Monte Carlo (MCMC) Gibbs sampling. An MLN has many regular, symmetric structures that can be exploited at both first-order level and in the generated Markov network. We analyze the graph structures that are produced by various lifting methods and investigate the extent to which quantum protocols can be used to speed up Gibbs sampling with state preparation and measurement schemes. We review different such approaches, discuss their advantages, theoretical limitations, and their appeal to implementations. We find that a straightforward application of a recent result yields exponential speedup compared to classical heuristics in approximate probabilistic inference, thereby demonstrating another example where advanced quantum resources can potentially prove useful in machine learning.
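The Gibbs sampling mentioned in this abstract can be illustrated on the smallest possible Markov network: two binary nodes with a pairwise potential exp(w) when they agree. The sketch below is only an illustration of that MCMC inference step, not an MLN engine; all parameters are invented.

```python
# Toy Gibbs sampler for a two-node Markov network with potential
# phi(x0, x1) = exp(w * [x0 == x1]). The exact agreement probability
# is exp(w) / (exp(w) + 1); the sampler should approximate it.
import math
import random

def gibbs_agreement_prob(w: float, sweeps: int = 5000, burn_in: int = 500,
                         seed: int = 0) -> float:
    """Estimate P(x0 == x1) by Gibbs sampling with a fixed seed."""
    rng = random.Random(seed)
    x = [0, 0]
    agree = 0
    for t in range(sweeps):
        for i in (0, 1):
            other = x[1 - i]
            # Conditional of node i given its neighbor, from the pairwise potential.
            p_equal = math.exp(w) / (math.exp(w) + 1.0)
            x[i] = other if rng.random() < p_equal else 1 - other
        if t >= burn_in:
            agree += int(x[0] == x[1])
    return agree / (sweeps - burn_in)
```

For w = 1 the exact value is e/(e + 1), roughly 0.73, and the estimate should land near it; the quantum protocols surveyed in the abstract aim to speed up exactly this kind of sampling.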
ERIC Educational Resources Information Center
Debbasch, F.
2011-01-01
The logical structure of classical thermodynamics is presented in a modern, geometrical manner. The first and second laws receive clear, operationally oriented statements, and the Gibbs free energy extremum principle is fully discussed. Applications relevant to chemistry, such as phase transitions, dilute solutions theory and, in particular, the law…
Quantum Logic Networks for Probabilistic and Controlled Teleportation of Unknown Quantum States
NASA Astrophysics Data System (ADS)
Gao, Ting
2004-08-01
We present simplification schemes for the probabilistic and controlled teleportation of unknown quantum states of both one particle and two particles, and construct efficient quantum logic networks for implementing the new schemes by means of primitive operations consisting of single-qubit gates, two-qubit controlled-NOT gates, Von Neumann measurement, and classically controlled operations. In these schemes the teleportation is not always successful but succeeds with a certain probability. The project was supported by the National Natural Science Foundation of China under Grant No. 10271081 and the Natural Science Foundation of Hebei Province of China under Grant No. A2004000141.
Experimental quantum computing to solve systems of linear equations.
Cai, X-D; Weedbrook, C; Su, Z-E; Chen, M-C; Gu, Mile; Zhu, M-J; Li, Li; Liu, Nai-Le; Lu, Chao-Yang; Pan, Jian-Wei
2013-06-07
Solving linear systems of equations is ubiquitous in all areas of science and engineering. With rapidly growing data sets, such a task can be intractable for classical computers, as the best known classical algorithms require a time proportional to the number of variables N. A recently proposed quantum algorithm shows that quantum computers could solve linear systems in a time scale of order log(N), giving an exponential speedup over classical computers. Here we realize the simplest instance of this algorithm, solving 2×2 linear equations for various input vectors on a quantum computer. We use four quantum bits and four controlled logic gates to implement every subroutine required, demonstrating the working principle of this algorithm.
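For scale, the classical baseline for the 2x2 instance realized in this experiment is immediate via Cramer's rule, as sketched below; the quantum algorithm's advantage appears only asymptotically, and for estimating functionals of the solution rather than reading out every entry.

```python
# Classical solution of a 2x2 linear system [[a, b], [c, d]] @ [x, y] = [e, f]
# by Cramer's rule. This is the trivial classical counterpart of the
# quantum experiment's smallest instance, shown for comparison only.

def solve_2x2(a: float, b: float, c: float, d: float,
              e: float, f: float) -> tuple[float, float]:
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular system")
    return (e * d - b * f) / det, (a * f - e * c) / det
```

For example, x + y = 3 and x - y = 1 give (x, y) = (2, 1).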
On Some Assumptions of the Null Hypothesis Statistical Testing
ERIC Educational Resources Information Center
Patriota, Alexandre Galvão
2017-01-01
Bayesian and classical statistical approaches are based on different types of logical principles. In order to avoid mistaken inferences and misguided interpretations, the practitioner must respect the inference rules embedded into each statistical method. Ignoring these principles leads to the paradoxical conclusions that the hypothesis…
A psychometric evaluation of the digital logic concept inventory
NASA Astrophysics Data System (ADS)
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2014-10-01
Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
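One Classical Test Theory reliability statistic of the kind used to evaluate instruments like the DLCI is Cronbach's alpha. The sketch below computes it from a small, invented matrix of dichotomous item scores (this is a generic illustration of the statistic, not the DLCI's actual data or analysis).

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).
# Population variances are used throughout for consistency. The score
# matrix in the test is invented for illustration.

def cronbach_alpha(scores: list[list[int]]) -> float:
    """scores[s][i] = score of student s on item i (items must number >= 2)."""
    n_items = len(scores[0])

    def pvar(xs: list[float]) -> float:
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([row[i] for row in scores]) for i in range(n_items)]
    total_var = pvar([sum(row) for row in scores])
    return n_items / (n_items - 1) * (1 - sum(item_vars) / total_var)
```

Alpha near 1 indicates internally consistent items; values adequate "for research purposes", as the abstract puts it, are conventionally lower than those demanded for high-stakes decisions about individuals.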
NASA Astrophysics Data System (ADS)
Ding, Shulin; Wang, Guo Ping
2015-09-01
Classical nonlinear and quantum all-optical transistors either depend on the value of the input signal intensity or need extra co-propagating beams. In this paper, we present a kind of all-optical transistor constructed from parity-time (PT)-symmetric Y-junctions, which performs independently of the value of the signal intensity in the unsaturated-gain case and can also work after saturated gain is introduced. Further, we show that a control signal can switch the device from amplification of peaks in time to transformation of peaks into amplified troughs. By using these PT-symmetric Y-junctions with currently available materials and technologies, we can implement interesting logic functions such as NOT and XOR (exclusive OR) gates, implying potential applications of such structures in designing optical logic gates, optical switches, and signal transformations or amplifications.
An Analysis of Categorical and Quantitative Methods for Planning Under Uncertainty
Langlotz, Curtis P.; Shortliffe, Edward H.
1988-01-01
Decision theory and logical reasoning are both methods for representing and solving medical decision problems. We analyze the usefulness of these two approaches to medical therapy planning by establishing a simple correspondence between decision theory and non-monotonic logic, a formalization of categorical logical reasoning. The analysis indicates that categorical approaches to planning can be viewed as comprising two decision-theoretic concepts: probabilities (degrees of belief in planning hypotheses) and utilities (degrees of desirability of planning outcomes). We present and discuss examples of the following lessons from this decision-theoretic view of categorical (nonmonotonic) reasoning: (1) Decision theory and artificial intelligence techniques are intended to solve different components of the planning problem. (2) When considered in the context of planning under uncertainty, nonmonotonic logics do not retain the domain-independent characteristics of classical logical reasoning for planning under certainty. (3) Because certain nonmonotonic programming paradigms (e.g., frame-based inheritance, rule-based planning, protocol-based reminders) are inherently problem-specific, they may be inappropriate to employ in the solution of certain types of planning problems. We discuss how these conclusions affect several current medical informatics research issues, including the construction of “very large” medical knowledge bases.
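The decision-theoretic decomposition described above (degrees of belief times degrees of desirability) reduces therapy planning to maximizing expected utility. The sketch below shows that computation; all plan names, probabilities, and utilities are invented for illustration.

```python
# Expected-utility choice among candidate plans. Each plan is a list of
# (probability, utility) pairs over its possible outcomes; probabilities
# within a plan should sum to 1. All numbers here are hypothetical.

def expected_utility(outcomes: list[tuple[float, float]]) -> float:
    """Probability-weighted utility of one candidate plan."""
    return sum(p * u for p, u in outcomes)

def best_plan(plans: dict[str, list[tuple[float, float]]]) -> str:
    """Pick the plan with the highest expected utility."""
    return max(plans, key=lambda name: expected_utility(plans[name]))
```

A categorical (nonmonotonic) rule such as "prefer surgery unless the patient is frail" can be read as a compiled-in stand-in for exactly this comparison, which is the correspondence the abstract analyzes.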
The knowledge instinct, cognitive algorithms, modeling of language and cultural evolution
NASA Astrophysics Data System (ADS)
Perlovsky, Leonid I.
2008-04-01
The talk discusses mechanisms of the mind and their engineering applications. Past attempts at designing "intelligent systems" encountered mathematical difficulties related to algorithmic complexity. The culprit turned out to be logic, which in one way or another was used not only in logic rule systems but also in statistical, neural, and fuzzy systems. Algorithmic complexity is related to Gödel's theorem, a most fundamental mathematical result. These difficulties were overcome by replacing logic with a dynamic process "from vague to crisp," dynamic logic. It leads to algorithms that overcome combinatorial complexity, resulting in orders-of-magnitude improvements in classical problems of detection, tracking, fusion, and prediction in noise. I present engineering applications to pattern recognition, detection, tracking, fusion, financial predictions, and Internet search engines. The mathematical and engineering efficiency of dynamic logic can also be understood as a cognitive algorithm, which describes a fundamental property of the mind, the knowledge instinct responsible for all our higher cognitive functions: concepts, perception, cognition, instincts, imaginations, intuitions, and emotions, including emotions of the beautiful. I present our latest results in modeling the evolution of languages and cultures, their interactions in these processes, and the role of music in cultural evolution. Experimental data are presented that support the theory. Future directions are outlined.
Multi-server blind quantum computation over collective-noise channels
NASA Astrophysics Data System (ADS)
Xiao, Min; Liu, Lin; Song, Xiuli
2018-03-01
Blind quantum computation (BQC) enables ordinary clients to securely outsource their computation task to costly quantum servers. Besides two essential properties, namely correctness and blindness, practical BQC protocols should also make clients as classical as possible and tolerate faults from nonideal quantum channels. In this paper, using logical Bell states as the quantum resource, we propose multi-server BQC protocols over a collective-dephasing noise channel and a collective-rotation noise channel, respectively. The proposed protocols permit a completely or almost classical client, meet the correctness and blindness requirements of a BQC protocol, and are typically practical BQC protocols.
Cellular Automata Generalized To An Inferential System
NASA Astrophysics Data System (ADS)
Blower, David J.
2007-11-01
Stephen Wolfram popularized elementary one-dimensional cellular automata in his book, A New Kind of Science. Among many remarkable things, he proved that one of these cellular automata was a Universal Turing Machine. Such cellular automata can be interpreted in a different way by viewing them within the context of the formal manipulation rules from probability theory. Bayes's Theorem is the most famous of such formal rules. As a prelude, we recapitulate Jaynes's presentation of how probability theory generalizes classical logic using modus ponens as the canonical example. We emphasize the important conceptual standing of Boolean Algebra for the formal rules of probability manipulation and give an alternative demonstration augmenting and complementing Jaynes's derivation. We show the complementary roles played in arguments of this kind by Bayes's Theorem and joint probability tables. A good explanation for all of this is afforded by the expansion of any particular logic function via the disjunctive normal form (DNF). The DNF expansion is a useful heuristic emphasized in this exposition because such expansions point out where relevant 0s should be placed in the joint probability tables for logic functions involving any number of variables. It then becomes a straightforward exercise to rely on Boolean Algebra, Bayes's Theorem, and joint probability tables in extrapolating to Wolfram's cellular automata. Cellular automata are seen as purely deductive systems, just like classical logic, which probability theory is then able to generalize. Thus, any uncertainties which we might like to introduce into the discussion about cellular automata are handled with ease via the familiar inferential path. Most importantly, the difficult problem of predicting what cellular automata will do in the far future is treated like any inferential prediction problem.
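The deductive core this abstract describes can be shown directly: an elementary cellular automaton update is a Boolean truth table on three-bit neighborhoods (equivalently, a DNF over eight minterms), applied in parallel. The sketch below implements the standard Wolfram rule-number encoding, using Rule 110 (the rule proved computation-universal) with boundaries held at 0.

```python
# One generation of an elementary cellular automaton. The rule number's
# binary expansion is the truth table: bit (4l + 2c + r) of `rule` gives
# the new value for neighborhood (l, c, r). Boundary cells are fixed at 0,
# a simplifying choice for this illustration.

def ca_step(cells: list[int], rule: int = 110) -> list[int]:
    padded = [0] + cells + [0]
    return [(rule >> (4 * padded[i - 1] + 2 * padded[i] + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]
```

Because the update is a pure Boolean function, uncertainty about cells can be layered on top exactly as the abstract suggests: place the appropriate 0s in a joint probability table over the truth-table rows and propagate with Bayes's Theorem.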
Towards a Theory of Semantic Communication (Extended Technical Report)
2011-03-01
counting models of a sentence, when interpretations have different probabilities, what matters is the total probability of models of the sentence, not...of classic logics still hold in the LP semantics, e.g., De Morgan's laws. However, modus ponens does not hold in the LP semantics...
The European University in the Context of Logic of Integration
ERIC Educational Resources Information Center
Kotlyarov, Igor V.; Kostjukevich, Svetlana V.
2011-01-01
Many contemporary historians debate and ponder whether modern universities represent a unique creation of the high Middle Ages in Europe, or a simple evolution of schools and academies of classical antiquity. Policymakers, though, could also benefit from addressing this question because in order to find appropriate solutions to reforming…
Q & A with Ed Tech Leaders: Interview with J. Michael Spector
ERIC Educational Resources Information Center
Shaughnessy, Michael F.; Fulgham, Susan M.
2015-01-01
J. Michael Spector's academic preparation was in philosophy--epistemology and logic, primarily. His dissertation was on skepticism in modern philosophy, and that led him to a deep-seated appreciation for classical skepticism. The word "skeptic" is derived from the Greek word "skepsis," which means investigation. While the…
Causal Superlearning Arising from Interactions Among Cues
Urushihara, Kouji; Miller, Ralph R.
2017-01-01
Superconditioning refers to supernormal responding to a conditioned stimulus (CS) that sometimes occurs in classical conditioning when the CS is paired with an unconditioned stimulus (US) in the presence of a conditioned inhibitor for that US. In the present research, we conducted four experiments to investigate causal superlearning, a phenomenon in human causal learning analogous to superconditioning. Experiment 1 demonstrated superlearning relative to appropriate control conditions. Experiment 2 showed that superlearning wanes when the number of cues used in an experiment is relatively large. Experiment 3 determined that even when relatively many cues are used, superlearning can be observed provided testing is conducted immediately after training, which is problematic for explanations by most contemporary learning theories. Experiment 4 found that ratings of a superlearning cue are weaker than those of the training excitor that serves as the basis for the conditioned inhibitor-like causal preventer used during causal superlearning training. This is inconsistent with the prediction of propositional reasoning accounts of causal cue competition, but is readily explained by associative learning models. In sum, the current experiments revealed some weaknesses of both the associative and propositional reasoning models with respect to causal superlearning. PMID:28383940
Rule Extracting based on MCG with its Application in Helicopter Power Train Fault Diagnosis
NASA Astrophysics Data System (ADS)
Wang, M.; Hu, N. Q.; Qin, G. J.
2011-07-01
In order to extract decision rules for fault diagnosis from incomplete historical test records for knowledge-based damage assessment of helicopter power train structures, a method that can directly extract optimal generalized decision rules from incomplete information based on granular computing (GrC) was proposed. Based on semantic analysis of unknown attribute values, the granule was extended to handle incomplete information. The maximum characteristic granule (MCG) was defined based on the characteristic relation, and the MCG was used to construct the resolution function matrix. The optimal generalized decision rule was introduced; using the basic equivalent forms of propositional logic, the rules were extracted and reduced from the incomplete information table. Combined with a fault diagnosis example of a power train, the application approach of the method was presented, and the validity of the method for knowledge acquisition was demonstrated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Shulin; Wang, Guo Ping, E-mail: gpwang@szu.edu.cn; College of Electronic Science and Technology, Shenzhen University, Shenzhen 518060
Classical nonlinear or quantum all-optical transistors are dependent on the value of input signal intensity or need extra co-propagating beams. In this paper, we present a kind of all-optical transistor constructed with parity-time (PT)-symmetric Y-junctions, which performs independently of the value of signal intensity in the unsaturated gain case and can also work after introducing saturated gain. Further, we show that the control signal can switch the device from amplification of peaks in time to transformation of peaks to amplified troughs. By using these PT-symmetric Y-junctions with currently available materials and technologies, we can implement interesting logic functions such as NOT and XOR (exclusive OR) gates, implying potential applications of such structures in designing optical logic gates, optical switches, and signal transformations or amplifications.
Active matter logic for autonomous microfluidics
NASA Astrophysics Data System (ADS)
Woodhouse, Francis G.; Dunkel, Jörn
2017-04-01
Chemically or optically powered active matter plays an increasingly important role in materials design, but its computational potential has yet to be explored systematically. The competition between energy consumption and dissipation imposes stringent physical constraints on the information transport in active flow networks, facilitating global optimization strategies that are not well understood. Here, we combine insights from recent microbial experiments with concepts from lattice-field theory and non-equilibrium statistical mechanics to introduce a generic theoretical framework for active matter logic. Highlighting conceptual differences with classical and quantum computation, we demonstrate how the inherent non-locality of incompressible active flow networks can be utilized to construct universal logical operations, Fredkin gates and memory storage in set-reset latches through the synchronized self-organization of many individual network components. Our work lays the conceptual foundation for developing autonomous microfluidic transport devices driven by bacterial fluids, active liquid crystals or chemically engineered motile colloids.
A Fuzzy Description Logic with Automatic Object Membership Measurement
NASA Astrophysics Data System (ADS)
Cai, Yi; Leung, Ho-Fung
In this paper, we propose a fuzzy description logic named f_om-DL by combining the classical view in cognitive psychology and fuzzy set theory. A formal mechanism used to determine object memberships in concepts automatically is also proposed, which is lacking in previous work on fuzzy description logics. In this mechanism, object membership is based on the defining properties of the concept definition and the properties in the object description. Moreover, while previous works cannot express the qualitative measurement of an object possessing a property, we introduce two kinds of properties, named N-property and L-property, which are quantitative and qualitative measurements, respectively, of an object possessing a property. The subsumption and implication of concepts and properties are also explored in our work. We believe this is useful to the Semantic Web community for reasoning about the fuzzy membership of objects in concepts in fuzzy ontologies.
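The idea of deriving an object's concept membership from property degrees can be sketched as follows. This is an illustrative fuzzy-set construction (minimum over a Goedel-style implication per defining property), not necessarily the actual f_om-DL mechanism; all names here are hypothetical:

```python
# Hypothetical sketch: a concept is defined by fuzzy degrees on its defining
# properties; an object's description assigns its own degrees. Membership is
# taken here as the minimum implication degree across defining properties
# (a common fuzzy-set choice, not the paper's exact mechanism).

def membership(concept_props, object_props):
    """Degree to which the object belongs to the concept."""
    degrees = []
    for prop, required in concept_props.items():
        possessed = object_props.get(prop, 0.0)
        # Goedel implication: fully satisfied if possessed >= required
        degrees.append(1.0 if possessed >= required else possessed)
    return min(degrees) if degrees else 1.0

bird = {"has_feathers": 0.9, "flies": 0.7}
penguin = {"has_feathers": 1.0, "flies": 0.1}
print(membership(bird, penguin))  # 0.1 -- limited by the "flies" property
```

The point of such a mechanism is that the modeler never assigns object memberships by hand; they follow automatically from the property degrees.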
Minimally inconsistent reasoning in Semantic Web.
Zhang, Xiaowang
2017-01-01
Reasoning with inconsistencies is an important issue for Semantic Web as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw as nontrivial conclusions by tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, which shows that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed, as a framework for multi-valued DL, to allow for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown on the same level as the (classical) description logic reasoning.
Kompa, K. L.; Levine, R. D.
2001-01-01
We propose a scheme for molecule-based information processing by combining well-studied spectroscopic techniques and recent results from chemical dynamics. Specifically, we discuss how optical transitions in single molecules can be used to rapidly perform classical (Boolean) logical operations. In the proposed way, a restricted number of states in a single molecule can act as a logical gate equivalent to at least two switches. It is argued that the four-level scheme can also be used to produce gain, because it allows an inversion, and not only a switching ability. The proposed scheme is quantum mechanical in that it takes advantage of the discrete nature of the energy levels, but here we discuss the temporal evolution using only the populations. On a longer time range we suggest that the same scheme could be extended to perform quantum logic, and a tentative suggestion, based on an available experiment, is discussed. We believe that the pumping can provide a partial proof of principle, although this and similar experiments were not interpreted thus far in our terms. PMID:11209046
NASA Astrophysics Data System (ADS)
Baumeler, Ämin; Feix, Adrien; Wolf, Stefan
2014-10-01
Quantum theory in a global spacetime gives rise to nonlocal correlations, which cannot be explained causally in a satisfactory way; this motivates the study of theories with reduced global assumptions. Oreshkov, Costa, and Brukner [Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076] proposed a framework in which quantum theory is valid locally but where, at the same time, no global spacetime, i.e., predefined causal order, is assumed beyond the absence of logical paradoxes. It was shown for the two-party case, however, that a global causal order always emerges in the classical limit. Quite naturally, it has been conjectured that the same also holds in the multiparty setting. We show that, counter to this belief, classical correlations locally compatible with classical probability theory exist that allow for deterministic signaling between three or more parties incompatible with any predefined causal order.
Perceptual Systems in Reading--The Prediction of a Temporal Eye-Voice Span Constant. Paper.
ERIC Educational Resources Information Center
GEYER, JOHN JACOB
A study was conducted to delineate how perception occurs during oral reading. From an analysis of classical and modern research, a heuristic model was constructed which delineated the directly interacting systems postulated as functioning during oral reading. The model as outlined was differentiated logically into three major processing…
Fuzzy health, illness, and disease.
Sadegh-Zadeh, K
2000-10-01
The notions of health, illness, and disease are fuzzy-theoretically analyzed. They present themselves as non-Aristotelian concepts violating basic principles of classical logic. A recursive scheme for defining the controversial notion of disease is proposed that also supports a concept of fuzzy disease. A sketch is given of the prototype resemblance theory of disease.
Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model
ERIC Educational Resources Information Center
Sandaire, Johnny
2009-01-01
A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…
The Suffolk County Department of Social Services Performance Study. An Executive Summary.
ERIC Educational Resources Information Center
Spottheim, David; Wilson, George R.
The logic and methodology applied in a management science approach to performance and staff utilization in the Client Benefits (CBA) and Community Service (CSA) divisions of the Suffolk County (New York) Department of Social Services (SCDSS) are described. Using a blend of classical organization theory and management science techniques, the CBA…
Predicate calculus for an architecture of multiple neural networks
NASA Astrophysics Data System (ADS)
Consoli, Robert H.
1990-08-01
Future projects with neural networks will require multiple individual network components. Current efforts along these lines are ad hoc. This paper relates the neural network to a classical device and derives a multi-part architecture from that model. Further, it provides a Predicate Calculus variant for describing the location and nature of the trainings, and suggests Resolution Refutation as a method for determining the performance of the system as well as the location of needed trainings for specific proofs. Recently, investigators have been reporting on architectures of multiple neural networks. These efforts are appearing at an early stage in neural network investigations; they are characterized by architectures suggested directly by the problem space. Touretzky and Hinton suggest an architecture for processing logical statements; the design of this architecture arises from the syntax of a restricted class of logical expressions and exhibits syntactic limitations. In similar fashion, multiple neural networks arise out of a control problem, from the sequence learning problem, and from the domain of machine learning. But a general theory of multiple neural devices is missing. More general attempts to relate single or multiple neural networks to classical computing devices are not common, although an attempt has been made to relate single neural devices to a Turing machine, and Sun et al. develop a multiple neural architecture that performs pattern classification.
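Resolution Refutation, the proof method suggested above, can be sketched for the propositional case: clauses are sets of literals, and deriving the empty clause shows the clause set is unsatisfiable. A minimal Python sketch (naive saturation, illustrative only):

```python
# Propositional resolution refutation: clauses are frozensets of literals;
# a positive int is an atom, its negation is the negative int.
# Deriving the empty clause proves the clause set unsatisfiable.

def resolve(c1, c2):
    """All resolvents of two clauses."""
    out = []
    for lit in c1:
        if -lit in c2:
            out.append((c1 - {lit}) | (c2 - {-lit}))
    return out

def refutes(clauses):
    """True if the empty clause is derivable (set is unsatisfiable)."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return True  # empty clause derived
                    new.add(frozenset(r))
        if new <= clauses:
            return False  # saturated without contradiction
        clauses |= new

# {A -> B, A, not B} is unsatisfiable: proving B from A -> B and A
# by refuting its negation.
print(refutes([{-1, 2}, {1}, {-2}]))  # True
```

To prove a goal, one adds its negation to the axioms and checks for a refutation, which is how the method would localize needed trainings for specific proofs.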
Gittelson, Simone; Kalafut, Tim; Myers, Steven; Taylor, Duncan; Hicks, Tacha; Taroni, Franco; Evett, Ian W; Bright, Jo-Anne; Buckleton, John
2016-01-01
The interpretation of complex DNA profiles is facilitated by a Bayesian approach. This approach requires the development of a pair of propositions: one aligned to the prosecution case and one to the defense case. This note explores the issue of proposition setting in an adversarial environment by a series of examples. A set of guidelines generalize how to formulate propositions when there is a single person of interest and when there are multiple individuals of interest. Additional explanations cover how to handle multiple defense propositions, relatives, and the transition from subsource level to activity level propositions. The propositions depend on case information and the allegations of each of the parties. The prosecution proposition is usually known. The authors suggest that a sensible proposition is selected for the defense that is consistent with their stance, if available, and consistent with a realistic defense if their position is not known. © 2015 American Academy of Forensic Sciences.
Zadeh, L A
2001-04-01
Interest in issues relating to consciousness has grown markedly during the last several years. And yet, nobody can claim that consciousness is a well-understood concept that lends itself to precise analysis. It may be argued that, as a concept, consciousness is much too complex to fit into the conceptual structure of existing theories based on Aristotelian logic and probability theory. An approach suggested in this paper links consciousness to perceptions and perceptions to their descriptors in a natural language. In this way, those aspects of consciousness which relate to reasoning and concept formation are linked to what is referred to as the methodology of computing with words (CW). Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language (e.g., small, large, far, heavy, not very likely, the price of gas is low and declining, Berkeley is near San Francisco, it is very unlikely that there will be a significant increase in the price of oil in the near future, etc.). Computing with words is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Familiar examples of such tasks are parking a car, driving in heavy traffic, playing golf, riding a bicycle, understanding speech, and summarizing a story. Underlying this remarkable capability is the brain's crucial ability to manipulate perceptions--perceptions of distance, size, weight, color, speed, time, direction, force, number, truth, likelihood, and other characteristics of physical and mental objects. Manipulation of perceptions plays a key role in human recognition, decision and execution processes. 
As a methodology, computing with words provides a foundation for a computational theory of perceptions: a theory which may have an important bearing on how humans make--and machines might make--perception-based rational decisions in an environment of imprecision, uncertainty, and partial truth. A basic difference between perceptions and measurements is that, in general, measurements are crisp, whereas perceptions are fuzzy. One of the fundamental aims of science has been and continues to be that of progressing from perceptions to measurements. Pursuit of this aim has led to brilliant successes. We have sent men to the moon; we can build computers that are capable of performing billions of computations per second; we have constructed telescopes that can explore the far reaches of the universe; and we can date the age of rocks that are millions of years old. But alongside the brilliant successes stand conspicuous underachievements and outright failures. We cannot build robots that can move with the agility of animals or humans; we cannot automate driving in heavy traffic; we cannot translate from one language to another at the level of a human interpreter; we cannot create programs that can summarize non-trivial stories; our ability to model the behavior of economic systems leaves much to be desired; and we cannot build machines that can compete with children in the performance of a wide variety of physical and cognitive tasks. It may be argued that underlying the underachievements and failures is the unavailability of a methodology for reasoning and computing with perceptions rather than measurements. An outline of such a methodology--referred to as a computational theory of perceptions--is presented in this paper. The computational theory of perceptions (CTP) is based on the methodology of CW. In CTP, words play the role of labels of perceptions, and, more generally, perceptions are expressed as propositions in a natural language. 
CW-based techniques are employed to translate propositions expressed in a natural language into what is called the Generalized Constraint Language (GCL). In this language, the meaning of a proposition is expressed as a generalized constraint, X isr R, where X is the constrained variable, R is the constraining relation, and isr is a variable copula in which r is an indexing variable whose value defines the way in which R constrains X. Among the basic types of constraints are possibilistic, veristic, probabilistic, random set, Pawlak set, fuzzy graph, and usuality. The wide variety of constraints in GCL makes GCL a much more expressive language than the language of predicate logic. In CW, the initial and terminal data sets, IDS and TDS, are assumed to consist of propositions expressed in a natural language. These propositions are translated, respectively, into antecedent and consequent constraints. Consequent constraints are derived from antecedent constraints through the use of rules of constraint propagation. The principal constraint propagation rule is the generalized extension principle. (ABSTRACT TRUNCATED)
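The possibilistic case of a generalized constraint "X isr R" can be illustrated minimally: the fuzzy relation R acts as a possibility distribution over values of X. A small sketch under that reading (degrees are made up for illustration; GCL itself covers many other values of r):

```python
# Possibilistic generalized constraint "X is R": the fuzzy set R,
# given as value -> membership degree, is read as a possibility
# distribution over the constrained variable X.

def possibility(R, x):
    """Possibility that X = x under the constraint X is R."""
    return R.get(x, 0.0)

# "the price of gas is low": fuzzy set over prices (illustrative degrees)
low_price = {2.0: 1.0, 3.0: 0.7, 4.0: 0.2}
print(possibility(low_price, 3.0))  # 0.7
```

Constraint propagation in CW then combines such distributions, e.g. via the generalized extension principle, to derive consequent constraints from antecedent ones.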
Fact or fallacy? Immunisation arguments in the New Zealand print media.
Petousis-Harris, Helen A; Goodyear-Smith, Felicity A; Kameshwar, Kamya; Turner, Nikki
2010-10-01
To explore New Zealand's four major daily newspapers' coverage of immunisation with regard to errors of fact and fallacy in the construction of immunisation-related arguments, all articles from 2002 to 2007 were assessed for errors of fact and logic. Fact was defined as that which was supported by the most current evidence-based medical literature. Errors of logic were assessed using a classical taxonomy broadly based on Aristotle's classifications. Numerous errors of both fact and logic were identified, predominantly used by anti-immunisation proponents, but occasionally by health authorities. The proportion of media articles reporting exclusively fact changes over the life of a vaccine: new vaccines incur little fallacious reporting, while established vaccines generate inaccurate claims. Fallacious arguments can be deconstructed and classified into a classical taxonomy including non sequitur and argumentum ad hominem. Most media 'balance' given to immunisation relies on 'he said, she said' arguments using quotes from opposing spokespersons, with a failure to verify the scientific validity of both the material and the source. Health professionals and media need training so that recognising and critiquing public health arguments becomes accepted practice: stronger public relations strategies should hold poor-quality articles to journalists' codes of ethics, and the health sector needs to be proactive in predicting and pre-empting the expected responses to the introduction of new public health initiatives such as a new vaccine. © 2010 The Authors. Journal Compilation © 2010 Public Health Association of Australia.
Obstacle avoidance handling and mixed integer predictive control for space robots
NASA Astrophysics Data System (ADS)
Zong, Lijun; Luo, Jianjun; Wang, Mingming; Yuan, Jianping
2018-04-01
This paper presents a novel obstacle avoidance constraint and a mixed integer predictive control (MIPC) method for space robots, enabling them to avoid obstacles and satisfy physical limits while performing tasks. Firstly, a novel obstacle avoidance constraint for space robots, which requires the assumption that the manipulator links and the obstacles can be represented by convex bodies, is proposed by limiting the relative velocity between the two closest points on the manipulator and the obstacle, respectively. Furthermore, logical variables are introduced into the obstacle avoidance constraint, so that the constraint form changes automatically to satisfy different obstacle avoidance requirements in different distance intervals between the space robot and the obstacle. Afterwards, the obstacle avoidance constraint and other physical limits of the system, such as joint angle ranges and the amplitude boundaries of joint velocities and joint torques, are described as inequality constraints of a quadratic programming (QP) problem by using the model predictive control (MPC) method. To guarantee the feasibility of the resulting multi-constraint QP problem, the constraints are treated as soft constraints and assigned levels of priority based on propositional logic theory, which ensures that constraints with lower priorities are always the first to be violated in order to restore the feasibility of the QP problem. Since logical variables have been introduced, the optimization problem including obstacle avoidance and system physical limits as prioritized inequality constraints is termed the MIPC method of space robots, and its computational complexity as well as possible strategies for reducing the amount of calculation are analyzed.
Simulations of the space robot unfolding its manipulator and tracking the end-effector's desired trajectories with the existence of obstacles and physical limits are presented to demonstrate the effectiveness of the proposed obstacle avoidance strategy and MIPC control method of space robots.
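The prioritized soft-constraint idea can be sketched in isolation: when the constraint set is infeasible, relax violated constraints starting from the lowest priority until a feasible set remains. This toy (pure Python, with made-up constraints) only illustrates the relaxation order; the actual method solves a mixed-integer QP:

```python
# Toy sketch of priority-ordered soft constraints: lower-priority
# constraints are dropped first to recover feasibility, mirroring the
# relaxation order described in the abstract (not the real MIQP solver).

def feasible_subset(constraints, x):
    """constraints: list of (priority, predicate); higher priority is kept longer.
    Returns the constraints kept after relaxing lowest priorities first."""
    kept = sorted(constraints, key=lambda c: -c[0])  # high priority first
    while kept and not all(pred(x) for _, pred in kept):
        kept.pop()  # discard the current lowest-priority constraint
    return kept

cons = [
    (3, lambda x: x >= 0),    # e.g. a joint-angle limit
    (2, lambda x: x <= 10),   # e.g. a velocity bound
    (1, lambda x: x >= 20),   # low priority, conflicts with x <= 10
]
kept = feasible_subset(cons, x=5)
print([p for p, _ in kept])  # [3, 2] -- the priority-1 constraint was relaxed
```

In the MPC setting the same ordering is encoded with logical variables and slack penalties inside the optimization rather than by an explicit drop loop.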
Boniolo, Giovanni; D'Agostino, Marcello; Di Fiore, Pier Paolo
2010-03-03
We propose a formal language that allows for transposing biological information precisely and rigorously into machine-readable information. This language, which we call Zsyntax (where Z stands for the Greek word ζωή, life), is grounded on a particular type of non-classical logic, and it can be used to write algorithms and computer programs. We present it as a first step towards a comprehensive formal language for molecular biology in which any biological process can be written and analyzed as a sort of logical "deduction". Moreover, we illustrate the potential value of this language, both in the field of text mining and in that of biological prediction.
Whence Structured Propositions?
ERIC Educational Resources Information Center
Keller, Lorraine Juliano
2012-01-01
This thesis is a critical examination of "Structured Propositionalism" (SP), the view that propositions are complex entities composed of the semantic values of the (meaningful) parts of the sentences that express them. According to SP, propositions have constituents and are individuated by the identity and arrangement of their…
Belief Inhibition in Children's Reasoning: Memory-Based Evidence
ERIC Educational Resources Information Center
Steegen, Sara; Neys, Wim De
2012-01-01
Adult reasoning has been shown as mediated by the inhibition of intuitive beliefs that are in conflict with logic. The current study introduces a classic procedure from the memory field to investigate belief inhibition in 12- to 17-year-old reasoners. A lexical decision task was used to probe the memory accessibility of beliefs that were cued…
A Pedagogical Approach to the Boltzmann Factor through Experiments and Simulations
ERIC Educational Resources Information Center
Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.
2009-01-01
The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that are exchanging energy with their environment. To understand why the expression has this specific form involves a deep mathematical analysis, whose flow of logic is hard to…
Fuzzy logic controller for the LOLA AO tip-tilt corrector system
NASA Astrophysics Data System (ADS)
Sotelo, Pablo D.; Flores, Ruben; Garfias, Fernando; Cuevas, Salvador
1998-09-01
At the Instituto de Astronomía we developed an adaptive optics system for the correction of the first two orders of the Zernike polynomials by measuring the image centroid. Here we discuss the two system modalities, based on two different control strategies, and we present simulations comparing the systems. For the classic control system we present telescope results.
Mass, Momentum and Kinetic Energy of a Relativistic Particle
ERIC Educational Resources Information Center
Zanchini, Enzo
2010-01-01
A rigorous definition of mass in special relativity, proposed in a recent paper, is recalled and employed to obtain simple and rigorous deductions of the expressions of momentum and kinetic energy for a relativistic particle. The whole logical framework appears as the natural extension of the classical one. Only the first, second and third laws of…
"Needle and Stick" Save the World: Sustainable Development and the Universal Child
ERIC Educational Resources Information Center
Dahlbeck, Johan; De Lucia Dahlbeck, Moa
2012-01-01
This text deals with a problem concerning processes of the productive power of knowledge. We draw on the so-called poststructural theories challenging the classical image of thought--as hinged upon a representational logic identifying entities in a rigid sense--when formulating a problem concerning the gap between knowledge and the object of…
The Logical Heart of a Classic Proof Revisited: A Guide to Godel's "Incompleteness" Theorems
ERIC Educational Resources Information Center
Padula, Janice
2011-01-01
The study of Kurt Godel's proof of the "incompleteness" of a formal system such as "Principia Mathematica" is a great way to stimulate students' thinking and creative processes and interest in mathematics and its important developments. This article describes salient features of the proof together with ways to deal with potential difficulties for…
Fell Running and Voluptuous Panic: On Caillois and Post-Sport Physical Culture
ERIC Educational Resources Information Center
Atkinson, Michael
2011-01-01
As many cultural groups in Western societies have become disaffected with mainstream sports cultures and their logics of practice, sociologists of sport and physical culture have turned their attention to the existential benefits of play and games. There is growing interest in revisiting and exploring the classic theories of play in society,…
[Argumentation and construction of validity in Carlos Matus' situational strategic planning].
Rivera, Francisco Javier Uribe
2011-09-01
This study analyzes the process of producing a situational plan according to a benchmark from the philosophy of language and argumentation theory. The basic approach used in the analysis was developed by Carlos Matus. Specifically, the study seeks to identify the inherent argumentative structure and patterns in the situational explanation and regulatory design in a plan's operations, taking argumentative approaches from pragma-dialectics and informal logic as the analytical parameters. The explanation of a health problem is used to illustrate the study. Methodologically, the study is based on the existing literature on the subject and case analyses. The study concludes with the proposition that the use of the specific references means introducing greater rigor into both the analysis of the validity of causal arguments and the design of proposals for interventions, in order for them to be more conclusive in achieving a plan's objectives.
NASA Astrophysics Data System (ADS)
Héraud, Jean-Loup; Lautesse, Philippe; Ferlin, Fabrice; Chabot, Hugues
2017-05-01
Our work extends a previous study of epistemological presuppositions in teaching quantum physics in upper scientific secondary school in France. Here, the problematic reference of quantum theory's concepts is treated at the ontological level (the counterintuitive nature of quantum objects). We consider the approach of using narratives describing possible alternative worlds to address the issue. These possible worlds are based on the counterfactual logic developed in the work of D. Lewis. We will show that the narratives written by G. Gamow describe such possible worlds. Some parts of these narratives are found in textbooks in France. These worlds are governed by laws similar to but importantly different from those in our real world. They allow us to materialize properties inaccessible to everyday experience. In this sense, these fiction stories make ontological propositions concerning the nature and structure of the fundamental elements of our physical universe.
Mixed-initiative control of intelligent systems
NASA Technical Reports Server (NTRS)
Borchardt, G. C.
1987-01-01
Mixed-initiative user interfaces provide a means by which a human operator and an intelligent system may collectively share the task of deciding what to do next. Such interfaces are important to the effective utilization of real-time expert systems as assistants in the execution of critical tasks. Presented here is the Incremental Inference algorithm, a symbolic reasoning mechanism based on propositional logic and suited to the construction of mixed-initiative interfaces. The algorithm is similar in some respects to the Truth Maintenance System, but replaces the notion of 'justifications' with a notion of recency, allowing newer values to override older values yet permitting various interested parties to refresh these values as they become older and thus more vulnerable to change. A simple example of the use of the Incremental Inference algorithm is given, along with an overview of the integration of this mechanism within the SPECTRUM expert system for geological interpretation of imaging spectrometer data.
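The recency mechanism described in the abstract can be sketched in a few lines. This is a hypothetical illustration only, not the SPECTRUM implementation; the class name, method names, and proposition name are all invented:

```python
import itertools

class RecencyStore:
    """Toy belief store in the spirit of the Incremental Inference
    algorithm: newer assertions override older ones, and any interested
    party may refresh a value to renew its timestamp."""
    _clock = itertools.count()

    def __init__(self):
        self._values = {}  # proposition -> (value, timestamp)

    def assert_value(self, prop, value):
        self._values[prop] = (value, next(self._clock))

    def refresh(self, prop):
        # Renew the timestamp without changing the value, making the
        # entry less vulnerable to being overridden as it ages.
        value, _ = self._values[prop]
        self._values[prop] = (value, next(self._clock))

    def get(self, prop):
        return self._values[prop][0]

    def timestamp(self, prop):
        return self._values[prop][1]

store = RecencyStore()
store.assert_value("runway_28_clear", True)
store.assert_value("runway_28_clear", False)  # the newer value wins
```

A later `refresh` renews an entry's timestamp without touching its value, which is the distinguishing feature relative to justification-based truth maintenance.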
Gupta, Vishal K; Han, Seonghee; Mortal, Sandra C; Silveri, Sabatino Dino; Turban, Daniel B
2018-02-01
We examine the glass cliff proposition that female CEOs receive more scrutiny than male CEOs, by investigating whether CEO gender is related to threats from activist investors in public firms. Activist investors are extraorganizational stakeholders who, when dissatisfied with some aspect of the way the firm is being managed, seek to change the strategy or operations of the firm. Although some have argued that women will be viewed more favorably than men in top leadership positions (so-called "female leadership" advantage logic), we build on role congruity theory to hypothesize that female CEOs are significantly more likely than male CEOs to come under threat from activist investors. Results support our predictions, suggesting that female CEOs may face additional challenges not faced by male CEOs. Practical implications and directions for future research are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
On the nature of explanation: A PDP approach
NASA Astrophysics Data System (ADS)
Churchland, Paul M.
1990-06-01
Neural network models of sensory processing and associative memory provide the resources for a new theory of what explanatory understanding consists in. That theory finds the theoretically important factors to reside not at the level of propositions and the relations between them, but at the level of the activation patterns across large populations of neurons. The theory portrays explanatory understanding, perceptual recognition, and abductive inference as being different instances of the same more general sort of cognitive achievement, viz. prototype activation. It thus effects a unification of the theories of explanation, perception, and ampliative inference. It also finds systematic unity in the wide diversity of types of explanation (causal, functional, mathematical, intentional, reductive, etc.), a chronic problem for theories of explanation in the logico-linguistic tradition. Finally, it is free of the many defects, both logical and psychological, that plague models in that older tradition.
Uncovering Knowledge of Core Syntactic and Semantic Principles in Individuals With Williams Syndrome
Musolino, Julien; Chunyo, Gitana; Landau, Barbara
2011-01-01
We investigate knowledge of core syntactic and semantic principles in individuals with Williams Syndrome (WS). Our study focuses on the logico-syntactic properties of negation and disjunction (or) and tests knowledge of (a) core syntactic relations (scope and c-command), (b) core semantic relations (entailment relations and DeMorgan’s laws of propositional logic), and (c) the relationship between (a) and (b). We examine the performance of individuals with WS, children matched for mental age (MA), and typical adult native speakers of English. Performance on all conditions suggests that knowledge of (a-c) is present and engaged in all three groups. Results also indicate slightly depressed performance on (c) for the WS group, compared to MA, consistent with limitation in processing resources. Implications of these results for competing accounts of language development in WS, as well as for the relevance of WS to the study of cognitive architecture and development are discussed. PMID:21866219
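One of the core semantic relations tested in the study, De Morgan's laws of propositional logic, can be checked exhaustively over the truth values. This generic sketch is not part of the study's materials:

```python
from itertools import product

def demorgan_holds():
    """Check both De Morgan laws over every assignment of truth values:
       not (A or B)  == (not A) and (not B)
       not (A and B) == (not A) or  (not B)"""
    return all(
        (not (a or b)) == ((not a) and (not b)) and
        (not (a and b)) == ((not a) or (not b))
        for a, b in product([True, False], repeat=2)
    )

print(demorgan_holds())  # True
```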
Miller, Lantz Fleming
2014-12-01
Philosophers, scientists, and other researchers have increasingly characterized humanity as having reached an epistemic and technical stage at which “we can control our own evolution.” Moral–philosophical analysis of this outlook reveals some problems, beginning with the vagueness of “we.” At least four glosses on “we” in the proposition “we, humanity, control our evolution” can be made: “we” is the bundle of all living humans, a leader guiding the combined species, each individual acting severally, or some mixture of these three involving a market interpretation of future evolutionary processes. While all of these glosses have difficulties under philosophical analysis, how we as a species handle our fate via technical developments is all-important. I propose our role herein should be understood as other than controllers of our evolution.
Köchy, Kristian
2010-03-01
In the 1920s and 1930s three different but simultaneous approaches to philosophy of science can be distinguished: the logical approach of the physicist Rudolf Carnap, the logico-historical approach of the psychologist Kurt Lewin and the socio-historical approach of the medical scientist Ludwik Fleck. While the philosophies of Lewin and Fleck can be characterized as contextual appraisals which account for the interactions between particular sciences and their historical, socio-cultural or intellectual environments, Carnap's philosophy is narrowed to an internal methodology centered on scientific propositions and logical structures in general. In addition to these differences in the aim and practice of methodological analysis, the three approaches differ in how they estimate the real disunity and diversity of the special branches of science. In contrast to Carnap's ideal of a unified science, the new pluralistic point of view grants philosophical acceptance to the empirical multiplicity of particular sciences.
A Critical Review of Proposition Analysis in Alzheimer's Research and Elsewhere
ERIC Educational Resources Information Center
King, James R.
2012-01-01
Propositional analysis of text, including the generation of proposition density ratios, is examined within the context of Alzheimer's research. A discussion of linguistic modularity raises questions regarding the outcomes of propositional analysis and its applications in Alzheimer's research. (Contains 1 figure.)
Narrative Abilities in Hearing-Impaired Children: Propositions and Cohesion.
ERIC Educational Resources Information Center
Griffith, Penny L.; And Others
1990-01-01
Two linguistic microstructures (propositions and cohesive devices) were analyzed in story recalls by 11 primary and intermediate level hearing-impaired students. When stories were very simple, students generated mostly complete propositions, however as complexity increased, semantic errors resulted in fewer complete propositions. (Author/DB)
Gruenfeld, D H; Wyer, R S
1992-01-01
Ss read either affirmations or denials of target propositions that ostensibly came from either newspapers or reference volumes. Denials of the validity of a proposition that was already assumed to be false increased Ss' beliefs in this proposition. The effect generalized to beliefs in related propositions that could be used to support the target's validity. When denials came from a newspaper, their "boomerang effect" was nearly equal in magnitude to the direct effect of affirming the target proposition's validity. When Ss were asked explicitly to consider the implications of the assertions, however, the impact of denials was eliminated. Affirmations of a target proposition that was already assumed to be true also had a boomerang effect. Results have implications for the effects of both semantic and pragmatic processing of assertions on belief change.
Modelling Of Anticipated Damage Ratio On Breakwaters Using Fuzzy Logic
NASA Astrophysics Data System (ADS)
Mercan, D. E.; Yagci, O.; Kabdasli, S.
2003-04-01
In breakwater design the determination of armour unit weight is especially important for the structure's service life. In a typical experimental breakwater stability study, different wave series composed of different wave height, wave period and wave steepness characteristics are applied in order to investigate the performance of the structure. Using a classical approach, a regression equation is generated for damage ratio as a function of characteristic wave height; the parameters wave period and wave steepness are not considered. In this study, differing from the classical approach, fuzzy logic was used to generate a relationship for damage ratio as a function of mean wave period (T_m), wave steepness (H_s/L_m) and significant wave height (H_s). The system's inputs were mean wave period (T_m), wave steepness (H_s/L_m) and significant wave height (H_s). For fuzzification, all input variables were divided into three fuzzy subsets, their membership functions were defined using the method developed by Mamdani (Mamdani, 1974), and the rules were written; for defuzzification, the centroid method was used. In order to calibrate and test the generated models an experimental study was conducted. The experiments were performed in a wave flume (24 m long, 1.0 m wide and 1.0 m high) using 20 different irregular wave series (P-M spectrum). Throughout the study, the water depth was 0.6 m and the breakwater cross-sectional slope was 1V/2H. In the armour layer, a type of artificial armour unit known as Antifer cubes was used. The results of the established fuzzy logic model and the regression equation model were compared with experimental data, and it was determined that the established fuzzy logic model gave a more accurate prediction of the damage ratio on this type of breakwater. References: Mamdani, E.H., "Application of Fuzzy Algorithms for Control of Simple Dynamic Plant", Proc. IEE, vol. 121, no. 12, December 1974.
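The Mamdani-style pipeline described above (fuzzify into three subsets, min implication, max aggregation, centroid defuzzification) can be sketched for a single input. The membership ranges, rules, and output scale below are hypothetical, and the study itself used three inputs rather than one:

```python
def tri(x, a, b, c):
    # Triangular membership function: feet at a and c, peak at b.
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def damage_ratio(hs):
    """Mamdani-style inference from significant wave height Hs (m)
    to a damage ratio (%), with invented membership ranges."""
    # Fuzzify the input into three subsets: low / medium / high Hs.
    mu_low = tri(hs, 0.00, 0.05, 0.10)
    mu_med = tri(hs, 0.05, 0.10, 0.15)
    mu_high = tri(hs, 0.10, 0.15, 0.20)
    # Discretized output universe: damage ratio 0..30 %.
    num = den = 0.0
    for i in range(301):
        d = i * 0.1
        # Min implication per rule, max aggregation across rules.
        agg = max(
            min(mu_low, tri(d, -12.0, 0.0, 12.0)),   # low Hs -> small damage
            min(mu_med, tri(d, 5.0, 15.0, 25.0)),    # medium Hs -> moderate damage
            min(mu_high, tri(d, 18.0, 30.0, 42.0)),  # high Hs -> severe damage
        )
        num += agg * d
        den += agg
    # Centroid defuzzification.
    return num / den if den else 0.0
```

With these invented sets, a fully "medium" wave height (Hs = 0.10 m) defuzzifies to the centre of the moderate-damage set, and larger wave heights map to larger damage ratios.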
SMART Grid Evaluation Using Fuzzy Numbers and TOPSIS
NASA Astrophysics Data System (ADS)
El Alaoui, Mohammed
2018-05-01
With the recent advent of smart grids, end-users aim to satisfy simultaneously low electricity bills and a reasonable level of comfort. While cost evaluation appears to be an easy task, capturing human preferences is more challenging. Here we propose the use of fuzzy logic and a modified version of the TOPSIS method to quantify end-users' preferences in a smart grid. While the classical smart grid focuses only on the technological side, it has been shown that smart grid effectiveness is strongly linked to end-users' behaviours. The main objective here is to involve smart grid users in order to achieve maximum satisfaction while preserving classical smart grid objectives.
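For orientation, here is the classical (crisp) TOPSIS procedure that the paper modifies; the fuzzy-number variant is not reproduced, and the smart-grid schedules and weights below are invented for illustration:

```python
import math

def topsis(matrix, weights, benefit):
    """Classical TOPSIS ranking.
    matrix: rows = alternatives, columns = criteria (columns nonzero);
    benefit[j]: True if criterion j is maximised (e.g. comfort),
    False if minimised (e.g. electricity cost)."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalise each column, then apply criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal and anti-ideal points depend on criterion direction.
    best = [max(v[i][j] for i in range(m)) if benefit[j]
            else min(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) if benefit[j]
             else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_best = math.sqrt(sum((v[i][j] - best[j]) ** 2 for j in range(n)))
        d_worst = math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_worst / (d_best + d_worst))  # closeness coefficient
    return scores

# Hypothetical schedules: [daily cost in EUR, comfort score].
schedules = [[4.0, 6.0], [6.0, 9.0], [3.0, 3.0]]
scores = topsis(schedules, weights=[0.5, 0.5], benefit=[False, True])
```

The alternative with the highest closeness coefficient is the preferred trade-off between cost and comfort.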
Error rates and resource overheads of encoded three-qubit gates
NASA Astrophysics Data System (ADS)
Takagi, Ryuji; Yoder, Theodore J.; Chuang, Isaac L.
2017-10-01
A non-Clifford gate is required for universal quantum computation, and, typically, this is the most error-prone and resource-intensive logical operation on an error-correcting code. Small, single-qubit rotations are popular choices for this non-Clifford gate, but certain three-qubit gates, such as Toffoli or controlled-controlled-Z (ccz), are equivalent options that are also more suited for implementing some quantum algorithms, for instance, those with coherent classical subroutines. Here, we calculate error rates and resource overheads for implementing logical ccz with pieceable fault tolerance, a nontransversal method for implementing logical gates. We provide a comparison with a nonlocal magic-state scheme on a concatenated code and a local magic-state scheme on the surface code. We find the pieceable fault-tolerance scheme particularly advantaged over magic states on concatenated codes and in certain regimes over magic states on the surface code. Our results suggest that pieceable fault tolerance is a promising candidate for fault tolerance in a near-future quantum computer.
Bidirectional automatic release of reserve for low voltage network made with low capacity PLCs
NASA Astrophysics Data System (ADS)
Popa, I.; Popa, G. N.; Diniş, C. M.; Deaconu, S. I.
2018-01-01
The article presents the design of a bidirectional automatic release of reserve made on two types of low-capacity programmable logic controllers: PS-3 from Klöckner-Moeller and Zelio from Schneider. It analyses the electronic timing circuits that can be used for making the bidirectional automatic release of reserve: a time-on delay circuit and two types of time-off delay circuit. The paper presents the code sequences for timing performed on the PS-3 PLC, the logical functions for the bidirectional automatic release of reserve, the classical control electrical diagram (with contacts, relays, and time relays), the electronic control diagram (with logical gates and timing circuits), the code (in IL language) made for the PS-3 PLC, and the code (in FBD language) made for the Zelio PLC. A comparative analysis is carried out on the use of the two types of PLC, and the advantages of using PLCs are presented.
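The time-on delay behaviour the article builds its timing circuits from can be sketched as a scan-cycle simulation. This is a generic TON model, not the PS-3 or Zelio code, and the preset counts cycles rather than real time:

```python
class OnDelayTimer:
    """Discrete-time sketch of a PLC time-on delay (TON) block: the
    output turns on only after the input has stayed on for `preset`
    consecutive scan cycles, and resets as soon as the input drops."""
    def __init__(self, preset):
        self.preset = preset
        self.elapsed = 0

    def scan(self, input_on):
        if input_on:
            self.elapsed = min(self.elapsed + 1, self.preset)
        else:
            self.elapsed = 0
        return self.elapsed >= self.preset

ton = OnDelayTimer(preset=3)
outputs = [ton.scan(True) for _ in range(4)]  # [False, False, True, True]
```

A time-off delay (TOF) block is the mirror image: the output stays on for the preset time after the input drops.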
Cognitive characteristics of learning Java, an object-oriented programming language
NASA Astrophysics Data System (ADS)
White, Garry Lynn
Industry and Academia are moving from procedural programming languages (e.g., COBOL) to object-oriented programming languages, such as Java for the Internet. Past studies in the cognitive aspects of programming have focused primarily on procedural programming languages. Some of the languages used have been Pascal, C, Basic, FORTRAN, and COBOL. Object-oriented programming (OOP) represents a new paradigm for computing. Industry is finding that programmers are having difficulty shifting to this new programming paradigm. Instruction in OOP is currently starting in colleges and universities across the country. What are the cognitive aspects of this new OOP language, Java? When is a student developmentally ready to handle the cognitive characteristics of the OOP language Java? Which cognitive teaching style is best for this OOP language Java? Questions such as these are the focus of this research. Such research is needed to improve understanding of the learning process and identify students' difficulties with OOP methods. This can enhance academic teaching and industry training (Scholtz, 1993; Sheetz, 1997; Rosson, 1990). Cognitive development as measured by the Propositional Logic Test, cognitive style as measured by the Hemispheric Mode Indicator, and physical hemispheric dominance as measured by a self-report survey were obtained from thirty-six university students studying Java programming. Findings reveal that physical hemispheric dominance is unrelated to cognitive and programming language variables. However, both procedural and object-oriented programming require Piaget's formal operation cognitive level, as indicated by the Propositional Logic Test. This is consistent with prior research. A new finding is that object-oriented programming also requires the formal operation cognitive level. Another new finding is that object-oriented programming appears to be unrelated to hemispheric cognitive style as indicated by the Hemispheric Mode Indicator (HMI).
This research suggests that object-oriented programming is friendly to either hemispheric thinking style, while procedural programming favours a left-hemispheric cognitive style. The conclusion is that cognitive characteristics are not the cause of the difficulty in shifting from procedural programming to the new paradigm of object-oriented programming. An alternative explanation for the difficulty is proactive interference: prior learning of procedural programming makes it harder to learn object-oriented programming. Further research is needed to determine whether proactive interference is the cause of the difficulty in shifting from procedural to object-oriented programming.
Debating Values: Key Issues in Formatting an Argumentative Case.
ERIC Educational Resources Information Center
Scott, David K.
This paper analyzes the components of an "ideal" debate using a non-policy proposition. It is argued that debates using non-policy propositions are currently plagued by a variety of problems. Value propositions on the college level are dissimilar to the value propositions used in high school Lincoln-Douglas debate. Many debaters are…
Debating Historical Propositions: Toward a Unique Genre of NEDA Debate.
ERIC Educational Resources Information Center
Scott, David K.
The best way to develop a unique identity for the National Education Debate Association (NEDA) is to debate propositions distinct from National Debate Tournament (NDT) and the Cross Examination Debate Association (CEDA). A neglected area of debate includes propositions temporally framed in the past. Yet, the present propositional categories of…
Propositional density and cognitive function in later life: findings from the Precursors Study.
Engelman, Michal; Agree, Emily M; Meoni, Lucy A; Klag, Michael J
2010-11-01
We used longitudinal data from the Johns Hopkins Precursors Study to test the hypothesis that written propositional density measured early in life is lower for people who develop dementia categorized as Alzheimer's disease (AD). This association was reported in 1996 for the Nun Study, and the Precursors Study offered an unprecedented chance to reexamine it among respondents with different gender, education, and occupation profiles. Eighteen individuals classified as AD patients (average age at diagnosis: 74) were assigned 2 sex-and-age matched controls, and propositional density in medical school admission essays (average age at writing: 22) was assessed via Computerized Propositional Idea Density Rater 3 linguistic analysis software. Adjusted odds ratios (ORs) for the matched case-control study were calculated using conditional (fixed-effects) logistic regression. Mean propositional density is lower for cases than for controls (4.70 vs. 4.99 propositions per 10 words, 1-sided p = .01). Higher propositional density substantially lowers the odds of AD (OR = 0.16, 95% confidence interval = 0.03-0.90, 1-sided p = .02). Propositional density scores in writing samples from early adulthood appear to predict AD in later life for men as well as women. Studies of cognition across the life course might beneficially incorporate propositional density as a potential marker of cognitive reserve.
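The density measure used in the study is propositions per 10 words, which makes the reported means directly computable from raw counts. The counts in the example below are illustrative only, not data from the study:

```python
def propositional_density(n_propositions, n_words):
    # Idea density expressed, as in the study, in propositions per 10 words.
    return 10.0 * n_propositions / n_words

# Illustrative counts: 47 propositions in a 100-word essay gives a
# density of 4.7, close to the reported mean for the AD cases (4.70).
density = propositional_density(47, 100)
```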
A "Sense-able" Approach to Classical Argument (Instructional Note).
ERIC Educational Resources Information Center
Beaman, Marian L.
1995-01-01
Describes two practices that make the writing of a persuasive essay more manageable for writing students: (1) a game in which students attempt to match a list of logical fallacies with an instance in Max Shulman's "Love Is a Fallacy"; and (2) a graphic arrangement that students can use to organize their ideas on one page before writing.…
ERIC Educational Resources Information Center
Eemeren, F. H. van; Grootendorst, R.
Suitable methods can be developed and instructional devices can be designed for the teaching of argumentation analysis to students of varying interests, ages, and capacities. Until 1950, the study of argumentation in the Netherlands was either purely practical or a continuation of the classic logic and rhetoric traditions. A number of new research…
Quantum structure of negation and conjunction in human thought
Aerts, Diederik; Sozzo, Sandro; Veloz, Tomas
2015-01-01
We analyze in this paper the data collected in a set of experiments investigating how people combine natural concepts. We study the mutual influence of conceptual conjunction and negation by measuring the membership weights of a list of exemplars with respect to two concepts, e.g., Fruits and Vegetables, and their conjunction Fruits And Vegetables, but also their conjunction when one or both concepts are negated, namely, Fruits And Not Vegetables, Not Fruits And Vegetables, and Not Fruits And Not Vegetables. Our findings sharpen and advance existing analysis on conceptual combinations, revealing systematic deviations from classical (fuzzy set) logic and probability theory. And, more important, our results give further considerable evidence to the validity of our quantum-theoretic framework for the combination of two concepts. Indeed, the representation of conceptual negation naturally arises from the general assumptions of our two-sector Fock space model, and this representation faithfully agrees with the collected data. In addition, we find a new significant and a priori unexpected deviation from classicality, which can exactly be explained by assuming that human reasoning is the superposition of an “emergent reasoning” and a “logical reasoning,” and that these two processes are represented in a Fock space algebraic structure. PMID:26483715
Zero Thermal Noise in Resistors at Zero Temperature
NASA Astrophysics Data System (ADS)
Kish, Laszlo B.; Niklasson, Gunnar A.; Granqvist, Claes-Göran
2016-06-01
The bandwidth of transistors in logic devices approaches the quantum limit, where Johnson noise and associated error rates are supposed to be strongly enhanced. However, the related theory — asserting a temperature-independent quantum zero-point (ZP) contribution to Johnson noise, which dominates the quantum regime — is controversial and resolution of the controversy is essential to determine the real error rate and fundamental energy dissipation limits of logic gates in the quantum limit. The Callen-Welton formula (fluctuation-dissipation theorem) of voltage and current noise for a resistance is the sum of Nyquist’s classical Johnson noise equation and a quantum ZP term with a power density spectrum proportional to frequency and independent of temperature. The classical Johnson-Nyquist formula vanishes at the approach of zero temperature, but the quantum ZP term still predicts non-zero noise voltage and current. Here, we show that this noise cannot be reconciled with the Fermi-Dirac distribution, which defines the thermodynamics of electrons according to quantum-statistical physics. Consequently, Johnson noise must be nil at zero temperature, and non-zero noise found for certain experimental arrangements may be a measurement artifact, such as the one mentioned in Kleen’s uncertainty relation argument.
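The disputed quantum correction can be made explicit. In its standard form (reproduced here from the general fluctuation-dissipation literature, not from the paper itself), the Callen-Welton voltage-noise spectrum of a resistance R(f) is

```latex
S_u(f) \;=\; 4 R(f)\, h f \left[ \frac{1}{e^{hf/kT} - 1} + \frac{1}{2} \right],
```

where the first (Planck) term reduces to Nyquist's classical $4kTR(f)$ for $hf \ll kT$ and vanishes as $T \to 0$, while the second, zero-point term is proportional to frequency and independent of temperature. It is this second term whose physical reality the paper contests.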
Antony S. Cheng; Linda E. Kruger; Steven E. Daniels
2003-01-01
This article lays out six propositions centering on a relationship between people-place connections and strategic behavior in natural resource politics. The first two propositions suggest a strong and direct connection between self-identity, place, and how individuals perceive and value the environment. The third, fourth, and fifth propositions tie together social group...
Memorisation methods in science education: tactics to improve the teaching and learning practice
NASA Astrophysics Data System (ADS)
Pals, Frits F. B.; Tolboom, Jos L. J.; Suhre, Cor J. M.; van Geert, Paul L. C.
2018-01-01
How can science teachers support students in developing an appropriate declarative knowledge base for solving problems? This article focuses on the question of whether the development of students' memory of scientific propositions is better served by writing propositions down on paper or by making drawings of propositions, either with silent or muttering rehearsal. By means of a memorisation experiment with eighth- and ninth-grade students, we answer this question. In this experiment, students received instruction to memorise nine science propositions and to reproduce them afterwards. To support memorisation, students were randomly assigned either to a group that received instruction to write each proposition on paper or to a group that received instruction to make a drawing about the content of the proposition. In addition, half of the students in both groups received instruction to mutter and the other half of them received instruction to write or draw in silence. The main conclusion from the experiment is that after four weeks students who had made a drawing remembered significantly more propositions than those who had memorised the propositions by writing them down. Our research further revealed that it did not matter whether students muttered or memorised silently.
Interpreting Hypernymic Propositions in an Online Medical Encyclopedia
Fiszman, Marcelo; Rindflesch, Thomas C.; Kilicoglu, Halil
2003-01-01
Interpretation of semantic propositions from biomedical texts documents would provide valuable support to natural language processing (NLP) applications. We are developing a methodology to interpret a kind of semantic proposition, the hypernymic proposition, in MEDLINE abstracts. In this paper, we expanded the system to identify these structures in a different discourse domain: the Medical Encyclopedia from the National Library of Medicine’s MEDLINEplus® Website. PMID:14728345
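A hypernymic proposition of the kind the system extracts can be illustrated with a generic "X such as Y" lexical pattern (a Hearst pattern). The actual system uses richer semantic processing of medical text; this sketch, with an invented example sentence, only shows the shape of the output proposition:

```python
import re

# Generic "X such as Y" / "X, such as Y" lexical pattern.
PATTERN = re.compile(r"(\w+),?\s+such as\s+(\w+)")

def hypernymic_propositions(sentence):
    """Return (hyponym, 'ISA', hypernym) triples found by the pattern."""
    return [(hypo, "ISA", hyper) for hyper, hypo in PATTERN.findall(sentence)]

props = hypernymic_propositions(
    "Antibiotics such as penicillin are used to treat bacterial infections.")
# e.g. [('penicillin', 'ISA', 'Antibiotics')]
```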
ERIC Educational Resources Information Center
Vaidya, Anand Jayprakash
2017-01-01
In this paper I develop a cross-cultural critique of contemporary critical thinking education in the United States, the United Kingdom, and those educational systems that adopt critical thinking education from the standard model used in the US and UK. The cross-cultural critique rests on the idea that contemporary critical thinking textbooks…
Negative Priming Effect after Inhibition of Weight/Number Interference in a Piaget-Like Task
ERIC Educational Resources Information Center
Schirlin, Olivier; Houde, Olivier
2007-01-01
Piagetian tasks have more to do with the child's ability to inhibit interference than they do with the ability to grasp their underlying logic. Here we used a chronometric paradigm with 11-year-olds, who succeed in Piaget's conservation-of-weight task, to test the role of cognitive inhibition in a priming version of this classical task. The…
An overview of the multi-database manipulation language MDSL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Litwin, W.; Abdellatif, A.
With the increase in availability of databases, the data needed by a user are frequently spread across separate autonomous databases. The logical properties of such data differ from the classical properties of data held in a single database. In particular, they call for new functions for data manipulation. MDSL is a new data manipulation language providing such functions. Most of the MDSL functions are not available in other languages.
Multi-bit dark state memory: Double quantum dot as an electronic quantum memory
NASA Astrophysics Data System (ADS)
Aharon, Eran; Pozner, Roni; Lifshitz, Efrat; Peskin, Uri
2016-12-01
Quantum dot clusters enable the creation of dark states which preserve electrons or holes in a coherent superposition of dot states for a long time. Various quantum logic devices can be envisioned to arise from the possibility of storing such trapped particles for future release on demand. In this work, we consider a double quantum dot memory device, which enables the preservation of a coherent state to be released as multiple classical bits. Our unique device architecture uses an external gating for storing (writing) the coherent state and for retrieving (reading) the classical bits, in addition to exploiting an internal gating effect for the preservation of the coherent state.
Superconducting Qubit with Integrated Single Flux Quantum Controller Part I: Theory and Fabrication
NASA Astrophysics Data System (ADS)
Beck, Matthew; Leonard, Edward, Jr.; Thorbeck, Ted; Zhu, Shaojiang; Howington, Caleb; Nelson, Jj; Plourde, Britton; McDermott, Robert
As the size of quantum processors grows, so do the classical control requirements. The single flux quantum (SFQ) Josephson digital logic family offers an attractive route to proximal classical control of multi-qubit processors. Here we describe coherent control of qubits via trains of SFQ pulses. We discuss the fabrication of an SFQ-based pulse generator and a superconducting transmon qubit on a single chip. Sources of excess microwave loss stemming from the complex multilayer fabrication of the SFQ circuit are discussed. We show how to mitigate this loss through judicious choice of process workflow and appropriate use of sacrificial protection layers. Present address: IBM T.J. Watson Research Center.
A One-System Theory Which is Not Propositional.
Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R
2009-04-01
We argue that the propositional and link-based approaches to human contingency learning represent different levels of analysis because propositional reasoning requires a basis, which is plausibly provided by a link-based architecture. Moreover, in their attempt to compare two general classes of models (link-based and propositional), Mitchell et al. have referred to only two generic models and ignored the large variety of different models within each class.
Fuzzy control of small servo motors
NASA Technical Reports Server (NTRS)
Maor, Ron; Jani, Yashvant
1993-01-01
To explore the benefits of fuzzy logic and understand the differences between the classical control methods and fuzzy control methods, the Togai InfraLogic applications engineering staff developed and implemented a motor control system for small servo motors. The motor assembly for testing the fuzzy and conventional controllers consists of a servo motor (RA13M) and an encoder with a range of 4096 counts. An interface card was designed and fabricated to interface the motor assembly and encoder to an IBM PC. The fuzzy logic based motor controller was developed using the TILShell and Fuzzy C Development System on an IBM PC. A Proportional-Derivative (PD) type conventional controller was also developed and implemented in the IBM PC to compare the performance with the fuzzy controller. Test cases were defined to include step inputs of 90 and 180 degrees rotation, sine and square wave profiles in the 5 to 20 hertz frequency range, as well as ramp inputs. In this paper we describe our approach to developing both the fuzzy and the PD controller, provide details of the hardware set-up and test cases, and discuss the performance results. In comparison, the fuzzy logic based controller handles the non-linearities of the motor assembly very well and provides excellent control over a broad range of parameters. Fuzzy technology, as indicated by our results, possesses inherent adaptive features.
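The fuzzy inference idea behind such a controller can be illustrated with a toy sketch (this is not the TILShell implementation; the membership breakpoints and the two rules are hypothetical): triangular membership functions over the position error, two rules, and a weighted-average defuzzification.

```python
# Toy fuzzy rule evaluation (hypothetical rules, not the paper's system):
# triangular membership over position error in degrees, two rules,
# defuzzified by a weighted average of the rule outputs.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_command(error):
    # Rule 1: if error is Negative then command is -1
    # Rule 2: if error is Positive then command is +1
    w_neg = tri(error, -180.0, -90.0, 0.0)
    w_pos = tri(error, 0.0, 90.0, 180.0)
    total = w_neg + w_pos
    return 0.0 if total == 0 else (-1.0 * w_neg + 1.0 * w_pos) / total

print(fuzzy_command(90.0))   # 1.0 (fully Positive)
print(fuzzy_command(-45.0))  # -1.0 (only the Negative rule fires)
```

A real controller would add membership sets for the error derivative (giving PD-like behavior) and smoother output sets; the point here is only the rule-firing-plus-defuzzification pattern.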
Statistical inference and Aristotle's Rhetoric.
Macdonald, Ranald R
2004-11-01
Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.
Flux qubit interaction with rapid single-flux quantum logic circuits: Control and readout
NASA Astrophysics Data System (ADS)
Klenov, N. V.; Kuznetsov, A. V.; Soloviev, I. I.; Bakurskiy, S. V.; Denisenko, M. V.; Satanin, A. M.
2017-07-01
We present the results of an analytical study and numerical simulation of the dynamics of a superconducting three-Josephson-junction (3JJ) flux qubit magnetically coupled with a rapid single-flux quantum (RSFQ) logic circuit, which demonstrate the fundamental possibility of implementing the simplest logic operations on picosecond timescales, as well as rapid non-destructive readout. It is shown that when solving optimization problems, the qubit dynamics can be conveniently interpreted as a precession of the magnetic moment vector around the direction of the magnetic field. In this case, the role of magnetic field components is played by combinations of the Hamiltonian matrix elements, and the role of the magnetic moment is played by the Bloch vector. Features of the 3JJ qubit model are discussed during the analysis of how the qubit is affected by exposure to a short control pulse, as are the similarities between the Bloch and Landau-Lifshitz-Gilbert equations. An analysis of solutions to the Bloch equations made it possible to develop recommendations for the use of readout RSFQ circuits in implementing an optimal interface between the classical and quantum parts of the computer system, as well as to justify the use of single-quantum logic in order to control superconducting quantum circuits on a chip.
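The precession analogy mentioned above can be made explicit in standard textbook notation (this is the generic form, not the paper's own notation): the Bloch vector precesses about an effective field built from the Hamiltonian matrix elements, which is the same structure as the Landau-Lifshitz-Gilbert equation for a magnetic moment.

```latex
% Dissipationless Bloch precession vs. Landau--Lifshitz--Gilbert dynamics
% (textbook forms; \boldsymbol{\Omega} collects the Hamiltonian matrix elements).
\begin{align}
  \frac{d\mathbf{S}}{dt} &= \boldsymbol{\Omega}\times\mathbf{S}
  && \text{(Bloch vector, no dissipation)}\\
  \frac{d\mathbf{M}}{dt} &= -\gamma\,\mathbf{M}\times\mathbf{H}_{\mathrm{eff}}
    + \frac{\alpha}{M_s}\,\mathbf{M}\times\frac{d\mathbf{M}}{dt}
  && \text{(Landau--Lifshitz--Gilbert)}
\end{align}
```

With damping terms set to zero, both equations describe rigid precession about a fixed axis, which is why the qubit dynamics under a short control pulse can be read off from the magnetics picture.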
Integrating a Hypernymic Proposition Interpreter into a Semantic Processor for Biomedical Texts
Fiszman, Marcelo; Rindflesch, Thomas C.; Kilicoglu, Halil
2003-01-01
Semantic processing provides the potential for producing high quality results in natural language processing (NLP) applications in the biomedical domain. In this paper, we address a specific semantic phenomenon, the hypernymic proposition, and concentrate on integrating the interpretation of such predications into a more general semantic processor in order to improve overall accuracy. A preliminary evaluation assesses the contribution of hypernymic propositions in providing more specific semantic predications and thus improving effectiveness in retrieving treatment propositions in MEDLINE abstracts. Finally, we discuss the generalization of this methodology to additional semantic propositions as well as other types of biomedical texts. PMID:14728170
NASA Astrophysics Data System (ADS)
Vijayakumar, P.; Ramasamy, P.
2016-08-01
AgGa0.5In0.5Se2 single crystal was grown using the modified vertical Bridgman method. The structural perfection of the AgGa0.5In0.5Se2 single crystal has been analyzed by high-resolution X-ray diffraction rocking curve measurements. The structural and compositional uniformities of AgGa0.5In0.5Se2 were studied using Raman scattering spectroscopy at room temperature. The FWHM of the Γ1 (W1) and Γ5L (Γ15) measured at different regions of the crystal confirms that the composition throughout its length is fairly uniform. Thermal properties of the as-grown crystal, including specific heat, thermal diffusivity and thermal conductivity, have been investigated. The multiple-shot surface laser damage threshold value was measured using a Nd:YAG laser. Photoconductivity measurements at different temperatures have confirmed the positive photoconducting behavior. Second harmonic generation (SHG) on powder samples has been measured using the Kurtz and Perry technique and the results show that AgGa0.5In0.5Se2 is a phase-matchable NLO material. The hardness behavior has been measured using Vickers microhardness measurement and the indentation size effect has been observed. The classical Meyer's law, the proportional specimen resistance model and the modified proportional specimen resistance model have been used to analyse the microhardness behavior.
Micro-Macro Duality and Space-Time Emergence
NASA Astrophysics Data System (ADS)
Ojima, Izumi
2011-03-01
The microscopic origin of space-time geometry is explained on the basis of an emergence process associated with the condensation of an infinite number of microscopic quanta responsible for symmetry breakdown, which implements the basic essence of "Quantum-Classical Correspondence" and of the forcing method in physical and mathematical contexts, respectively. From this viewpoint, the space-time dependence of physical quantities arises from the "logical extension" [8] to change "constant objects" into "variable objects" by tagging the order parameters associated with the condensation onto "constant objects"; the logical direction here from a value y to a domain variable x (to materialize the basic mechanism behind the Gel'fand isomorphism) is just opposite to that common in the usual definition of a function ƒ : x⟼ƒ(x) from its domain variable x to a value y = ƒ(x).
Synthesis of energy-efficient FSMs implemented in PLD circuits
NASA Astrophysics Data System (ADS)
Nawrot, Radosław; Kulisz, Józef; Kania, Dariusz
2017-11-01
The paper presents an outline of a simple synthesis method of energy-efficient FSMs. The idea consists in using local clock gating to selectively block the clock signal, if no transition of a state of a memory element is required. The research was dedicated to logic circuits using Programmable Logic Devices as the implementation platform, but the conclusions can be applied to any synchronous circuit. The experimental section reports a comparison of three methods of implementing sequential circuits in PLDs with respect to clock distribution: the classical fully synchronous structure, the structure exploiting the Enable Clock inputs of memory elements, and the structure using clock gating. The results show that the approach based on clock gating is the most efficient one, and it leads to significant reduction of dynamic power consumed by the FSM.
Formal Modeling of Multi-Agent Systems using the Pi-Calculus and Epistemic Logic
NASA Technical Reports Server (NTRS)
Rorie, Toinette; Esterline, Albert
1998-01-01
Multi-agent systems have become important recently in computer science, especially in artificial intelligence (AI). We allow a broad sense of agent, but require at least that an agent has some measure of autonomy and interacts with other agents via some kind of agent communication language. We are concerned in this paper with formal modeling of multi-agent systems, with emphasis on communication. We propose for this purpose to use the pi-calculus, an extension of the process algebra CCS. Although the literature on the pi-calculus refers to agents, the term is used there in the sense of a process in general. It is our contention, however, that viewing agents in the AI sense as agents in the pi-calculus sense affords significant formal insight. One formalism that has been applied to agents in the AI sense is epistemic logic, the logic of knowledge. The success of epistemic logic in computer science in general has come in large part from its ability to handle concepts of knowledge that apply to groups. We maintain that the pi-calculus affords a natural yet rigorous means by which groups that are significant to epistemic logic may be identified, encapsulated, structured into hierarchies, and restructured in a principled way. This paper is organized as follows: Section 2 introduces the pi-calculus; Section 3 takes a scenario from the classical paper on agent-oriented programming [Sh93] and translates it into a very simple subset of the pi-calculus; Section 4 then shows how more sophisticated features of the pi-calculus may be brought into play; Section 5 discusses how the pi-calculus may be used to define groups for epistemic logic; and Section 6 is the conclusion.
Brandt, Silke; Buttelmann, David; Lieven, Elena; Tomasello, Michael
2016-11-01
De Villiers (Lingua, 2007, Vol. 117, pp. 1858-1878) and others have claimed that children come to understand false belief as they acquire linguistic constructions for representing a proposition and the speaker's epistemic attitude toward that proposition. In the current study, English-speaking children of 3 and 4 years of age (N=64) were asked to interpret propositional attitude constructions with a first- or third-person subject of the propositional attitude (e.g., "I think the sticker is in the red box" or "The cow thinks the sticker is in the red box", respectively). They were also assessed for an understanding of their own and others' false beliefs. We found that 4-year-olds showed a better understanding of both third-person propositional attitude constructions and false belief than their younger peers. No significant developmental differences were found for first-person propositional attitude constructions. The older children also showed a better understanding of their own false beliefs than of others' false beliefs. In addition, regression analyses suggest that the older children's comprehension of their own false beliefs was mainly related to their understanding of third-person propositional attitude constructions. These results indicate that we need to take a closer look at the propositional attitude constructions that are supposed to support children's false-belief reasoning. Children may come to understand their own and others' beliefs in different ways, and this may affect both their use and understanding of propositional attitude constructions and their performance in various types of false-belief tasks. Copyright © 2016 Elsevier Inc. All rights reserved.
Risk assessment for carcinogens under California's Proposition 65
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pease, W.S.; Zeise, L.; Kelter, A.
1990-06-01
Risk assessments for carcinogens are being developed through an accelerated process in California as a part of the state's implementation of Proposition 65, the Safe Drinking Water and Toxic Enforcement Act. Estimates of carcinogenic potency made by the California Department of Health Services (CDHS) are generally similar to estimates made by the U.S. Environmental Protection Agency (EPA). The largest differences are due to EPA's use of the maximum likelihood estimate instead of CDHS' use of the upper 95% confidence bounds on potencies derived from human data and to procedures used to correct for studies of short duration or with early mortality. Numerical limits derived from these potency estimates constitute no significant risk levels, which govern exemption from Proposition 65's discharge prohibition and warning requirements. Under Proposition 65 regulations, lifetime cancer risks less than 10^-5 are not significant and cumulative intake is not considered. Following these regulations, numerical limits for a number of Proposition 65 carcinogens that are applicable to the control of toxic discharges are less stringent than limits under existing federal water pollution control laws. Thus, existing federal limits will become the Proposition 65 levels for discharge. Chemicals currently not covered by federal and state controls will eventually be subject to discharge limitations under Proposition 65. No significant risk levels (expressed in terms of daily intake of carcinogens) also trigger warning requirements under Proposition 65 that are more extensive than existing state or federal requirements. A variety of chemical exposures from multiple sources are identified that exceed Proposition 65's no significant risk levels.
Some comments on Dr Iglesias's paper, 'In vitro fertilisation: the major issues'.
Mill, J M
1986-01-01
In an article in an earlier edition of the Journal of Medical Ethics (1) Dr Iglesias bases her analysis upon the mediaeval interpretation of Platonic metaphysics and Aristotelian logic as given by Aquinas. Propositional forms are applied to the analysis of experience. This results in a very abstract analysis. The essential connection of events and their changing temporal relationships are ignored. The dichotomy between body and soul is a central concept. The unchanging elements in experience are assumed to be more real than the actual world of experienced process. Such a view makes the analysis of the temporal factors in experience impossible. Its abstractness is quite unsuitable for the analysis of the ontological structure and development of the neonate from fertilisation to birth. A N Whitehead made the notion of organism central to his philosophy. He refused to place human experience outside nature, or admit dualism. His philosophy of organism is an attempt to uncover the essential elements connecting human experience with the physical and biological sciences. Time, change and process are, in his view, more real than the static abstractions obtainable by the use of the fallacy of misplaced concreteness. Use of the latter negates the essential connectedness of events and the importance of temporality and change (2). In this paper I argue that the embryo, being an organism, is not analysable in terms of thinghood. It is a process. To apply Aristotelian logical concepts to it is to distort the real nature of the datum. PMID:3959039
A reconfigurable cryogenic platform for the classical control of quantum processors
NASA Astrophysics Data System (ADS)
Homulle, Harald; Visser, Stefan; Patra, Bishnu; Ferrari, Giorgio; Prati, Enrico; Sebastiano, Fabio; Charbon, Edoardo
2017-04-01
The implementation of a classical control infrastructure for large-scale quantum computers is challenging due to the need for integration and processing time, which is constrained by coherence time. We propose a cryogenic reconfigurable platform as the heart of the control infrastructure implementing the digital error-correction control loop. The platform is implemented on a field-programmable gate array (FPGA) that supports the functionality required by several qubit technologies and that can operate close to the physical qubits over a temperature range from 4 K to 300 K. This work focuses on the extensive characterization of the electronic platform over this temperature range. All major FPGA building blocks (such as look-up tables (LUTs), carry chains (CARRY4), mixed-mode clock manager (MMCM), phase-locked loop (PLL), block random access memory, and IDELAY2 (programmable delay element)) operate correctly and the logic speed is very stable. The logic speed of LUTs and CARRY4 changes by less than 5%, whereas the jitter of MMCM and PLL clock managers is reduced by 20%. The stability is finally demonstrated by operating an integrated 1.2 GSa/s analog-to-digital converter (ADC) with relatively stable performance over temperature. The ADC's effective number of bits drops from 6 to 4.5 bits when operating at 15 K.
Comparison of four approaches to a rock facies classification problem
Dubois, M.K.; Bohling, Geoffrey C.; Chakrabarti, S.
2007-01-01
In this study, seven classifiers based on four different approaches were tested in a rock facies classification problem: classical parametric methods using Bayes' rule, and non-parametric methods using fuzzy logic, k-nearest neighbor, and a feed-forward back-propagating artificial neural network. Determining the most effective classifier for geologic facies prediction in wells without cores in the Panoma gas field, in Southwest Kansas, was the objective. Study data include 3600 samples with known rock facies class (from core), with each sample having either four or five measured properties (wire-line log curves) and two derived geologic properties (geologic constraining variables). The sample set was divided into two subsets, one for training and one for testing the ability of the trained classifier to correctly assign classes. Artificial neural networks clearly outperformed all other classifiers and are effective tools for this particular classification problem. Classical parametric models were inadequate due to the nature of the predictor variables (high dimensional and not linearly correlated) and the feature space of the classes (overlapping). The other non-parametric methods tested, k-nearest neighbor and fuzzy logic, would need considerable improvement to match the neural network effectiveness, but further work, possibly combining certain aspects of the three non-parametric methods, may be justified. © 2006 Elsevier Ltd. All rights reserved.
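The simplest of the non-parametric approaches compared above, nearest-neighbor classification, can be sketched in a few lines (the feature values and facies labels below are made up for illustration; the study used wire-line log curves and geologic constraining variables):

```python
# Minimal 1-nearest-neighbour classifier (pure Python, hypothetical data):
# assign a query sample the facies label of its closest training sample
# in Euclidean distance.

import math

def nearest_neighbor(train, query):
    """train: list of (feature_vector, facies_label) pairs.
    Returns the label of the training sample closest to query."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(train, key=lambda sample: dist(sample[0], query))[1]

# Two normalized log-curve features per sample (invented values).
train = [((0.1, 0.2), "shale"), ((0.9, 0.8), "limestone")]
print(nearest_neighbor(train, (0.85, 0.75)))  # limestone
```

In practice k > 1 neighbors with majority voting, feature normalization, and the geologic constraining variables would all matter; this sketch only shows why the method needs no parametric model of the overlapping class distributions.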
The quality-value proposition in health care.
Feazell, G Landon; Marren, John P
2003-01-01
Powerful forces are converging in US health care to finally cause recognition of the inherently logical relationship between quality and money. The forces, or marketplace "drivers," which are converging to compel recognition of the relationship between cost and quality are: (1) the increasing costs of care; (2) the recurrence of another medical malpractice crisis; and (3) the recognition inside and outside of health care that quality is inconsistent and unacceptable. It is apparent that hospital administrators, financial officers, board members, and medical staff leadership do not routinely do two things: (1) relate quality to finance; and (2) appreciate the intra-hospital structural problems that impede quality attainment. This article discusses these factors and offers a positive method for re-structuring quality efforts and focusing the hospital and its medical staff on quality. The simple but compelling thesis of the authors is that health care must immediately engage in the transformation to making quality of medical care the fundamental business strategy of the organization.
Leary, Mark R
2004-05-01
By applying different standards of evidence to sociometer theory than to terror management theory (TMT), T. Pyszczynski, J. Greenberg, S. Solomon, J. Arndt, and J. Schimel's (2004) review offers an imbalanced appraisal of the theories' merits. Many of Pyszczynski et al.'s (2004) criticisms of sociometer theory apply equally to TMT, and others are based on misconstruals of the theory or misunderstandings regarding how people respond when rejected. Furthermore, much of their review is only indirectly relevant to TMT's position on the function of self-esteem, and the review fails to acknowledge logical and empirical challenges to TMT. A more balanced review suggests that each theory trumps the other in certain respects, both have difficulty explaining all of the evidence regarding self-esteem, and the propositions of each theory can be roughly translated into the concepts of the other. For these reasons, declaring a theoretical winner at this time is premature. ((c) 2004 APA, all rights reserved)
Lessons Learned from using a Livingstone Model to Diagnose a Main Propulsion System
NASA Technical Reports Server (NTRS)
Sweet, Adam; Bajwa, Anupa
2003-01-01
NASA researchers have demonstrated that qualitative, model-based reasoning can be used for fault detection in a Main Propulsion System (MPS), a complex, continuous system. At the heart of this diagnostic system is Livingstone, a discrete, propositional logic-based inference engine. Livingstone comprises a language for specifying a discrete model of the system and a set of algorithms that use the model to track the system's state. Livingstone uses the model to test assumptions about the state of a component - observations from the system are compared with values predicted by the model. The intent of this paper is to summarize some advantages of Livingstone seen through our modeling experience: for instance, flexibility in modeling, speed and maturity. We also describe some shortcomings we perceived in the implementation of Livingstone, such as modeling continuous dynamics and handling of transients. We list some upcoming enhancements to the next version of Livingstone that may resolve some of the current limitations.
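The core Livingstone idea described above, tracking discrete component modes and comparing model predictions against observations, can be illustrated with a toy consistency check (this is a hypothetical two-valve fragment, not Livingstone's actual model or algorithms):

```python
# Toy model-based diagnosis in the Livingstone spirit (illustrative only):
# each component has discrete modes, each mode assignment yields a
# propositional prediction, and diagnosis enumerates the assignments
# consistent with the observation.

from itertools import product

# Hypothetical two-valve fragment of a propulsion feed line.
MODES = {"valve1": ["open", "stuck-closed"],
         "valve2": ["open", "stuck-closed"]}

def predict(assignment, inlet_pressure=True):
    """Flow reaches the engine only if both valves are in 'open' mode."""
    flow = inlet_pressure and all(m == "open" for m in assignment.values())
    return {"flow_at_engine": flow}

def diagnose(observation):
    """Return every mode assignment whose prediction matches the observation."""
    candidates = []
    for modes in product(*MODES.values()):
        assignment = dict(zip(MODES, modes))
        if predict(assignment) == observation:
            candidates.append(assignment)
    return candidates

# No flow observed: every assignment with at least one stuck valve is consistent.
print(diagnose({"flow_at_engine": False}))
```

Livingstone itself uses conflict-directed search rather than brute-force enumeration, and ranks candidates by mode-transition likelihood; the sketch only shows the prediction-vs-observation consistency test at the heart of the approach.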
Interval-type and affine arithmetic-type techniques for handling uncertainty in expert systems
NASA Astrophysics Data System (ADS)
Ceberio, Martine; Kreinovich, Vladik; Chopra, Sanjeev; Longpre, Luc; Nguyen, Hung T.; Ludascher, Bertram; Baral, Chitta
2007-02-01
Expert knowledge consists of statements Sj (facts and rules). The facts and rules are often only true with some probability. For example, if we are interested in oil, we should look at seismic data. If in 90% of the cases the seismic data were indeed helpful in locating oil, then we can say that if we are interested in oil, then with probability 90% it is helpful to look at the seismic data. In more formal terms, the implication "if oil then seismic" holds with probability 90%. Another example: a bank A trusts a client B, so if we trust the bank A, we should trust B too; if statistically this trust was justified in 99% of the cases, we can conclude that the corresponding implication holds with probability 99%. If a query Q is deducible from facts and rules, what is the resulting probability p(Q)? We can describe the truth of Q as a propositional formula F in terms of the Sj, i.e., as a combination of statements Sj linked by operators like &, ∨, and ¬; computing p(Q) exactly is NP-hard, so heuristics are needed. Traditionally, expert systems use a technique similar to straightforward interval computations: we parse F and replace each computation step with the corresponding probability operation. Problem: at each step, we ignore the dependence between the intermediate results Fj; hence the intervals are too wide. Example: the estimate for P(A ∨ ¬A) is not 1. Solution: similar to affine arithmetic, besides P(Fj), we also compute P(Fj & Fi) (or P(Fj1 & ... & Fjd)), and on each step use all combinations of these probabilities to get new estimates. Results: e.g., P(A ∨ ¬A) is estimated as 1.
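The interval-widening problem described above can be reproduced in a few lines. The sketch below (my own illustration, not the paper's algorithm) uses the standard Fréchet-style bounds for a disjunction under unknown dependence, then shows how tracking the joint probability, the affine-arithmetic-style refinement, recovers the exact answer for A ∨ ¬A:

```python
# Step-by-step interval bounds for P(F1 or F2) under unknown dependence
# (Fréchet bounds), illustrating why the naive estimate of P(A or not-A)
# is an interval rather than exactly 1.

def or_bounds(p1, p2):
    """Interval [lo, hi] for P(F1 or F2) when the dependence of F1, F2 is unknown."""
    return (max(p1, p2), min(1.0, p1 + p2))

def not_prob(p):
    return 1.0 - p

p_a = 0.9
lo, hi = or_bounds(p_a, not_prob(p_a))   # naive step-by-step propagation
print((lo, hi))  # (0.9, 1.0): the interval is too wide

# Tracking the joint probability P(A and not-A) = 0 tightens the result:
# P(A or not-A) = P(A) + P(not-A) - P(A and not-A).
p_joint = 0.0
exact = p_a + not_prob(p_a) - p_joint
print(exact)  # 1.0 exactly
```

This is the smallest instance of the paper's point: keeping pairwise joints P(Fj & Fi) alongside the marginals P(Fj) rules out dependence patterns that the purely interval-style propagation cannot exclude.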
Fareed, Naleef; Mick, Stephen S
2011-01-01
For almost a decade, public and private organizations have pressured hospitals to improve their patient safety records. Since 2008, the Centers for Medicare & Medicaid Services has no longer been reimbursing hospitals for secondary diagnoses not reported during the point of admission. This ruling has motivated some hospitals to engage in safety-oriented programs to decrease adverse events. This study examined which hospitals may engage in patient safety solutions and whether they create these patient safety solutions within their structures or use suppliers in the market. We used a theoretical model that incorporates the key constructs of resource dependence theory and transaction cost economics theory to predict a hospital's reaction to Centers for Medicare & Medicaid Services "never event" regulations. We present propositions that speculate on how forces conceptualized from the resource dependence theory may affect adoption of patient safety innovations and, when they do, whether the adopting hospitals will do so internally or externally according to the transaction cost economics theory. On the basis of forces identified by the resource dependence theory, we predict that larger, teaching, safety net, horizontally integrated, highly interdependent, and public hospitals in concentrated, high public payer presence, competitive, and resource-rich environments will be more likely to engage in patient safety innovations. Following the logic of the transaction cost economics theory, we predict that of the hospitals that react positively to the never event regulation, most will internalize their innovations in patient safety solutions rather than approach the market, a choice that helps hospitals economize on transaction costs. This study helps hospital managers in their strategic thinking and planning in relation to current and future regulations related to patient safety. For researchers and policy analysts, our propositions provide the basis for empirical testing.
Parents, peer groups, and other socializing influences.
Vandell, D L
2000-11-01
Three propositions that are central to J. R. Harris's group socialization theory (1995, 1998) are considered in this review. These propositions are as follows: (a) Parental behaviors have no long-term effects on children's psychological characteristics, (b) peer groups are the primary environmental influence on psychological functioning, and (c) dyadic relationships are situation-specific and do not generalize. The evidence that J. R. Harris has outlined in support of each of these propositions is reviewed, as is additional empirical research not considered by J. R. Harris. Serious limitations to each proposition are identified. The available evidence is more consistent with a model of multiple socialization agents. An expanded research agenda that permits a more definitive test of J. R. Harris's propositions and social relationship theory is proposed.
Curriculum Design and Epistemic Ascent
ERIC Educational Resources Information Center
Winch, Christopher
2013-01-01
Three kinds of knowledge usually recognised by epistemologists are identified and their relevance for curriculum design is discussed. These are: propositional knowledge, know-how and knowledge by acquaintance. The inferential nature of propositional knowledge is argued for and it is suggested that propositional knowledge in fact presupposes the…
Participatory Democracy and Budgeting: The Effects of Proposition 13.
ERIC Educational Resources Information Center
McCaffery, Jerry; Bowman, John H.
1978-01-01
The complexities associated with Proposition 13 provide a lesson in the hazards of fiscal policy-making through direct voter participation. While the full effects of Proposition 13 are not yet known, it is clear that it has reshaped California local government finance overnight. (Author)
Logics for Coalgebras of Finitary Set Functors
NASA Astrophysics Data System (ADS)
Sprunger, David
In this thesis, we present a collection of results about coalgebras of finitary Set functors. Our chief contribution is a logic for behavioral equivalence for states in these coalgebras. This proof system is intended to formalize a pattern of reasoning common in the study of coalgebras, known as proof by bisimulation or bisimulation up-to. The approach in this thesis combines these up-to techniques with a concept very close to bisimulation to show the proof system is sound and complete with respect to behavioral equivalence. Our second category of contributions revolves around applications of coalgebra to the study of sequences and power series. The culmination of this work is a new approach to Christol's Theorem, a classic result characterizing the algebraic power series in finite characteristic rings as those whose coefficients can be produced by finite automata.
Fragments of Science: Festschrift for Mendel Sachs
NASA Astrophysics Data System (ADS)
Ram, Michael
1999-11-01
The Table of Contents for the full book PDF is as follows: * Preface * Sketches at a Symposium * For Mendel Sachs * The Constancy of an Angular Point of View * Information-Theoretic Logic and Transformation-Theoretic Logic * The Invention of the Transistor and the Realization of the Hole * Mach's Principle, Newtonian Gravitation, Absolute Space, and Einstein * The Sun, Our Variable Star * The Inconstant Sun: Symbiosis of Time Variations of Sunspots, Atmospheric Radiocarbon, Aurorae, and Tree Ring Growth * Other Worlds * Super-Classical Quantum Mechanics * A Probabilistic Approach to the Phase Problem of X-Ray Crystallography * A Nonlinear Twist on Inertia Gives Unified Electroweak Gravitation * Neutrino Oscillations * On an Incompleteness in the General-Relativistic Description of Gravitation * All Truth is One * Ideas of Physics: Correspondence between Colleagues * The Influence of the Physics and Philosophy of Einstein's Relativity on My Attitudes in Science: An Autobiography
Szaciłowski, Konrad
2007-01-01
Analogies between photoactive nitric oxide generators and various electronic devices (logic gates and operational amplifiers) are presented. These analogies have important biological consequences: application of control parameters allows for better targeting and control of nitric oxide drugs. The same methodology may be applied in the future to other therapeutic strategies and at the same time helps to understand natural regulatory and signaling processes in biological systems.
The "Is-Ought-Is" Problem of the Objective in Adult Education.
ERIC Educational Resources Information Center
Mattimore-Knudson, Russell S.
1982-01-01
Shows that adult education objectives must be evaluative propositions rather than descriptive propositions; as evaluative propositions they cannot be used as "evidence" to defend the cancellation or repetition of programs. Presents a possible solution to the "is-ought-is" dichotomy as it relates to the use of evaluative…
Some Propositions about Teaching and Learning.
ERIC Educational Resources Information Center
Hunter, Walter E., Comp.; And Others
Various propositions on college teaching and learning, established by the professor and graduate students in a course at the University of Florida, are presented. The importance of both the professional discipline and teaching components is stressed. The propositions are intended for graduate students to use as a resource of basic information…
Proposition 2 1/2: Explaining the Vote.
ERIC Educational Resources Information Center
Ladd, Helen F.; Wilson, Julie Boatright
Researchers examined Massachusetts voters' reactions to Proposition 2 1/2--which severely restricts local governments' ability to raise money for local public services--through a statewide telephone survey of 1,561 household heads in 58 towns. Data were gathered on each respondent's vote on the proposition, sex, age, education, occupation, income,…
NASA Technical Reports Server (NTRS)
Blados, Walter R.; Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.
1990-01-01
This paper formulates and studies two propositions. Proposition 1 states that information that is external to the aerospace organization tends to be used less than internal sources of information; the more geographically removed the information is from the organization, the less likely it is to be used. Proposition 2 states that of the various sociometric variables assumed to influence the use of an information channel or source, perceived accessibility exerts the greatest influence. Preliminary analysis based on surveys supports Proposition 1. This analysis does not support Proposition 2, however. Evidence here indicates that reliability and relevance influence the use of an information source more than the idea of perceived accessibility.
California's tobacco tax initiative: the development and passage of Proposition 99.
Traynor, M P; Glantz, S A
1996-01-01
In this case study, we describe and analyze the development and passage of California's tobacco tax initiative, Proposition 99, the Tobacco Tax and Health Promotion Act of 1988. We gathered information from published reports, public documents, personal correspondence, internal memorandums, polling data, and interviews with representatives from organizations that participated in the Proposition 99 campaign. Proposition 99 passed as a result of the efforts of a coalition of voluntary health agencies, medical organizations, and environmental groups. They organized a long-term effort by conducting essential polling, planning strategies, gaining media exposure, developing a coalition, and running a successful campaign to enact the tax by shifting the venue from legislative to initiative politics. To build the coalition that was needed to pass Proposition 99, public health proponents enlisted the help of medical organizations in exchange for additional revenue to be allocated to medical services. By shifting the venue from the legislature to the general public, advocates capitalized on public concern about tobacco and concern for youth, and took advantage of the tobacco industry's low credibility. The passage of Proposition 99, despite a massive campaign against it by the tobacco industry, represents a milestone in the tobacco control and public health fields. From its passage in 1988 through 1993, tobacco use in California declined by 27 percent, three times faster than the United States average. As a result, Proposition 99 has served as a national model for other states and the federal government. Although allocation of tobacco tax revenues specifically to health education and prevention was a primary goal during the development and passage of Proposition 99, when the venue shifted back to the legislature for implementation, medical organizations successfully advocated illegal diversions of Proposition 99 tobacco control and research funds to medical services.
Organizations seeking to enact Proposition 99-like tobacco tax increases must be prepared to mount aggressive campaigns to pass the initiative in the face of major tobacco industry opposition and then must continue to work to protect the program after passage by voters.
Dynamic Simulation of Human Gait Model With Predictive Capability.
Sun, Jinming; Wu, Shaoli; Voglewede, Philip A
2018-03-01
In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control, rather than exclusively through classical feedback control acting on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model developed includes two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model that has nine degrees-of-freedom (DOF). The plant model is validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compare the predicted output to the reference, and optimize the control input so that the predicted error is minimal. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait. The simulation results show that the developed model is able to simulate kinematic output close to the experimental data.
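The receding-horizon idea attributed here to the CNS can be sketched with a toy linear model. The double-integrator "joint", horizon length, and penalty weight below are invented for illustration and are not the paper's nine-DOF gait model: the controller predicts over a horizon, minimizes the predicted tracking error, and applies only the first input before replanning.

```python
# Minimal unconstrained MPC sketch (assumed toy model, not the paper's).
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discrete double integrator, dt = 0.1
B = np.array([[0.005], [0.1]])
C = np.array([[1.0, 0.0]])               # we track position only
N = 10                                    # prediction horizon
lam = 1e-4                                # input-effort penalty

# Stacked predictions: y_{k+1..k+N} = Phi x_k + Gam u_{k..k+N-1}
Phi = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(1, N + 1)])
Gam = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        Gam[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B).item()

def mpc_step(x, ref):
    """One receding-horizon step: least-squares optimal input sequence,
    of which only the first element is applied."""
    R = np.full((N, 1), ref)
    U = np.linalg.solve(Gam.T @ Gam + lam * np.eye(N), Gam.T @ (R - Phi @ x))
    return U[0, 0]

x = np.zeros((2, 1))
for _ in range(50):                       # closed-loop simulation
    u = mpc_step(x, ref=1.0)
    x = A @ x + B * u
print(round(float(x[0, 0]), 2))           # position settles near the reference
```

The "predict, compare to reference, optimize" loop in the abstract is exactly the `mpc_step` computation; the PD-controlled joints mentioned in the paper would bypass this optimization entirely.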
Quantum Vertex Model for Reversible Classical Computing
NASA Astrophysics Data System (ADS)
Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng
We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic, with one direction corresponding to computational time and with transverse boundaries storing the computation's input and output. The model displays no finite-temperature phase transitions, including no glass transitions, independent of the circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, "annealing with learning", to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.
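Reversible classical computation of the kind the vertex model encodes is built from bijective gates. A minimal sanity check on the standard Toffoli gate (the usual universal reversible gate; the paper's lattice construction is not reproduced here) looks like this:

```python
# The Toffoli (controlled-controlled-NOT) gate: flips the target bit
# iff both control bits are 1. Universality of reversible classical
# computation rests on such bijections of the bit space.
from itertools import product

def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# Bijectivity: all 8 inputs map to 8 distinct outputs.
images = {toffoli(a, b, c) for a, b, c in product([0, 1], repeat=3)}
assert len(images) == 8

# Self-inverse: applying the gate twice recovers the input.
assert toffoli(*toffoli(1, 1, 0)) == (1, 1, 0)
print("reversible:", len(images) == 8)
```

In the vertex model, each such gate becomes a vertex whose ground-state configurations are exactly the rows of this truth table.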
Proposition 187: An Effective Measure To Deter Undocumented Migration to California?
ERIC Educational Resources Information Center
Alarcon, Rafael
In 1994, California voters approved Proposition 187, which prohibits provision of publicly funded education and social services to undocumented immigrants, and which requires public schools to verify the legal status of students and their parents. This paper examines socioeconomic and immigration trends leading to the emergence of Proposition 187,…
Exploring Task- and Student-Related Factors in the Method of Propositional Manipulation (MPM)
ERIC Educational Resources Information Center
Leppink, Jimmie; Broers, Nick J.; Imbos, Tjaart; van der Vleuten, Cees P. M.; Berger, Martijn P. F.
2011-01-01
The method of propositional manipulation (MPM) aims to help students develop conceptual understanding of statistics by guiding them into self-explaining propositions. To explore task- and student-related factors influencing students' ability to learn from MPM, twenty undergraduate students performed six learning tasks while thinking aloud. The…
Memorisation Methods in Science Education: Tactics to Improve the Teaching and Learning Practice
ERIC Educational Resources Information Center
Pals, Frits F. B.; Tolboom, Jos L. J.; Suhre, Cor J. M.; van Geert, Paul L. C.
2018-01-01
How can science teachers support students in developing an appropriate declarative knowledge base for solving problems? This article focuses on the question whether the development of students' memory of scientific propositions is better served by writing propositions down on paper or by making drawings of propositions either by silent or…
Associating versus Proposing or Associating What We Propose: Comment on Gawronski and Bodenhausen
ERIC Educational Resources Information Center
Albarracin, Dolores; Hart, William; McCulloch, Kathleen C.
2006-01-01
This commentary on the article by B. Gawronski and G. V. Bodenhausen (see record 2006-10465-003) highlights the strengths of the associative-propositional evaluation model. It then describes problems in proposing a qualitative separation between propositional and associative processes. Propositional processes are instead described as associative.…
Predictors of Short-Term Treatment Outcomes among California's Proposition 36 Participants
ERIC Educational Resources Information Center
Hser, Yih-Ing; Evans, Elizabeth; Teruya, Cheryl; Huang, David; Anglin, M. Douglas
2007-01-01
California's voter-initiated Proposition 36 offers non-violent drug offenders community-based treatment as an alternative to incarceration or probation without treatment. This article reports short-term treatment outcomes subsequent to this major shift in drug policy. Data are from 1104 individuals randomly selected from all Proposition 36…
The Effects of Proposition 209 on California: Higher Education, Public Employment, and Contracting
ERIC Educational Resources Information Center
Geshekter, Charles L.
2008-01-01
In 1996, Californians overwhelmingly approved Proposition 209, which prohibited all state agencies from discriminating on the basis of race, ethnicity, or gender in university admissions, public employment, or competition for a state contract. Opponents of Proposition 209 predicted dire consequences for California's ethnic minorities and women if…
Has California's Passage of Proposition 227 Made a Difference in the Way We Teach?
ERIC Educational Resources Information Center
Arellano-Houchin, Anna; Flamenco, Claudia; Merlos, Moises M.; Segura, Lorena
2001-01-01
Examined how teachers were impacted by California's Proposition 227, highlighting changes in teaching styles and beliefs about the proposition and its effectiveness. Teachers had to change their teaching strategies to accommodate the new curriculum. They were not sufficiently trained for immediate implementation of English-only education. Teaching…
DOT National Transportation Integrated Search
2008-04-01
The objective of Task 2 is to identify the combination of value propositions that is believed to be achievable by 2030 and collectively hold promise for a sustainable PHEV market by 2030. This deliverable outlines what the project team (with inpu...
Rethinking the Value Proposition to Improve Teaching Effectiveness. Rethinking Teacher Compensation
ERIC Educational Resources Information Center
Shields, Regis Anne; Lewis, Christopher
2012-01-01
All employers, including school districts, enter into a "Value Proposition" with their employees--the complete set of offerings and experiences provided by the employer, compared to other similar opportunities. A successful Value Proposition reflects the needs of both employer and employee, not only attracting and retaining employees with the…
Environment and initial state engineered dynamics of quantum and classical correlations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Cheng-Zhi, E-mail: czczwang@outlook.com; Li, Chun-Xian; Guo, Yu
Based on an open exactly solvable system coupled to an environment with nontrivial spectral density, we connect the features of quantum and classical correlations with some features of the environment, initial states of the system, and the presence of initial system–environment correlations. Some interesting features not revealed before are observed by changing the structure of the environment, the initial states of the system, and the presence of initial system–environment correlations. The main results are as follows. (1) Quantum correlations exhibit temporary freezing and permanent freezing even at high temperature of the environment, for which the necessary and sufficient conditions are given by three propositions. (2) Quantum correlations display a transition from temporary freezing to permanent freezing when the structure of the environment is changed. (3) Quantum correlations can be enhanced all the time, for which the condition is put forward. (4) A one-to-one dependency relationship between all kinds of dynamic behaviors of quantum correlations and the initial states of the system as well as the environment structure is established. (5) In the presence of initial system–environment correlations, quantum correlations under a local environment exhibit a temporary multi-freezing phenomenon, while under a global environment they oscillate, revive, and damp; an explanation for this is given. Highlights: • Various interesting behaviors of quantum and classical correlations are observed in an open exactly solvable model. • The important effects of the bath structure on quantum and classical correlations are revealed. • The one-to-one correspondence between the type of dynamical behavior of quantum discord and the initial state is given. • Quantum correlations are studied in the presence of initial qubits–bath correlations.
The universal numbers. From Biology to Physics.
Marchal, Bruno
2015-12-01
I will explain how mathematicians have discovered the universal numbers, or abstract computers, and I will explain some abstract biology, mainly self-reproduction and embryogenesis. Then I will explain how and why, and in which sense, some of those numbers can dream and why their dreams can glue together and must, when we assume computationalism in cognitive science, generate a phenomenological physics, as part of a larger phenomenological theology (in the sense of the Greek theologians). The title should have been "From Biology to Physics, through the Phenomenological Theology of the Universal Numbers", were that not too long for a title. The theology will consist mainly, as in some (neo)platonist Greek-Indian-Chinese traditions, in the truth about numbers' relative relations, with each other and with themselves. The main difference between Aristotle and Plato is that Aristotle (especially in his common and modern Christian interpretation) makes reality WYSIWYG (what you see is what you get: reality is what we observe and measure, i.e., the natural material physical science), whereas for Plato and the (rational) mystics, what we see might be only the shadow or the border of something else, which might be non-physical (mathematical, arithmetical, theological, …). Since Gödel, we know that Truth, even just Arithmetical Truth, is vastly bigger than what a machine can rationally justify. Yet, with Church's thesis, and the mechanizability of the diagonalizations involved, machines can apprehend this and can justify their limitations, and get some sense of what might be true beyond what they can prove or justify rationally. Indeed, the incompleteness phenomenon introduces a gap between what is provable by some machine and what is true about that machine, and, as Gödel saw already in 1931, the existence of that gap is accessible to the machine itself, once it has enough provability abilities.
Incompleteness separates truth from provability, and machines can justify this in some way. More importantly, incompleteness entails the distinction between many intensional variants of provability. For example, the absence of reflection (beweisbar(⌜A⌝) → A, with beweisbar being Gödel's provability predicate) makes it impossible for the machine's provability to obey the axioms usually taken for a theory of knowledge. The most important consequence of this for the machine's possible phenomenology is that it provides sense, indeed arithmetical sense, to intensional variants of provability, like the logics of provability-and-truth, which at the propositional level can be mirrored by the logic of provable-and-true statements (beweisbar(⌜A⌝) ∧ A). It is incompleteness which makes this logic different from the logic of provability. Other variants, like provable-and-consistent, or provable-and-consistent-and-true, appear in the same way, and inherit the incompleteness splitting, unlike beweisbar(⌜A⌝) ∧ A. I will recall the thought experiments which motivate the use of those intensional variants to associate a knower and an observer in some canonical way with the machines or the numbers. We will in this way get an abstract and phenomenological theology of a machine M through the true logics of their true self-referential abilities (even if not provable, or knowable, by the machine itself), in those different intensional senses. Cognitive science and theoretical physics motivate the study of those logics with the arithmetical interpretation of the atomic sentences restricted to the "verifiable" (Σ1) sentences, which is the way to study the theology of the computationalist machine. This provides a logic of the observable, as expected by the Universal Dovetailer Argument, which will be recalled briefly, and which can lead to a comparison of the machine's logic of physics with the empirical logic of the physicists (like quantum logic). This also leads to a series of open problems.
Copyright © 2015 Elsevier Ltd. All rights reserved.
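For orientation (the abstract does not spell this out), the modal principles governing beweisbar are those of the standard provability logic GL, in which reflection fails; it is exactly this failure that makes "provable-and-true" a genuinely different logic:

```latex
% Provability logic GL, reading \Box A as beweisbar(\ulcorner A \urcorner).
\begin{align*}
&\textbf{(K)}   && \Box(A \to B) \to (\Box A \to \Box B)\\
&\textbf{(L\"ob)} && \Box(\Box A \to A) \to \Box A\\
&\textbf{(Nec)} && \text{from } \vdash A \text{ infer } \vdash \Box A\\
&\textbf{(no reflection)} && \nvdash \Box A \to A,\ \text{so } \Box A \wedge A \text{ obeys different (knowledge-like) axioms.}
\end{align*}
```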
ERIC Educational Resources Information Center
Paredes, Sara Micaela
2000-01-01
Interviews and observations of a first-grade mathematics class to determine the influence of Proposition 227 on limited-English-speaking students found that many initial fears about Proposition 227 were unfounded; student ability to cope was underestimated; teacher knowledge of Spanish and Latino culture was critical; and students' limited English…
California's Proposition 227: Implications and Costs of the Unz Initiative.
ERIC Educational Resources Information Center
Council of the Great City Schools, Washington, DC.
Voters in California will vote June 2, 1998 to decide the fate of Proposition 227, a measure proposed by businessman Ron Unz that would substantially change the way that students who are not proficient in English are taught. If approved by the voters, Proposition 227, the Unz Initiative, would essentially eliminate bilingual education programs in…
ERIC Educational Resources Information Center
Nieuwland, Mante S.; Martin, Andrea E.
2012-01-01
Propositional truth-value can be a defining feature of a sentence's relevance to the unfolding discourse, and establishing propositional truth-value in context can be key to successful interpretation. In the current study, we investigate its role in the comprehension of counterfactual conditionals, which describe imaginary consequences of…
Parallel database search and prime factorization with magnonic holographic memory devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khitun, Alexander
In this work, we describe the capabilities of Magnonic Holographic Memory (MHM) for parallel database search and prime factorization. MHM is a type of holographic device which utilizes spin waves for data transfer and processing. Its operation is based on the correlation between the phases and the amplitudes of the input spin waves and the output inductive voltage. The input of MHM is provided by a phased array of spin wave generating elements, allowing the production of phase patterns of an arbitrary form. The latter makes it possible to code logic states into the phases of propagating waves and exploit wave superposition for parallel data processing. We present the results of numerical modeling illustrating parallel database search and prime factorization. The results of numerical simulations on the database search are in agreement with the available experimental data. The use of classical wave interference may result in a significant speedup over conventional digital logic circuits in special-task data processing (e.g., √n in database search). Potentially, magnonic holographic devices can be implemented as complementary logic units to digital processors. Physical limitations and technological constraints of the spin wave approach are also discussed.
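The phase-coding readout described above can be caricatured in a few lines: each stored entry is a phase pattern, and an interference sum over all elements peaks only when the query phases align with a stored pattern. The three-element "database" below is invented for illustration and stands in for the spin-wave physics, not for the actual device model.

```python
# Toy interference readout: logic states coded as phases; the superposed
# amplitude is maximal when query and stored phases align elementwise.
import cmath, math

def readout(stored, query):
    """Amplitude of the superposed output for one stored phase pattern."""
    total = sum(cmath.exp(1j * (q - s)) for q, s in zip(query, stored))
    return abs(total)

database = [
    [0.0, math.pi, 0.0],          # entry 0
    [math.pi, 0.0, math.pi],      # entry 1
    [0.0, 0.0, math.pi],          # entry 2
]
query = [0.0, 0.0, math.pi]

amps = [readout(entry, query) for entry in database]
best = max(range(len(amps)), key=lambda i: amps[i])
print(best, [round(a, 2) for a in amps])  # prints: 2 [1.0, 1.0, 3.0]
```

All stored patterns are "interrogated" by the same propagating query at once, which is the sense in which the search is parallel; mismatched elements cancel, matched ones add constructively.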
Wu, Li; Ren, Jinsong; Qu, Xiaogang
2014-01-01
Nucleic acids have become a powerful tool in nanotechnology because of their controllable, diverse conformational transitions and adaptable higher-order nanostructure. Using single-stranded DNA probes as the pore caps for various target recognition, here we present an ultrasensitive universal electrochemical detection system based on graphene and mesoporous silica, achieve sensitivity with all of the major classes of analytes, and simultaneously realize DNA logic gate operations. The concept is based on locking the pores and preventing the signal-reporter molecules from escaping via the target-induced conformational change of the tailored DNA caps. The coupling of a 'waking up' gatekeeper with highly specific biochemical recognition is an innovative strategy for the detection of various targets, able to compete with classical methods that need expensive instrumentation and sophisticated experimental operations. The present study introduces a new electrochemical signal amplification concept and also adds a new dimension to the function of graphene-mesoporous material hybrids as multifunctional nanoscale logic devices. More importantly, the development of this approach could spur further advances in important areas, such as point-of-care diagnostics or detection of specific biological contaminations, and holds promise for use in field analysis. PMID:25249622
Earthquake Archaeology: a logical approach?
NASA Astrophysics Data System (ADS)
Stewart, I. S.; Buck, V. A.
2001-12-01
Ancient earthquakes can leave their mark in the mythical and literary accounts of ancient peoples, the stratigraphy of their site histories, and the structural integrity of their constructions. Within this broad cross-disciplinary tramping ground, earthquake geologists have tended to focus on those aspects of the cultural record that are most familiar to them: the physical effects of seismic deformation on ancient constructions. One of the core difficulties with this 'earthquake archaeology' approach is that recent attempts to isolate structural criteria that are diagnostic or strongly suggestive of a seismic origin are undermined by the recognition that signs of ancient seismicity are generally indistinguishable from non-seismic mechanisms (poor construction, adverse geotechnical conditions). We illustrate the difficulties and inconsistencies in currently proposed 'earthquake diagnostic' schemes by reference to two case studies of archaeoseismic damage in central Greece. The first concerns fallen columns at various Classical temple localities in mainland Greece (Nemea, Sounio, Olympia, Bassai) which, on the basis of observed structural criteria, are earthquake-induced but which are alternatively explained by archaeologists as the action of human disturbance. The second re-examines the almost type example of the Kyparissi site in the Atalanti region as a Classical stoa offset across a seismic surface fault, arguing instead for its deformation by ground instability. Finally, in highlighting the inherent ambiguity of archaeoseismic data, we consider the value of a logic-tree approach for qualifying and quantifying our uncertainties for seismic-hazard analysis.
Quantum-like Modeling of Cognition
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2015-09-01
This paper begins with a historical review of the mutual influence of physics and psychology, from Freud's invention of psychic energy, inspired by Boltzmann's thermodynamics, to the enrichment quantum physics gained from psychology through the notion of complementarity (an invention of Niels Bohr, who was inspired by William James); we also consider the resonance of the correspondence between Wolfgang Pauli and Carl Jung in both physics and psychology. Then we turn to the problem of developing mathematical models for the laws of thought, starting with Boolean logic and progressing towards the foundations of classical probability theory. Interestingly, the laws of classical logic and probability are routinely violated not only by quantum statistical phenomena but by cognitive phenomena as well. This is yet another common feature between quantum physics and psychology. In particular, cognitive data can exhibit a kind of probabilistic interference effect. This similarity with quantum physics convinced a multi-disciplinary group of scientists (physicists, psychologists, economists, sociologists) to apply the mathematical apparatus of quantum mechanics to the modeling of cognition. We illustrate this activity by considering a few concrete phenomena: the order and disjunction effects, recognition of ambiguous figures, and categorization-decision making. In Appendix 1 we briefly present the essentials of the theory of contextual probability and a method of representing contextual probabilities by complex probability amplitudes (a solution of the "inverse Born's problem") based on a quantum-like representation algorithm (QLRA).
Selective correlations in finite quantum systems and the Desargues property
NASA Astrophysics Data System (ADS)
Lei, C.; Vourdas, A.
2018-06-01
The Desargues property is well known in the context of projective geometry. An analogous property is presented in the context of both classical and quantum physics. In a classical context, the Desargues property implies that two logical circuits with the same input show selective correlations in their outputs. In general their outputs are uncorrelated, but if the output of one has a particular value, then the output of the other has another particular value. In a quantum context, the Desargues property implies that two experiments, each of which involves two successive projective measurements, have selective correlations. For a particular set of projectors, if in one experiment the second measurement does not change the output of the first measurement, then the same is true in the other experiment.
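The classical notion of a selective correlation is easy to demonstrate with invented circuits (the abstract's own construction is geometric and is not reproduced here): two gates share an input, their outputs are not identical in general, yet one particular output value of the first forces a particular output value of the second.

```python
# Toy "selective correlation": an AND gate and an OR gate fed the same
# two-bit input. The gate choice is illustrative only.
from itertools import product

def circuit_f(a, b):
    return a & b          # AND gate

def circuit_g(a, b):
    return a | b          # OR gate

table = [(a, b, circuit_f(a, b), circuit_g(a, b))
         for a, b in product([0, 1], repeat=2)]
for row in table:
    print(*row)

# Selective correlation: whenever f outputs 1, g outputs 1 as well...
assert all(g == 1 for _, _, f, g in table if f == 1)
# ...yet g = 1 does not determine f, so the outputs are not simply equal.
assert {f for _, _, f, g in table if g == 1} == {0, 1}
```

The correlation is "selective" in precisely the abstract's sense: it binds the outputs only on a particular value, not globally.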
Scalable digital hardware for a trapped ion quantum computer
NASA Astrophysics Data System (ADS)
Mount, Emily; Gaultney, Daniel; Vrijsen, Geert; Adams, Michael; Baek, So-Young; Hudek, Kai; Isabella, Louis; Crain, Stephen; van Rynbach, Andre; Maunz, Peter; Kim, Jungsang
2016-12-01
Many of the challenges of scaling quantum computer hardware lie at the interface between the qubits and the classical control signals used to manipulate them. Modular ion trap quantum computer architectures address scalability by constructing individual quantum processors interconnected via a network of quantum communication channels. Successful operation of such quantum hardware requires a fully programmable classical control system capable of frequency stabilizing the continuous wave lasers necessary for loading, cooling, initialization, and detection of the ion qubits, stabilizing the optical frequency combs used to drive logic gate operations on the ion qubits, providing a large number of analog voltage sources to drive the trap electrodes, and a scheme for maintaining phase coherence among all the controllers that manipulate the qubits. In this work, we describe scalable solutions to these hardware development challenges.
Reproducibility in Psychological Science: When Do Psychological Phenomena Exist?
Iso-Ahola, Seppo E.
2017-01-01
Scientific evidence has recently been used to assert that certain psychological phenomena do not exist. Such claims, however, cannot be made because (1) scientific method itself is seriously limited (i.e., it can never prove a negative); (2) non-existence of phenomena would require a complete absence of both logical (theoretical) and empirical support; even if empirical support is weak, logical and theoretical support can be strong; (3) statistical data are only one piece of evidence and cannot be used to reduce psychological phenomena to statistical phenomena; and (4) psychological phenomena vary across time, situations and persons. The human mind is unreproducible from one situation to another. Psychological phenomena are not particles that can decisively be tested and discovered. Therefore, a declaration that a phenomenon is not real is not only theoretically and empirically unjustified but runs counter to the propositional and provisional nature of scientific knowledge. There are only “temporary winners” and no “final truths” in scientific knowledge. Psychology is a science of subtleties in human affect, cognition and behavior. Its phenomena fluctuate with conditions and may sometimes be difficult to detect and reproduce empirically. When strictly applied, reproducibility is an overstated and even questionable concept in psychological science. Furthermore, statistical measures (e.g., effect size) are poor indicators of the theoretical importance and relevance of phenomena (cf. “deliberate practice” vs. “talent” in expert performance), not to mention whether phenomena are real or unreal. To better understand psychological phenomena, their theoretical and empirical properties should be examined via multiple parameters and criteria. Ten such parameters are suggested. PMID:28626435
The effect of emotion on interpretation and logic in a conditional reasoning task.
Blanchette, Isabelle
2006-07-01
The effect of emotional content on logical reasoning is explored in three experiments. The participants completed a conditional reasoning task (If p, then q) with emotional and neutral contents. In Experiment 1, existing emotional and neutral words were used. The emotional value of initially neutral words was experimentally manipulated in Experiments 1B and 2, using classical conditioning. In all experiments, participants were less likely to provide normatively correct answers when reasoning about emotional stimuli, compared with neutral stimuli. This was true for both negative (Experiments 1B and 2) and positive contents (Experiment 2). The participants' interpretations of the conditional statements were also measured (perceived sufficiency, necessity, causality, and plausibility). The results showed the expected relationship between interpretation and reasoning. However, emotion did not affect interpretation. Emotional and neutral conditional statements were interpreted similarly. The results are discussed in light of current models of emotion and reasoning.
Performance of Quantum Annealers on Hard Scheduling Problems
NASA Astrophysics Data System (ADS)
Pokharel, Bibek; Venturelli, Davide; Rieffel, Eleanor
Quantum annealers have been employed to attack a variety of optimization problems. We compared the performance of the current D-Wave 2X quantum annealer to that of the previous generation D-Wave Two quantum annealer on scheduling-type planning problems. Further, we compared the effect of different anneal times, embeddings of the logical problem, and different settings of the ferromagnetic coupling JF across the logical vertex-model on the performance of the D-Wave 2X quantum annealer. Our results show that at the best settings, the scaling of expected anneal time to solution for the D-Wave 2X is better than that of the D-Wave Two, but still inferior to that of state-of-the-art classical solvers on these problems. We discuss the implications of our results for the design and programming of future quantum annealers. Supported by NASA Ames Research Center.
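The scaling comparison above rests on the standard time-to-solution metric: the number of repeated anneals needed to reach a target success probability, multiplied by the anneal time. A minimal sketch of that metric (the formula is standard in the annealing literature; the example numbers are illustrative, not the paper's data):

```python
import math

def time_to_solution(t_anneal, p_success, target=0.99):
    """Total anneal time needed to observe at least one success with
    probability `target`, given per-run success probability `p_success`."""
    runs = math.log(1 - target) / math.log(1 - p_success)
    return t_anneal * math.ceil(runs)

# e.g. a 20 us anneal that succeeds on half the runs needs 7 repetitions
```

Comparing annealers then reduces to comparing how this quantity grows with problem size at each machine's best parameter settings.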
Hybrid genetic algorithm in the Hopfield network for maximum 2-satisfiability problem
NASA Astrophysics Data System (ADS)
Kasihmuddin, Mohd Shareduwan Mohd; Sathasivam, Saratha; Mansor, Mohd. Asyraf
2017-08-01
Heuristic methods are designed to find optimal solutions more quickly than classical methods, which can be too complex to be practical. In this study, a hybrid approach that combines a Hopfield network and a genetic algorithm for solving the maximum 2-satisfiability problem (MAX-2SAT) is proposed. The Hopfield neural network is used to minimize logical inconsistency in interpretations of logic clauses or programs. The genetic algorithm (GA) pioneered the implementation of methods that exploit the idea of recombination to reproduce better solutions. Simulations with and without the genetic algorithm were carried out using Microsoft Visual Studio 2013 C++ Express. The performance of both search techniques on MAX-2SAT was evaluated based on the global minima ratio, the ratio of satisfied clauses, and computation time. The results obtained from the computer simulations demonstrate the effectiveness and acceleration features of the genetic algorithm for solving MAX-2SAT in the Hopfield network.
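As an illustration of the evolutionary component at work, a genetic algorithm for MAX-2SAT can be sketched in a few lines. This is a generic GA (fitness = number of satisfied clauses, elitist selection, one-point crossover, point mutation), not the authors' Hopfield-based implementation; instance encoding and parameters are illustrative assumptions:

```python
import random

def make_instance(n_vars, n_clauses, rng):
    # Each 2-SAT clause is a pair of literals: +k means variable k is true, -k false.
    return [tuple(rng.choice([v + 1, -(v + 1)])
                  for v in rng.sample(range(n_vars), 2))
            for _ in range(n_clauses)]

def satisfied(clauses, assign):
    # assign[k] holds the truth value of variable k+1.
    def lit(l):
        return assign[abs(l) - 1] if l > 0 else not assign[abs(l) - 1]
    return sum(any(lit(l) for l in c) for c in clauses)

def ga_max2sat(clauses, n_vars, pop=40, gens=100, seed=0):
    rng = random.Random(seed)
    popn = [[rng.random() < 0.5 for _ in range(n_vars)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda a: -satisfied(clauses, a))
        elite = popn[:pop // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop:
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, n_vars)
            child = p1[:cut] + p2[cut:]  # one-point crossover
            i = rng.randrange(n_vars)    # point mutation
            child[i] = not child[i]
            children.append(child)
        popn = elite + children
    return max(popn, key=lambda a: satisfied(clauses, a))
```

In the hybrid of the abstract, the Hopfield network's energy minimization plays the role of the fitness pressure, with the GA accelerating the search over interpretations.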
Patel, Raj B; Ho, Joseph; Ferreyrol, Franck; Ralph, Timothy C; Pryde, Geoff J
2016-03-01
Minimizing the resources required to build logic gates into useful processing circuits is key to realizing quantum computers. Although the salient features of a quantum computer have been shown in proof-of-principle experiments, difficulties in scaling quantum systems have made more complex operations intractable. This is exemplified in the classical Fredkin (controlled-SWAP) gate for which, despite theoretical proposals, no quantum analog has been realized. By adding control to the SWAP unitary, we use photonic qubit logic to demonstrate the first quantum Fredkin gate, which promises many applications in quantum information and measurement. We implement example algorithms and generate the highest-fidelity three-photon Greenberger-Horne-Zeilinger states to date. The technique we use allows one to add a control operation to a black-box unitary, something that is impossible in the standard circuit model. Our experiment represents the first use of this technique to control a two-qubit operation and paves the way for larger controlled circuits to be realized efficiently.
Single-photon non-linear optics with a quantum dot in a waveguide
NASA Astrophysics Data System (ADS)
Javadi, A.; Söllner, I.; Arcari, M.; Hansen, S. Lindskov; Midolo, L.; Mahmoodian, S.; Kiršanskė, G.; Pregnolato, T.; Lee, E. H.; Song, J. D.; Stobbe, S.; Lodahl, P.
2015-10-01
Strong non-linear interactions between photons enable logic operations for both classical and quantum-information technology. Unfortunately, non-linear interactions are usually feeble and therefore all-optical logic gates tend to be inefficient. A quantum emitter deterministically coupled to a propagating mode fundamentally changes the situation, since each photon inevitably interacts with the emitter, and highly correlated many-photon states may be created. Here we show that a single quantum dot in a photonic-crystal waveguide can be used as a giant non-linearity sensitive at the single-photon level. The non-linear response is revealed from the intensity and quantum statistics of the scattered photons, and contains contributions from an entangled photon-photon bound state. The quantum non-linearity will find immediate applications for deterministic Bell-state measurements and single-photon transistors and paves the way to scalable waveguide-based photonic quantum-computing architectures.
Modeling uncertainty in computerized guidelines using fuzzy logic.
Jaulent, M. C.; Joyaux, C.; Colombet, I.; Gillois, P.; Degoulet, P.; Chatellier, G.
2001-01-01
Computerized Clinical Practice Guidelines (CPGs) improve quality of care by assisting physicians in their decision making. A number of problems emerge when patients with close characteristics are given contradictory recommendations. In this article, we propose to use fuzzy logic to model the uncertainty due to the use of thresholds in CPGs. A fuzzy classification procedure has been developed that provides, for each message of the CPG, a strength of recommendation that rates the appropriateness of the recommendation for the patient under consideration. This work is done in the context of a CPG for the diagnosis and management of hypertension, published in 1997 by the French agency ANAES. A population of 82 patients with mild to moderate hypertension was selected, and the results of the classification system were compared to those given by a classical decision tree. Observed agreement is 86.6%, and the variability of recommendations for patients with close characteristics is reduced. PMID:11825196
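The idea of replacing a crisp guideline threshold with a graded membership function can be sketched as follows. The 140/90 mmHg cutoffs and the transition bands are illustrative assumptions for a hypertension-like rule, not the ANAES guideline's actual fuzzy sets:

```python
def ramp_up(x, a, b):
    """Membership rising linearly from 0 at `a` to 1 at `b`."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

# Illustrative fuzzy sets around crisp 140/90 mmHg cutoffs:
def high_systolic(sbp):
    return ramp_up(sbp, 135.0, 145.0)

def high_diastolic(dbp):
    return ramp_up(dbp, 85.0, 95.0)

def treat_recommendation_strength(sbp, dbp):
    # Fuzzy OR (max): the recommendation fires to the degree that
    # either pressure is "high", instead of flipping at a hard threshold.
    return max(high_systolic(sbp), high_diastolic(dbp))
```

Two patients just on either side of a crisp cutoff now receive nearly identical strengths of recommendation rather than contradictory ones.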
Trapped-Ion Quantum Logic with Global Radiation Fields.
Weidt, S; Randall, J; Webster, S C; Lake, K; Webb, A E; Cohen, I; Navickas, T; Lekitsch, B; Retzker, A; Hensinger, W K
2016-11-25
Trapped ions are a promising tool for building a large-scale quantum computer. However, the number of required radiation fields for the realization of quantum gates in any proposed ion-based architecture scales with the number of ions within the quantum computer, posing a major obstacle when imagining a device with millions of ions. Here, we present a fundamentally different approach for trapped-ion quantum computing where this detrimental scaling vanishes. The method is based on individually controlled voltages applied to each logic gate location to facilitate the actual gate operation analogous to a traditional transistor architecture within a classical computer processor. To demonstrate the key principle of this approach we implement a versatile quantum gate method based on long-wavelength radiation and use this method to generate a maximally entangled state of two quantum engineered clock qubits with fidelity 0.985(12). This quantum gate also constitutes a simple-to-implement tool for quantum metrology, sensing, and simulation.
Rossi, Sandrine; Cassotti, Mathieu; Moutier, Sylvain; Delcroix, Nicolas; Houdé, Olivier
2015-01-01
Reasoners make systematic logical errors by giving heuristic responses that reflect deviations from the logical norm. Influential studies have suggested first that our reasoning is often biased because we minimize cognitive effort to surpass a cognitive conflict between heuristic response from system 1 and analytic response from system 2 thinking. Additionally, cognitive control processes might be necessary to inhibit system 1 responses to activate a system 2 response. Previous studies have shown a significant effect of executive learning (EL) on adults who have transferred knowledge acquired on the Wason selection task (WST) to another isomorphic task, the rule falsification task (RFT). The original paradigm consisted of teaching participants to inhibit a classical matching heuristic that sufficed the first problem and led to significant EL transfer on the second problem. Interestingly, the reasoning tasks differed in inhibiting-heuristic metacognitive cost. Success on the WST requires half-suppression of the matching elements. In contrast, the RFT necessitates a global rejection of the matching elements for a correct answer. Therefore, metacognitive learning difficulty most likely differs depending on whether one uses the first or second task during the learning phase. We aimed to investigate this difficulty and various matching-bias inhibition effects in a new (reversed) paradigm. In this case, the transfer effect from the RFT to the WST could be more difficult because the reasoner learns to reject all matching elements in the first task. We observed that the EL leads to a significant reduction in matching selections on the WST without increasing logical performances. Interestingly, the acquired metacognitive knowledge was too “strictly” transferred and discouraged matching rather than encouraging logic. This finding underlines the complexity of learning transfer and adds new evidence to the pedagogy of reasoning. PMID:25849555
ERIC Educational Resources Information Center
Rodda, Albert S.
In fall 1978, Paul Gann, who worked with Howard Jarvis to pass California's Proposition 13 in June 1978, sought to qualify an initiative placing a constitutional limit on state and local government expenditures. This initiative qualified and was approved by voters in November 1979 as Proposition 4. Gann's solicitation set the limitation's base…
Varying Use of Conceptual Metaphors across Levels of Expertise in Thermodynamics
NASA Astrophysics Data System (ADS)
Jeppsson, Fredrik; Haglund, Jesper; Amin, Tamer G.
2015-04-01
Many studies have previously focused on how people with different levels of expertise solve physics problems. In early work, focus was on characterising differences between experts and novices, and a key finding was the central role that propositionally expressed principles and laws play in expert, but not novice, problem-solving. A more recent line of research has focused on characterising continuity between experts and novices at the level of non-propositional knowledge structures and processes such as image-schemas, imagistic simulation and analogical reasoning. This study contributes to an emerging literature addressing the coordination of both propositional and non-propositional knowledge structures and processes in the development of expertise. Specifically, in this paper, we compare problem-solving across two levels of expertise (undergraduate students of chemistry and Ph.D. students in physical chemistry), identifying differences in how conceptual metaphors (CMs) are used (or not) to coordinate propositional and non-propositional knowledge structures in the context of solving problems on entropy. It is hypothesised that the acquisition of expertise involves learning to coordinate the use of CMs to interpret propositional (linguistic and mathematical) knowledge and apply it to specific problem situations. Moreover, we suggest that with increasing expertise, the use of CMs involves a greater degree of subjective engagement with physical entities and processes. Implications for research on learning and instructional practice are discussed. Third contribution to special issue entitled: Conceptual metaphor and embodied cognition in science learning
Nursing intellectual capital theory: testing selected propositions.
Covell, Christine L; Sidani, Souraya
2013-11-01
To test selected propositions of the middle-range theory of nursing intellectual capital. The nursing intellectual capital theory conceptualizes the influence of nursing knowledge on patient and organizational outcomes. The theory proposes that nursing human capital (nurses' knowledge, skills and experience) is related to the quality of patient care and to nurse recruitment and retention on an inpatient care unit. Two factors in the work environment, nurse staffing and employer support for nurse continuing professional development, are proposed to influence the association of nursing human capital with patient and organizational outcomes. A cross-sectional survey design. The study took place in 2008 in six Canadian acute care hospitals. Financial, human resource and risk data were collected from hospital departments and unit managers. Clearly specified empirical indicators quantified the study variables. The propositions of the theory were tested with data from 91 inpatient care units using structural equation modelling. The propositions associated with the nursing human capital concept were supported. The propositions associated with the employer support for nurse continuing professional development concept were not. The proposition that the influence of nurse staffing on patient outcomes is mediated by the nursing human capital of an inpatient unit was partially supported. Some of the theory's propositions were empirically validated. Additional theoretical work is needed to refine the operationalization and measurement of some of the theory's concepts. Further research with larger samples of data from different geographical settings and types of hospitals is required to determine if the theory can withstand empirical scrutiny. © 2013 Blackwell Publishing Ltd.
Quantum Tic-Tac-Toe as Metaphor for Quantum Physics
NASA Astrophysics Data System (ADS)
Goff, Allan; Lehmann, Dale; Siegel, Joel
2004-02-01
Quantum Tic-Tac-Toe is presented as an abstract quantum system derived from the rules of Classical Tic-Tac-Toe. Abstract quantum systems can be constructed from classical systems by the addition of three types of rules; rules of Superposition, rules of Entanglement, and rules of Collapse. This is formally done for Quantum Tic-Tac-Toe. As a part of this construction it is shown that abstract quantum systems can be viewed as an ensemble of classical systems. That is, the state of a quantum game implies a set of simultaneous classical games. The number and evolution of the ensemble of classical games is driven by the superposition, entanglement, and collapse rules. Various aspects and play situations provide excellent metaphors for standard features of quantum mechanics. Several of the more significant metaphors are discussed, including a measurement mechanism, the correspondence principle, Everett's Many Worlds Hypothesis, an ascertainty principle, and spooky action at a distance. Abstract quantum systems also show the consistency of backwards-in-time causality, and the influence on the present of both pasts and futures that never happened. The strongest logical argument against faster-than-light (FTL) phenomena is that since FTL implies backwards-in-time causality, temporal paradox is an unavoidable consequence of FTL; hence FTL is impossible. Since abstract quantum systems support backwards-in-time causality but avoid temporal paradox through pruning of the classical ensemble, it may be that quantum based FTL schemes are possible allowing backwards-in-time causality, but prohibiting temporal paradox.
ERIC Educational Resources Information Center
Feather, Denis
2015-01-01
This paper offers an alternative proposition to that of Lewis on identity and professional identity in higher education (HE). The proposition is provided from the narratives of 26 individual interviewees who deliver HE in college-based higher education, a viewpoint not considered by Lewis, who tends to adopt a more generalist view. Where Lewis…
ERIC Educational Resources Information Center
Krawczyk, Elizabeth A.
2012-01-01
Evidentiality has usually been defined as the grammaticalized expression of a speaker's evidence source for a proposition, where "evidence" is conceptualized as a speaker's source-type for a particular proposition (Aikhenvald 2004). How this evidence source-type and the evidential are related has yet to be formally modeled in…
Proposition 2 1/2: Variations in Individual Preferences and Expectations across Communities.
ERIC Educational Resources Information Center
Ladd, Helen F.; Wilson, Julie Boatright
This paper uses data from a large statewide survey of Massachusetts residents to measure support for Proposition 2 1/2. Proposition 2 1/2 required high tax rate communities to reduce property tax levies 15 percent per year until the tax rate is reduced to the maximum allowable rate of 2 1/2 percent of full and fair market value. Specifically, this…
NASA Astrophysics Data System (ADS)
Tsai, F. T.; Elshall, A. S.; Hanor, J. S.
2012-12-01
Subsurface modeling is challenging because of many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between synthetic mental principles such as mathematical expressions on one hand, and empirical observation such as observation data on the other hand when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied the HBMA to a study of hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through the indicator hydrostratigraphy method and used lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainties. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two geological structures with respect to the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models.
The results showed that by segregating different sources of uncertainty, HBMA analysis provided insights on uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide weighted representation of the competing propositions for each uncertain model component based on the background knowledge, the HBMA functions as an epistemic framework for advancing knowledge about the system under study.
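The weighted representation of competing propositions can be sketched with BIC-based model probabilities propagated through a two-level hierarchy. The tree below (fault interpretation at the top level, variogram model at the leaf level) and its BIC values are invented for illustration, not the study's calibration results:

```python
import math

def bic_weights(bics):
    """Posterior model probabilities from BIC scores (lower BIC = better)."""
    best = min(bics)
    w = [math.exp(-0.5 * (b - best)) for b in bics]
    s = sum(w)
    return [x / s for x in w]

def hbma_weights(tree):
    """Two-level HBMA sketch: weight each branch by its best leaf BIC, then
    weight leaves within each branch; a leaf's total weight is the product
    of the branch weight and its conditional weight."""
    names = list(tree)
    branch_scores = [min(tree[n].values()) for n in names]
    bw = dict(zip(names, bic_weights(branch_scores)))
    out = {}
    for name, leaves in tree.items():
        lw = bic_weights(list(leaves.values()))
        for leaf, wt in zip(leaves, lw):
            out[(name, leaf)] = bw[name] * wt
    return out
```

Segregating the hierarchy this way is what lets one read off which source of uncertainty (here, the fault interpretation versus the variogram choice) dominates the spread of predictions.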
A beginner's guide to belief revision and truth maintenance systems
NASA Technical Reports Server (NTRS)
Mason, Cindy L.
1992-01-01
This brief note is intended to familiarize the non-TMS audience with some of the basic ideas surrounding classic TMS's (truth maintenance systems), namely the justification-based TMS and the assumption-based TMS. Topics of further interest include the relation between non-monotonic logics and TMS's, efficiency and search issues, complexity concerns, as well as the variety of TMS systems that have surfaced in the past decade or so. These include probabilistic-based TMS systems, fuzzy TMS systems, tri-valued belief systems, and so on.
The nature of crime : Is cheating necessary for cooperation?
Machalek, R; Cohen, L E
1991-09-01
The classical social theorist Emile Durkheim proposed the counterintuitive thesis that crime is beneficial for society because it provokes punishment, which enhances social solidarity. His logic, however, is blemished by a reified view of society that leads to group-selectionist thinking and a teleological account of the causes of crime. Reconceptualization of the relationship between crime and punishment in terms of evolutionary game theory, however, suggests that crime (cheating) may confer benefits on cooperating individuals by promoting stability in their patterns of cooperation.
Formal verification of mathematical software
NASA Technical Reports Server (NTRS)
Sutherland, D.
1984-01-01
Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.
Workforce Professionalism in Drug Treatment Services: Impact of California’s Proposition 36
Wu, Fei; Hser, Yih-Ing
2011-01-01
This article investigates whether California’s Proposition 36 has promoted the workforce professionalism of drug treatment services during its first five years of implementation. Program surveys inquiring about organizational information, Proposition 36 implementation, and staffing were conducted in 2003 and 2005 among all treatment providers serving Proposition 36 clients in five selected California counties (San Diego, Riverside, Kern, Sacramento, and San Francisco). A one-hour self-administered questionnaire was completed by 118 treatment providers representing 102 programs. This article examines five topics that are relevant to drug treatment workforce professionalism: resources and capability, standardized intake assessment and outcome evaluation, staff qualification, program accreditation, and information technology. Results suggest that Proposition 36 had a positive influence on the drug treatment workforce’s professionalism. Improvements have been observed in program resources, client intake assessment and outcome evaluation databases, staff professionalization, program accreditation, and information technology system. However, some areas remain problematic, including, for example, the consistent lack of adequate resources serving women with children. PMID:21036513
Reconstructing Root. An argument for objectivity.
Mathieson, I
2001-10-01
Substantial evidence suggests that complete rejection of the Root model may be premature, given the inherent logic in an Aristotelian interpretation of its core philosophy. Although critics have focused on a Platonic interpretation of Root's criteria for normalcy and surrounding theories, recent theoretic shifts towards a more flexible view of the factors which can combine to produce pathologic conditions suggest that the Root model retains usefulness. Although it has been suggested that Kuhn's approach may contain a destructive element, one of its propositions--that a phase of normal science is characterized by a common vision of the research required within the paradigm--seems to hold the key to the future success of podiatric biomechanics. The approach of Lakatos seems to provide the required "modicum of self-confidence which enables us to live and practice" to smooth the transition between established and emergent approaches. Although the approaches of Kuhn and Lakatos remain incommensurable, it is certain that Kuhn would agree with one particularly relevant comment by Lakatos, that "blind commitment to a theory is not an intellectual virtue: it is an intellectual crime."
Ethics, economics, and public financing of health care.
Hurley, J
2001-08-01
There is a wide variety of ethical arguments for public financing of health care that share a common structure built on a series of four logically related propositions regarding: (1) the ultimate purpose of a human life or human society; (2) the role of health and its distribution in society in advancing this ultimate purpose; (3) the role of access to or utilisation of health care in maintaining or improving the desired level and distribution of health among members of society, and (4) the role of public financing in ensuring the ethically justified access to and utilisation of health care by members of society. This paper argues that economics has much to contribute to the development of the ethical foundations for publicly financed health care. It focuses in particular on recent economic work to clarify the concepts of access and need and their role in analyses of the just distribution of health care resources, and on the importance of economic analysis of health care and health care insurance markets in demonstrating why public financing is necessary to achieve broad access to and utilisation of health care services.
The neural basis of conditional reasoning with arbitrary content.
Noveck, Ira A; Goel, Vinod; Smith, Kathleen W
2004-01-01
Behavioral predictions about reasoning have usually contrasted two accounts, Mental Logic and Mental Models. Neuroimaging techniques have been providing new measures that transcend this debate. We tested a hypothesis from Goel and Dolan (2003) that predicts neural activity predominantly in a left parietal-frontal system when participants reason with arbitrary (non-meaningful) materials. In an event-related fMRI investigation, we employed propositional syllogisms, the majority of which involved conditional reasoning. While investigating conditional reasoning generally, we ultimately focused on the neural activity linked to the two valid conditional forms--Modus Ponens (If p then q; p//q) and Modus Tollens (If p then q; not-q//not-p). Consistent with Goel and Dolan (2003), we found a left lateralized parietal-frontal network for both inference forms with increasing activation when reasoning becomes more challenging by way of Modus Tollens. These findings show that the previous findings with more complex Aristotelian syllogisms are robust and cast doubt upon accounts of reasoning that accord primary inferential processes uniquely to either the right hemisphere or to language areas.
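The two valid conditional forms used as stimuli, and their contrast with an invalid form, can be checked mechanically by enumerating truth assignments; a small sketch:

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p and not q.
    return (not p) or q

def valid(premises, conclusion):
    """An argument form is valid iff every truth assignment satisfying all
    premises also satisfies the conclusion."""
    return all(conclusion(p, q)
               for p, q in product([False, True], repeat=2)
               if all(prem(p, q) for prem in premises))

# Modus Ponens: If p then q; p // q
modus_ponens = valid([lambda p, q: implies(p, q), lambda p, q: p],
                     lambda p, q: q)
# Modus Tollens: If p then q; not-q // not-p
modus_tollens = valid([lambda p, q: implies(p, q), lambda p, q: not q],
                      lambda p, q: not p)
# Affirming the consequent: If p then q; q // p  (a classic fallacy)
affirm_consequent = valid([lambda p, q: implies(p, q), lambda p, q: q],
                          lambda p, q: p)
```

Both valid forms pass the check while the fallacy fails, even though Modus Tollens is behaviorally harder, which is precisely the dissociation between logical status and cognitive demand the imaging data speak to.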
Howes-Mischel, Rebecca
2016-06-01
This article examines how amplified fetal heartbeats may be used to make claims about fetuses' social presence. These claims are supported by the Mexican Public Health system's selection of the maternal-child relationship as a key site of clinical intervention, intertwining medical and moral discourses. Drawing on the robust literature on cross-cultural propositions of "fetal personhood," this analysis uses ethnographic material from public health institutions in Oaxaca, Mexico, to explore how doctors use diagnostic technology to materialize fetuses for their patients. I argue that Spanish's epistemological distinction between saber (to have knowledge about) and conocer (to be acquainted with) is key to how diagnostic technologies may be deployed to make social claims. I use one doctor's attempts to use technology to shift her patient from saber to conocer as illustrative of underlying cultural logics about fetal embodiment and its proof. Focused on the under-theorized socio-medical deployment of audio fetal heartbeat technology, this article suggests that sound-in addition to sight-is a potent tool for constructing fetal personhood. © 2016 by the American Anthropological Association.
ERIC Educational Resources Information Center
Friedman, Mark
As part of a series of reports designed to support the implementation of Proposition 10: The California Children and Families Act and to provide comprehensive and authoritative information on critical issues concerning young children and families in California, this report addresses how Proposition 10 Commissions can organize their work and their…
Recapitalization and Acquisition of Light Tactical Wheeled Vehicles (REDACTED)
2010-01-29
A representative from Red River Army Depot in Texarkana, Texas, stated that recapitalizing current HMMWVs to the XM1166 model was an excellent proposition.
Practical issues in quantum-key-distribution postprocessing
NASA Astrophysics Data System (ADS)
Fung, Chi-Hang Fred; Ma, Xiongfeng; Chau, H. F.
2010-01-01
Quantum key distribution (QKD) is a secure key generation method between two distant parties by wisely exploiting properties of quantum mechanics. In QKD, experimental measurement outcomes on quantum states are transformed by the two parties to a secret key. This transformation is composed of many logical steps (as guided by security proofs), which together will ultimately determine the length of the final secret key and its security. We detail the procedure for performing such classical postprocessing taking into account practical concerns (including the finite-size effect and authentication and encryption for classical communications). This procedure is directly applicable to realistic QKD experiments and thus serves as a recipe that specifies what postprocessing operations are needed and what the security level is for certain lengths of the keys. Our result is applicable to the BB84 protocol with a single or entangled photon source.
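The postprocessing chain determines how many secret bits survive error correction and privacy amplification; a back-of-the-envelope key-length sketch for BB84 (the error-correction efficiency of 1.16 and the flat finite-size penalty are illustrative assumptions, not the paper's security analysis):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def key_length(n_sifted, qber, f_ec=1.16, eps_penalty=100):
    """Rough BB84 secret-key length: privacy amplification removes h2(qber)
    bits per sifted bit, error correction leaks about f_ec * h2(qber), and a
    small fixed penalty stands in for finite-size and authentication costs."""
    rate = 1.0 - h2(qber) - f_ec * h2(qber)
    return max(0, math.floor(n_sifted * rate - eps_penalty))
```

The qualitative behavior matches the abstract's framing: the logical steps of postprocessing jointly fix both the final key length and the error rate beyond which no secure key remains.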
The changing features of the body-mind problem.
Agassi, Joseph
2007-01-01
The body-mind problem invites scientific study, since mental events are repeated and repeatable and invite testable explanations. They seemed troublesome because of the classical theory of substance that failed to solve its own central problems. These are soluble with the aid of the theory of the laws of nature, particularly in its emergentist version [Bunge, M., 1980. The Body-mind Problem, Pergamon, Oxford] that invites refutable explanations [Popper, K.R., 1959. The Logic of Scientific Discovery, Hutchinson, London]. The view of mental properties as emergent is a modification of the two chief classical views, materialism and dualism. As this view invites testable explanations of events of the inner world, it is better than the quasi-behaviorist view of self-awareness as computer-style self-monitoring [Minsky, M., Laske, O., 1992. A conversation with Marvin Minsky. AI Magazine 13 (3), 31-45].
Simple dissociations for a higher-powered neuropsychology.
McIntosh, Robert D
2018-06-01
Dissociations in cognitive neuropsychology are often investigated at the level of the single-case, and formal criteria exist for the detection of dissociations, and their sub-classification into 'classical' and 'strong' types. These criteria require a patient to show a frank deficit on one task (for a classical dissociation) or both tasks (for a strong dissociation), and a significantly extreme difference between tasks. I propose that only the significant between-task difference is logically necessary, and that if this simple criterion is met, the patient should be said to show a dissociation. Using Monte Carlo simulations, I show that this simplification increases the power to detect dissociations across a range of practically-relevant conditions, whilst retaining excellent control over Type I error. Additional testing for frank deficits on each task provides further qualifying information, but using these test outcomes to categorise dissociations as classical or strong may be too uncertain to guide theoretical inferences reliably. I suggest that we might instead characterise the strength of the dissociation using a continuous index, such as the effect size of the between-task difference. Copyright © 2018 Elsevier Ltd. All rights reserved.
Kendon, Vivien M; Nemoto, Kae; Munro, William J
2010-08-13
We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
Using Correlation to Compute Better Probability Estimates in Plan Graphs
NASA Technical Reports Server (NTRS)
Bryce, Daniel; Smith, David E.
2006-01-01
Plan graphs are commonly used in planning to help compute heuristic "distance" estimates between states and goals. A few authors have also attempted to use plan graphs in probabilistic planning to compute estimates of the probability that propositions can be achieved and actions can be performed. This is done by propagating probability information forward through the plan graph from the initial conditions through each possible action to the action effects, and hence to the propositions at the next layer of the plan graph. The problem with these calculations is that they make very strong independence assumptions - in particular, they usually assume that the preconditions for each action are independent of each other. This can lead to gross overestimates in probability when the plans for those preconditions interfere with each other. It can also lead to gross underestimates of probability when there is synergy between the plans for two or more preconditions. In this paper we introduce a notion of the binary correlation between two propositions and actions within a plan graph, show how to propagate this information within a plan graph, and show how this improves probability estimates for planning. This notion of correlation can be thought of as a continuous generalization of the notion of mutual exclusion (mutex) often used in plan graphs. At one extreme (correlation = 0), two propositions or actions are completely mutex. With correlation = 1, two propositions or actions are independent, and with correlation > 1, two propositions or actions are synergistic. Intermediate values can and do occur, indicating different degrees to which propositions and actions interfere or are synergistic. We compare this approach with another recent approach by Bryce that computes probability estimates using Monte Carlo simulation of possible worlds in plan graphs.
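The correlation model summarized above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation; the function names and the simple pairwise chaining of corrections are assumptions made here for clarity.

```python
# Illustrative sketch of the binary-correlation idea (hypothetical names,
# not the authors' code). corr = 0 reproduces mutex, corr = 1 independence,
# corr > 1 synergy.

def joint_prob(p_a, p_b, corr):
    """P(A and B) under the correlation model, clipped to a valid probability."""
    return min(max(corr * p_a * p_b, 0.0), min(p_a, p_b))

def action_prob(precond_probs, pairwise_corr):
    """Estimate P(all preconditions hold) by folding in one precondition at a
    time with its pairwise correlation, rather than a naive independence product."""
    p = precond_probs[0]
    for i in range(1, len(precond_probs)):
        p = joint_prob(p, precond_probs[i], pairwise_corr[i - 1])
    return p
```

With two mutex preconditions (correlation 0) the action's estimated probability collapses to zero, while correlation 1 recovers the ordinary independence product.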
Perspective: Memcomputing: Leveraging memory and physics to compute efficiently
NASA Astrophysics Data System (ADS)
Di Ventra, Massimiliano; Traversa, Fabio L.
2018-05-01
It is well known that physical phenomena may be of great help in computing some difficult problems efficiently. A typical example is prime factorization that may be solved in polynomial time by exploiting quantum entanglement on a quantum computer. There are, however, other types of (non-quantum) physical properties that one may leverage to compute efficiently a wide range of hard problems. In this perspective, we discuss how to employ one such property, memory (time non-locality), in a novel physics-based approach to computation: Memcomputing. In particular, we focus on digital memcomputing machines (DMMs) that are scalable. DMMs can be realized with non-linear dynamical systems with memory. The latter property allows the realization of a new type of Boolean logic, one that is self-organizing. Self-organizing logic gates are "terminal-agnostic," namely, they do not distinguish between the input and output terminals. When appropriately assembled to represent a given combinatorial/optimization problem, the corresponding self-organizing circuit converges to the equilibrium points that express the solutions of the problem at hand. In doing so, DMMs take advantage of the long-range order that develops during the transient dynamics. This collective dynamical behavior, reminiscent of a phase transition, or even the "edge of chaos," is mediated by families of classical trajectories (instantons) that connect critical points of increasing stability in the system's phase space. The topological character of the solution search renders DMMs robust against noise and structural disorder. Since DMMs are non-quantum systems described by ordinary differential equations, not only can they be built in hardware with the available technology, they can also be simulated efficiently on modern classical computers. 
As an example, we will show the polynomial-time solution of the subset-sum problem for the worst cases, and point to other types of hard problems where simulations of DMMs' equations of motion on classical computers have already demonstrated substantial advantages over traditional approaches. We conclude this article by outlining further directions of study.
A Global Perspective on Religious Participation and Suicide.
Hsieh, Ning
2017-09-01
Although sociological research in the Durkheimian tradition has generally accepted that religious involvement protects against suicide, few studies have examined this theoretical proposition outside Western industrialized settings. Using multilevel models to analyze data from the World Health Organization Mortality Database and the World Values Survey (1981-2007) across 42 countries in seven geographical-cultural regions, this study explores whether religious participation is more protective against suicide in some regions than others and, if so, why. Results indicate that while religious participation is protective in Latin America, eastern Europe, northern Europe, and English-speaking countries, it may aggravate the risk of suicide in East Asia, western Europe, and southern Europe. This regional variation is the result of differences in both the degree of integration/regulation of religious communities and suicide underreporting. Overall, the findings support the network perspective of Durkheim's classical theory and suggest that researchers should be more cautious about suicide underreporting in less industrialized settings.
Dirac strings and magnetic monopoles in the spin ice Dy2Ti2O7.
Morris, D J P; Tennant, D A; Grigera, S A; Klemke, B; Castelnovo, C; Moessner, R; Czternasty, C; Meissner, M; Rule, K C; Hoffmann, J-U; Kiefer, K; Gerischer, S; Slobinsky, D; Perry, R S
2009-10-16
Sources of magnetic fields-magnetic monopoles-have so far proven elusive as elementary particles. Condensed-matter physicists have recently proposed several scenarios of emergent quasiparticles resembling monopoles. A particularly simple proposition pertains to spin ice on the highly frustrated pyrochlore lattice. The spin-ice state is argued to be well described by networks of aligned dipoles resembling solenoidal tubes-classical, and observable, versions of a Dirac string. Where these tubes end, the resulting defects look like magnetic monopoles. We demonstrated, by diffuse neutron scattering, the presence of such strings in the spin ice dysprosium titanate (Dy2Ti2O7). This is achieved by applying a symmetry-breaking magnetic field with which we can manipulate the density and orientation of the strings. In turn, heat capacity is described by a gas of magnetic monopoles interacting via a magnetic Coulomb interaction.
How the intentions of the draftsman shape perception of a drawing.
Pignocchi, Alessandro
2010-12-01
The interaction between the recovery of the artist's intentions and the perception of an artwork is a classic topic for philosophy and history of art. It also frequently, albeit sometimes implicitly, comes up in everyday thought and conversation about art and artworks. Since recent work in cognitive science can help us understand how we perceive and understand the intentions of others, this discipline could fruitfully participate in a multidisciplinary investigation of the role of intention recovery in art perception. The method I propose is to look for cases where recovery of the artist's intentions interacts with perception of a work of art in ways that cannot be explained by a simple top-down influence of conscious propositional knowledge on perception. I will focus on drawing and show that recovery of the draftsman's intentional actions is handled by a psychological process shaped by the motor system of the observer. Copyright © 2010 Elsevier Inc. All rights reserved.
Political and Legal Responses to Proposition 13 in California
1980-01-01
...governments. SUMMARY: The passage of Proposition 13, the Jarvis-Gann initiative, by a 2-to-1 margin was heralded by some as the cutting...experience with the passage of Proposition 13 in 1978 and the Gann initiative in 1979, it is likely that this ballot measure will pass. Then local...INTRODUCTION: June 6, 1978, marked the start of what some saw as a national tax revolt. By a 2-to-1 margin
NASA Astrophysics Data System (ADS)
Neuville, R.; Pouliot, J.; Poux, F.; Hallot, P.; De Rudder, L.; Billen, R.
2017-10-01
This paper deals with the establishment of a comprehensive methodological framework that defines 3D visualisation rules and its application in a decision support tool. Whilst the use of 3D models grows in many application fields, their visualisation remains challenging from the point of view of the mapping and rendering aspects to be applied to suitably support the decision-making process. Indeed, there exists a great number of 3D visualisation techniques but, as far as we know, a decision support tool that facilitates the production of an efficient 3D visualisation is still missing. This is why a comprehensive methodological framework is proposed in order to build decision tables for specific data, tasks and contexts. Based on the second-order logic formalism, we define a set of functions and propositions among and between two collections of entities: on the one hand, static retinal variables (hue, size, shape…) and 3D environment parameters (directional lighting, shadow, haze…), and on the other hand, their effect(s) regarding specific visual tasks. This enables defining 3D visualisation rules according to four categories: consequence, compatibility, potential incompatibility and incompatibility. In this paper, the application of the methodological framework is demonstrated for an urban visualisation at high density considering a specific set of entities. On the basis of our analysis and the results of many studies conducted in 3D semiotics, which refers to the study of symbols and how they relay information, the truth values of propositions are determined. 3D visualisation rules are then extracted for the considered context and set of entities and are presented in a decision table with a colour coding. Finally, the decision table is implemented into a plugin developed with three.js, a cross-browser JavaScript library.
The plugin consists of a sidebar and warning windows that help the designer in the use of a set of static retinal variables and 3D environment parameters.
NASA Astrophysics Data System (ADS)
Sun, Xiaoqiang; Cai, Yingfeng; Wang, Shaohua; Liu, Yanling; Chen, Long
2016-01-01
The control problems associated with vehicle height adjustment of electronically controlled air suspension (ECAS) still pose theoretical challenges for researchers, which manifest themselves in the publications on this subject over the last years. This paper deals with modeling and control of a vehicle height adjustment system for ECAS, which is an example of a hybrid dynamical system due to the coexistence and coupling of continuous variables and discrete events. A mixed logical dynamical (MLD) modeling approach is chosen for capturing enough details of the vehicle height adjustment process. The hybrid dynamic model is constructed on the basis of some assumptions and a piecewise linear approximation of component nonlinearities. Then, the on-off statuses of the solenoid valves and the piecewise approximation process are described by propositional logic, and the hybrid system is transformed into a set of linear mixed-integer equalities and inequalities, denoted as the MLD model, automatically by HYSDEL. Using this model, a hybrid model predictive controller (HMPC) is tuned based on online mixed-integer quadratic optimization (MIQP). Two different scenarios are considered in the simulation, whose results verify the height adjustment effectiveness of the proposed approach. Explicit solutions of the controller are computed to control the vehicle height adjustment system in real time using an offline multi-parametric programming technology (MPT), thus converting the controller into an equivalent explicit piecewise affine form. Finally, bench experiments for vehicle height lifting, holding and lowering procedures are conducted, which demonstrate that the HMPC can adjust the vehicle height by controlling the on-off statuses of the solenoid valves directly. This research proposes a new modeling and control method for vehicle height adjustment of ECAS, which leads to a closed-loop system with favorable dynamical properties.
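One building block of the MLD machinery mentioned above can be made concrete. The sketch below is a hedged illustration (not the paper's HYSDEL output): the standard big-M encoding that replaces the product z = δ·x between a binary mode variable δ (e.g. a solenoid valve's on/off status) and a bounded continuous variable x with four linear inequalities, which is how such logic becomes a mixed-integer model.

```python
# Hedged sketch of one standard MLD building block (names are illustrative).
# For delta in {0, 1} and x in [x_min, x_max], the four inequalities below
# hold if and only if z == delta * x, turning a nonlinear product into
# mixed-integer linear constraints.

def mld_product_feasible(z, delta, x, x_min=-10.0, x_max=10.0, tol=1e-9):
    """True iff (z, delta, x) satisfies the four big-M inequalities for z == delta * x."""
    return (z <= x_max * delta + tol
            and z >= x_min * delta - tol
            and z <= x - x_min * (1 - delta) + tol
            and z >= x - x_max * (1 - delta) - tol)
```

With δ = 1 the last two inequalities pin z to x; with δ = 0 the first two pin z to zero, so an MIQP solver can reason about the valve logic with purely linear constraints.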
Beginning Typewriting: A Fifty-Fifty Proposition
ERIC Educational Resources Information Center
Ivarie, Ted
1976-01-01
Beginning typewriting should be a 50-50 proposition with equal time devoted to machine operation and skill development and to language arts instruction in elementary and secondary education. (Author/LH)
Outcomes in a Sample of Opioid-Dependent Clients Treated Under California's Proposition 36.
Chun, Jongserl; Guydish, Joseph R; Sorensen, James L; Haug, Nancy A; Andrews, Siara; Nelson, Larry
2007-07-01
This study evaluated treatment outcomes for the reduction of criminal justice involvement and substance use among opioid dependent clients in a therapeutic community setting under California's Proposition 36. We compared treatment outcomes between those mandated to treatment under Proposition 36 (n = 24) and those on probation but not involved in Proposition 36 (n = 61) over 12 months. Over time, both groups showed significant improvement on drug use and employment measures, were more likely to be involved in job training and less likely to be engaged in work activity, and had similar retention in treatment. There was no evidence that treatment outcomes were different between the two groups. These findings may be helpful in guiding policy makers and clinicians in states where similar initiatives are under consideration.
From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''
NASA Astrophysics Data System (ADS)
Bergeron, H.
2001-09-01
Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L²-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both logical and analytical levels). To obtain this unified framework, we split quantum theory into two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L²(Rⁿ), the Heisenberg rule [p_i, q_j] = -iℏδ_ij with p = -iℏ∇, the free Hamiltonian H = -ℏ²Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate physical ideas and equations of ordinary classical statistical mechanics. So, the question of a "true quantization" with "ℏ" must be seen as an independent physical problem not directly related to the quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). 
Moreover spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].
Entangled Parametric Hierarchies: Problems for an Overspecified Universal Grammar
Boeckx, Cedric; Leivada, Evelina
2013-01-01
This study addresses the feasibility of the classical notion of parameter in linguistic theory from the perspective of parametric hierarchies. A novel program-based analysis is implemented in order to show certain empirical problems related to these hierarchies. The program was developed on the basis of an enriched data base spanning 23 contemporary and 5 ancient languages. The empirical issues uncovered cast doubt on classical parametric models of language acquisition as well as on the conceptualization of an overspecified Universal Grammar that has parameters among its primitives. Pinpointing these issues leads to the proposal that (i) the (bio)logical problem of language acquisition does not amount to a process of triggering innately pre-wired values of parameters and (ii) it paves the way for viewing language, epigenetic (‘parametric’) variation as an externalization-related epiphenomenon, whose learning component may be more important than what sometimes is assumed. PMID:24019867
Experimental realization of a one-way quantum computer algorithm solving Simon's problem.
Tame, M S; Bell, B A; Di Franco, C; Wadsworth, W J; Rarity, J G
2014-11-14
We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's problem-a black-box period-finding problem that has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.
Patel, Raj B.; Ho, Joseph; Ferreyrol, Franck; Ralph, Timothy C.; Pryde, Geoff J.
2016-01-01
Minimizing the resources required to build logic gates into useful processing circuits is key to realizing quantum computers. Although the salient features of a quantum computer have been shown in proof-of-principle experiments, difficulties in scaling quantum systems have made more complex operations intractable. This is exemplified in the classical Fredkin (controlled-SWAP) gate for which, despite theoretical proposals, no quantum analog has been realized. By adding control to the SWAP unitary, we use photonic qubit logic to demonstrate the first quantum Fredkin gate, which promises many applications in quantum information and measurement. We implement example algorithms and generate the highest-fidelity three-photon Greenberger-Horne-Zeilinger states to date. The technique we use allows one to add a control operation to a black-box unitary, something that is impossible in the standard circuit model. Our experiment represents the first use of this technique to control a two-qubit operation and paves the way for larger controlled circuits to be realized efficiently. PMID:27051868
NASA Technical Reports Server (NTRS)
Passman, Stephen L.
1989-01-01
Generally, two types of theory are used to describe the field equations for suspensions. The so-called postulated equations are based on the kinetic theory of mixtures, which logically should give reasonable equations for solutions. The basis for the use of such theory for suspensions is tenuous, though it at least gives a logical path for mathematical arguments. It has the disadvantage that it leads to a system of equations which is underdetermined, in a sense that can be made precise. On the other hand, the so-called averaging theory starts with a determined system, but the very process of averaging renders the resulting system underdetermined. A third type of theory is proposed in which the kinetic theory of gases is used to motivate continuum equations for the suspended particles. This entails an interpretation of the stress in the particles that is different from the usual one. Classical theory is used to describe the motion of the suspending medium. The result is a determined system for a dilute suspension. Extension of the theory to more concentrated systems is discussed.
The perils of the imperfect expectation of the perfect baby.
Chervenak, Frank A; McCullough, Laurence B; Brent, Robert L
2010-08-01
Advances in modern medicine invite the assumption that medicine can control human biology. There is a perilous logic that leads from expectations of medicine's control over reproductive biology to the expectation of having a perfect baby. This article proposes that obstetricians should take a preventive ethics approach to the care of pregnant women with expectations for a perfect baby. We use Nathaniel Hawthorne's classic short story, "The Birthmark," to illustrate the perils of the logic of control and perfection through science and then identify possible contemporary sources of the expectation of the perfect baby. We propose that the informed consent process should be used as a preventive ethics tool throughout the course of pregnancy to educate pregnant women about the inherent errors of human reproduction, the highly variable clinical outcomes of these errors, the limited capacity of medicine to detect these errors, and the even more limited capacity to correct them. Copyright (c) 2010 Mosby, Inc. All rights reserved.
Equivalence principle for quantum systems: dephasing and phase shift of free-falling particles
NASA Astrophysics Data System (ADS)
Anastopoulos, C.; Hu, B. L.
2018-02-01
We ask the question of how the (weak) equivalence principle established in classical gravitational physics should be reformulated and interpreted for massive quantum objects that may also have internal degrees of freedom (dof). This inquiry is necessary because even elementary concepts like a classical trajectory are not well defined in quantum physics—trajectories originating from quantum histories become viable entities only under stringent decoherence conditions. From this investigation we posit two logically and operationally distinct statements of the equivalence principle for quantum systems. Version A: the probability distribution of position for a free-falling particle is the same as the probability distribution of a free particle, modulo a mass-independent shift of its mean. Version B: any two particles with the same velocity wave-function behave identically in free fall, irrespective of their masses. Both statements apply to all quantum states, including those without a classical correspondence, and also for composite particles with quantum internal dof. We also investigate the consequences of the interaction between internal and external dof induced by free fall. For a class of initial states, we find dephasing occurs for the translational dof, namely, the suppression of the off-diagonal terms of the density matrix, in the position basis. We also find a gravitational phase shift in the reduced density matrix of the internal dof that does not depend on the particle’s mass. For classical states, the phase shift has a natural classical interpretation in terms of gravitational red-shift and special relativistic time-dilation.
A technology mapping based on graph of excitations and outputs for finite state machines
NASA Astrophysics Data System (ADS)
Kania, Dariusz; Kulisz, Józef
2017-11-01
A new, efficient technology mapping method for FSMs, dedicated to PAL-based PLDs, is proposed. The essence of the method consists in searching for the minimal set of PAL-based logic blocks that cover a set of multiple-output implicants describing the transition and output functions of an FSM. The method is based on a new concept of graph: the Graph of Excitations and Outputs. The proposed algorithm was tested using FSM benchmarks, and the obtained results were compared with the classical technology mapping of FSMs.
Robust Control Analysis of Hydraulic Turbine Speed
NASA Astrophysics Data System (ADS)
Jekan, P.; Subramani, C.
2018-04-01
An effective control strategy for the hydro-turbine governor in real-time scenarios is the objective of this paper. Considering the complex dynamic characteristics and the uncertainty of the hydro-turbine governor model, and taking the static and dynamic performance of the governing system as the ultimate goal, the designed logic combines classical PID control theory with artificial intelligence to obtain the desired output. The controller employs a variable control technique, so its parameters can be adaptively adjusted according to information about the control error signal.
An efficient quantum circuit analyser on qubits and qudits
NASA Astrophysics Data System (ADS)
Loke, T.; Wang, J. B.
2011-10-01
This paper presents a highly efficient decomposition scheme and its associated Mathematica notebook for the analysis of complicated quantum circuits comprised of single/multiple qubit and qudit quantum gates. In particular, this scheme reduces the evaluation of multiple unitary gate operations with many conditionals to just two matrix additions, regardless of the number of conditionals or gate dimensions. This improves significantly the capability of a quantum circuit analyser implemented in a classical computer. This is also the first efficient quantum circuit analyser to include qudit quantum logic gates.
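The claim that conditional gate evaluation reduces to two matrix additions echoes the standard projector decomposition of a controlled gate. The sketch below is illustrative of that general principle (it is not the paper's Mathematica notebook): a gate U conditioned on a control qubit is the sum of two terms, CU = |0⟩⟨0|⊗I + |1⟩⟨1|⊗U, independent of U's dimension.

```python
import numpy as np

# Illustrative projector decomposition of a controlled gate (not the paper's
# notebook): the conditional costs one kron per branch plus a single matrix
# addition, regardless of how many dimensions the target gate U has.

def controlled(U):
    P0 = np.array([[1, 0], [0, 0]], dtype=complex)  # projector onto |0>: leave target alone
    P1 = np.array([[0, 0], [0, 1]], dtype=complex)  # projector onto |1>: apply U to target
    I = np.eye(U.shape[0], dtype=complex)
    return np.kron(P0, I) + np.kron(P1, U)

X = np.array([[0, 1], [1, 0]], dtype=complex)  # Pauli-X (NOT)
CNOT = controlled(X)  # the familiar controlled-NOT, control on the first qubit
```

The same two-term form extends to qudit controls by using higher-dimensional projectors, which is the kind of uniformity that makes an efficient analyser possible.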
Comparing Future Options for Human Space Flight
NASA Technical Reports Server (NTRS)
Sherwood, Brent
2010-01-01
The paper analyzes the "value proposition" for government-funded human space flight, a vexing question that persistently dogs efforts to justify its $10^10/year expense in the U.S. The original Mercury/Gemini/Apollo value proposition is not valid today. Neither was it the value proposition actually promoted by von Braun, which the post-Apollo 80% of human space flight history has persistently attempted to fulfill. Divergent potential objectives for human space flight are captured in four strategic options - Explore Mars; accelerate Space Passenger Travel; enable Space Power for Earth; and Settle the Moon - which are then analyzed for their Purpose, societal Myth, Legacy benefits, core Needs, and result as measured by the number and type of humans they would fly in space. This simple framework is proposed as a way to support productive dialogue with public and other stakeholders, to determine a sustainable value proposition for human space flight.
Walsh, Clare R; Johnson-Laird, P N
2009-07-01
When individuals detect an inconsistency in a set of propositions, they tend to change their minds about at least one proposition to resolve the inconsistency. The orthodox view from William James (1907) onward has been that a rational change should be minimal. We propose an alternative hypothesis according to which individuals seek to resolve inconsistencies by explaining their origins. We report four experiments corroborating the explanatory hypothesis. Experiment 1 showed that participants' explanations revised general conditional claims rather than specific categorical propositions. Experiment 2 showed that, when explanations did revise the categorical proposition, participants also tended to deny the consequences of a second generalization. Experiment 3 showed that this tendency persists when participants previously affirmed these consequences explicitly. Experiment 4 showed that, when participants could easily explain an inconsistency by revising a generalization, they were more likely to accept the consequences of a second generalization. All four results contravene minimalism but support the explanatory hypothesis.
PROMISES THEY CAN KEEP: LOW-INCOME WOMEN’S ATTITUDES TOWARD MOTHERHOOD, MARRIAGE, AND DIVORCE
Cherlin, Andrew; Cross-Barnet, Caitlin; Burton, Linda M.; Garrett-Peters, Raymond
2009-01-01
Using survey data on low-income mothers in Boston, Chicago, and San Antonio (n = 1,722) supplemented with ethnographic data, we test 3 propositions regarding mothers’ attitudes toward childbearing, marriage, and divorce. These are drawn from Edin & Kefalas (2005) but have also arisen in other recent studies. We find strong support for the proposition that childbearing outside of marriage carries little stigma, limited support for the proposition that women prefer to have children well before marrying, and almost no support for the proposition that women hesitate to marry because they fear divorce. We suggest that mothers’ attitudes and preferences in these 3 domains do not support the long delay between childbearing and marriage that has been noted in the literature. Throughout, we are able to study attitudes among several Hispanic groups as well as among African Americans and non-Hispanic Whites. PMID:19885381
Comparing future options for human space flight
NASA Astrophysics Data System (ADS)
Sherwood, Brent
2011-09-01
The paper analyzes the "value proposition" for government-funded human space flight, a vexing question that persistently dogs efforts to justify its $10^10/year expense in the US. The original Mercury/Gemini/Apollo value proposition is not valid today. Neither was it the value proposition actually promoted by von Braun, which the post-Apollo 80% of human space flight history has persistently attempted to fulfill. Divergent potential objectives for human space flight are captured in four strategic options (Explore Mars; accelerate Space Passenger Travel; enable Space Power for Earth; and Settle the Moon), which are then analyzed for their purpose, societal myth, legacy benefits, core needs, and result as measured by the number and type of humans they would fly in space. This simple framework is proposed as a way to support productive dialog with public and other stakeholders, to determine a sustainable value proposition for human space flight.
Consumer demand as a driver of improved working conditions: the 'Ergo-Brand' proposition.
Neumann, W Patrick; Dixon, Shane M; Nordvall, Anna-Carin
2014-01-01
This article develops and explores the 'Ergo-Brand' proposition, which posits that consumers may prefer to buy goods that are made under good working conditions (GWCs). This preference would enhance a differentiation strategy for companies, thereby fostering the application of ergonomics in production. This proposition is developed in the context of a narrative review of the literature on 'ethical consumerism'. This is supplemented with a survey study, conducted in both Canada and Sweden (n = 141) to explore this proposition. Results indicate that consumers would prefer goods made under GWCs, but not unconditionally as quality and price concerns were ranked higher. Access to information on the working conditions in production was seen as a barrier. Nevertheless, the Ergo-Brand concept may be a viable avenue in promoting attention towards ergonomics in companies - particularly if consumer habits are subject to intervention by advertising. Further research on this strategy is warranted.
Fuzzy and process modelling of contour ridge water dynamics
NASA Astrophysics Data System (ADS)
Mhizha, Alexander; Ndiritu, John
2018-05-01
Contour ridges are an in-situ rainwater harvesting technology developed initially for soil erosion control but now also widely promoted for rainwater harvesting. The effectiveness of contour ridges depends on geophysical, hydro-climatic and socio-economic factors that are highly varied in time and space. Furthermore, field-scale data on these factors are often unavailable. This, together with the complexity of hydrological processes at field scale, limits the application of classical distributed process modelling to highly-instrumented experimental fields. This paper presents a framework that combines fuzzy logic and a process-based approach for modelling contour ridges for rainwater harvesting where detailed field data are not available. Water balance for a representative contour-ridged field incorporating the water flow processes across the boundaries is integrated with fuzzy logic to incorporate the uncertainties in estimating runoff. The model is tested using data collected during the 2009/2010 and 2010/2011 rainfall seasons from two contour-ridged fields in Zhulube, located in the semi-arid parts of Zimbabwe. The model is found to replicate soil moisture in the root zone reasonably well (NSE = 0.55 to 0.66 and PBIAS = -1.3 to 6.1%). The results show that combining fuzzy logic and process-based approaches can adequately model soil moisture in a contour-ridged field and could help to assess the water dynamics in such fields.
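The two goodness-of-fit statistics quoted in the abstract, NSE (Nash-Sutcliffe efficiency) and PBIAS (percent bias), have standard definitions that take only a few lines to compute. A minimal sketch; the sample soil-moisture series below is illustrative, not data from the paper:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus (sum of squared errors /
    variance of observations about their mean). 1.0 is a perfect fit;
    values below 0 mean the model is worse than predicting the mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def pbias(obs, sim):
    """Percent bias: positive values indicate model underestimation."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

obs = [0.20, 0.25, 0.30, 0.28, 0.22]  # observed root-zone moisture (illustrative)
sim = [0.21, 0.24, 0.29, 0.27, 0.23]  # simulated values (illustrative)
fit_nse = nse(obs, sim)
fit_pbias = pbias(obs, sim)
```

On the paper's scale, where NSE = 0.55 to 0.66 counts as a reasonable fit, these statistics are what the model evaluation reduces to.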
Customer value propositions in business markets.
Anderson, James C; Narus, James A; van Rossum, Wouter
2006-03-01
Examples of customer value propositions that resonate with customers are exceptionally difficult to find. When properly constructed, value propositions force suppliers to focus on what their offerings are really worth. Once companies become disciplined about understanding their customers, they can make smarter choices about where to allocate scarce resources. The authors illuminate the pitfalls of current approaches, then present a systematic method for developing value propositions that are meaningful to target customers and that focus suppliers' efforts on creating superior value. When managers construct a customer value proposition, they often simply list all the benefits their offering might deliver. But the relative simplicity of this all-benefits approach may have a major drawback: benefit assertion. In other words, managers may claim advantages for features their customers don't care about in the least. Other suppliers try to answer the question, Why should our firm purchase your offering instead of your competitor's? But without a detailed understanding of the customer's requirements and preferences, suppliers can end up stressing points of difference that deliver relatively little value to the target customer. The pitfall with this approach is value presumption: assuming that any favorable points of difference must be valuable for the customer. Drawing on the best practices of a handful of suppliers in business markets, the authors advocate a resonating focus approach. Suppliers can provide simple, yet powerfully captivating, customer value propositions by making their offerings superior on the few elements that matter most to target customers, demonstrating and documenting the value of this superior performance, and communicating it in a way that conveys a sophisticated understanding of the customer's business priorities.
Indoor Air Quality & Preventive Maintenance Value Proposition Worksheet
Part of our outreach and education to our webinar registrants is providing them a copy of the Value Proposition Worksheet that accompanies the webinar topic. This particular webinar focuses on IAQ and preventive maintenance.
Commonalities between Perception and Cognition.
Tacca, Michela C
2011-01-01
Perception and cognition are highly interrelated. Given the influence that these systems exert on one another, it is important to explain how perceptual representations and cognitive representations interact. In this paper, I analyze the similarities between visual perceptual representations and cognitive representations in terms of their structural properties and content. Specifically, I argue that the spatial structure underlying visual object representation displays systematicity - a property that is considered to be characteristic of propositional cognitive representations. To this end, I propose a logical characterization of visual feature binding as described by Treisman's Feature Integration Theory and argue that systematicity is not only a property of language-like representations, but also of spatially organized visual representations. Furthermore, I argue that if systematicity is taken to be a criterion to distinguish between conceptual and non-conceptual representations, then visual representations, that display systematicity, might count as an early type of conceptual representations. Showing these analogies between visual perception and cognition is an important step toward understanding the interface between the two systems. The ideas here presented might also set the stage for new empirical studies that directly compare binding (and other relational operations) in visual perception and higher cognition. PMID:22144974
Control and accountability in the NHS market: a practical proposition or logical impossibility?
Glynn, J J; Perkins, D
1998-01-01
Before the imposition of the NHS internal market, systems of accountability and control were far from adequate and could be criticized on a number of grounds. The market was offered as a panacea to address these inadequacies. However, in practice there have only been partial improvements which could have been achieved without the imposition of the market. The market also creates new problems and a number of crises and scandals seem to be addressed at the political level by pleas to utilize resources more effectively. These pleas mean that more and more the focus is turning back to central planning in the provision of care and further away from so-called market mechanisms. The NHS "managed" market has been imperfect and will continue to be so. Argues that there is no alternative but to return to the planned provision of health care in order to improve on accountability and control in the NHS. Hopefully the adverse impact of the market on clinicians and others will force a more rational reappraisal of the fundamental raison d'être of the NHS and the need for those involved in the delivery of services, at all levels, to be more openly accountable.
Ethics, economics, and public financing of health care
Hurley, J.
2001-01-01
There is a wide variety of ethical arguments for public financing of health care that share a common structure built on a series of four logically related propositions regarding: (1) the ultimate purpose of a human life or human society; (2) the role of health and its distribution in society in advancing this ultimate purpose; (3) the role of access to or utilisation of health care in maintaining or improving the desired level and distribution of health among members of society, and (4) the role of public financing in ensuring the ethically justified access to and utilisation of health care by members of society. This paper argues that economics has much to contribute to the development of the ethical foundations for publicly financed health care. It focuses in particular on recent economic work to clarify the concepts of access and need and their role in analyses of the just distribution of health care resources, and on the importance of economic analysis of health care and health care insurance markets in demonstrating why public financing is necessary to achieve broad access to and utilisation of health care services. Key Words: Ethics • economics • health care financing PMID:11479353
Non-ad-hoc decision rule for the Dempster-Shafer method of evidential reasoning
NASA Astrophysics Data System (ADS)
Cheaito, Ali; Lecours, Michael; Bosse, Eloi
1998-03-01
This paper is concerned with the fusion of identity information through the use of statistical analysis rooted in Dempster-Shafer theory of evidence to provide automatic identification aboard a platform. An identity information process for a baseline Multi-Source Data Fusion (MSDF) system is defined. The MSDF system is applied to information sources which include a number of radars, IFF systems, an ESM system, and a remote track source. We use a comprehensive Platform Data Base (PDB) containing all the possible identity values that the potential target may take, and we use fuzzy-logic strategies that fuse subjective attribute information from the sensors and the PDB, making the derivation of target identity quicker, more precise, and accompanied by statistically quantifiable measures of confidence. The conventional Dempster-Shafer method lacks a formal basis upon which decisions can be made in the face of ambiguity. We define a non-ad-hoc decision rule based on the expected utility interval for pruning the 'unessential' propositions which would otherwise overload real-time data fusion systems. An example has been selected to demonstrate the implementation of our modified Dempster-Shafer method of evidential reasoning.
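Dempster's rule of combination, the core machinery the abstract builds on, fuses two mass functions by intersecting their focal elements and renormalizing away the conflicting mass. A minimal sketch; the two sensor reports below are illustrative, not taken from the paper:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: fuse two mass functions whose focal elements
    are frozensets of hypotheses; conflict (empty intersections) is
    discarded and the remaining mass renormalized."""
    fused, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass that would go to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are irreconcilable")
    k = 1.0 - conflict  # normalization constant
    return {s: w / k for s, w in fused.items()}

# Two hypothetical sensors reporting on target identity
radar = {frozenset({"friend"}): 0.6, frozenset({"friend", "hostile"}): 0.4}
esm   = {frozenset({"hostile"}): 0.3, frozenset({"friend", "hostile"}): 0.7}
fused = combine(radar, esm)
```

The decision rule the paper proposes then prunes propositions whose expected utility interval cannot compete, rather than deciding ad hoc; the combination step itself is the standard one shown here.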
Training a molecular automaton to play a game
NASA Astrophysics Data System (ADS)
Pei, Renjun; Matamoros, Elizabeth; Liu, Manhong; Stefanovic, Darko; Stojanovic, Milan N.
2010-11-01
Research at the interface between chemistry and cybernetics has led to reports of `programmable molecules', but what does it mean to say `we programmed a set of solution-phase molecules to do X'? A survey of recently implemented solution-phase circuitry indicates that this statement could be replaced with `we pre-mixed a set of molecules to do X and functional subsets of X'. These hard-wired mixtures are then exposed to a set of molecular inputs, which can be interpreted as being keyed to human moves in a game, or as assertions of logical propositions. In nucleic acids-based systems, stemming from DNA computation, these inputs can be seen as generic oligonucleotides. Here, we report using reconfigurable nucleic acid catalyst-based units to build a multipurpose reprogrammable molecular automaton that goes beyond single-purpose `hard-wired' molecular automata. The automaton covers all possible responses to two consecutive sets of four inputs (such as four first and four second moves for a generic set of trivial two-player two-move games). This is a model system for more general molecular field programmable gate array (FPGA)-like devices that can be programmed by example, which means that the operator need not have any knowledge of molecular computing methods.
Two Theories Are Better Than One
NASA Astrophysics Data System (ADS)
Jones, Robert
2008-03-01
All knowledge is of an approximate character (B. Russell, Human Knowledge, 1948, pg 497 and 507). Our formalisms abstract, idealize, and simplify (R. L. Epstein, Propositional Logics, 2001, Ch XI and E. Bender, An Intro. to Math. Modeling, 1978, pg v and 2). Each formalism is an idealization, often times approximating in its own DIFFERENT ways, each offering somewhat different coverage of the domain. Having MULTIPLE overlapping theories of a knowledge domain is then better than having just one theory (R. Jones, APS general meeting, April 2004). Theories are not unique (T. M. Mitchell, Machine Learning, 1997, pg 65-66 and Cooper, Machine Learning, vol. 9, 1992, pg 319). In the future every field will possess multiple theories of its domain and scientific work and engineering will be performed based on the ensemble predictions of ALL of these. In some cases the theories may be quite divergent, differing greatly one from the other. This idea can be considered an extension of Bohr's notion of complementarity, ``...different experimental arrangements...described by different physical concepts...together and only together exhaust the definable information we can obtain about the object.'' (H. J. Folse, The Philosophy of Niels Bohr, 1985, pg 238)
NASA Technical Reports Server (NTRS)
Folta, David; Young, Corissa; Ross, Adam
2001-01-01
The purpose of this investigation is to determine the feasibility of attaining and maintaining unique non-Keplerian orbit vantage locations in the Earth/Moon environment in order to obtain continuous scientific measurements. The principal difficulty associated with obtaining continuous measurements is the temporal nature of astrodynamics, i.e., classical orbits. This investigation demonstrates advanced trajectory designs that meet demanding science requirements which cannot be met following traditional orbital mechanics logic. Examples of continuous observer missions addressed include Earth pole-sitters and unique vertical libration orbits that address Sun-Earth Connection and Earth Science Vision roadmaps.
Models of dyadic social interaction.
Griffin, Dale; Gonzalez, Richard
2003-01-01
We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382
From conditional oughts to qualitative decision theory
NASA Technical Reports Server (NTRS)
Pearl, Judea
1994-01-01
The primary theme of this investigation is a decision theoretic account of conditional ought statements (e.g., 'You ought to do A, if C') that rectifies glaring deficiencies in classical deontic logic. The resulting account forms a sound basis for qualitative decision theory, thus providing a framework for qualitative planning under uncertainty. In particular, we show that adding causal relationships (in the form of a single graph) as part of an epistemic state is sufficient to facilitate the analysis of action sequences, their consequences, their interaction with observations, their expected utilities, and the synthesis of plans and strategies under uncertainty.
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.
1989-01-01
It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
Simple analytical model of a thermal diode
NASA Astrophysics Data System (ADS)
Kaushik, Saurabh; Kaushik, Sachin; Marathe, Rahul
2018-05-01
Recently, much attention has been given to manipulating heat by constructing thermal devices such as thermal diodes, transistors and logic gates. Many of the proposed models have an asymmetry which leads to the desired effect. The presence of non-linear interactions among the particles is also essential. However, such models lack analytical understanding. Here we propose a simple, analytically solvable model of a thermal diode. Our model consists of classical spins in contact with multiple heat baths and constant external magnetic fields. Interestingly, the magnetic field is the only parameter required to get the effect of heat rectification.
ERIC Educational Resources Information Center
Hayes-Roth, Barbara
Two kinds of memory organization are distinguished: segregrated versus integrated. In segregated memory organizations, related learned propositions have separate memory representations. In integrated memory organizations, memory representations of related propositions share common subrepresentations. Segregated memory organizations facilitate…
The foundation of Piaget's theories: mental and physical action.
Beilin, H; Fireman, G
1999-01-01
Piaget's late theory of action and action implication was the realization of a long history of development. A review of that history shows the central place of action in all of his theoretical assertions, despite the waxing and waning of other important features of his theories. Action was said to be the primary source of knowledge with perception and language in secondary roles. Action is for the most part not only organized but there is logic in action. Action, which is at first physical, becomes internalized and transformed into mental action and mental representation, largely in the development of the symbolic or semiotic function in the sensorimotor period. A number of alternative theories of cognitive development place primary emphasis on mental representation. Piaget provided it with an important place as well, but subordinated it to mental action in the form of operations. In this, as Russell claims, he paralleled Schopenhauer's distinction between representation and will. Piaget's theory of action was intimately related to the gradual development of intentionality in childhood. Intentions were tied to actions by way of the conscious awareness of goals and the means to achieve them. Mental action, following the sensorimotor period, was limited in its logical form to semilogical or one-way functions. These forms were said by Piaget to lack logical reversibility, which was achieved only in the sixth or seventh year, in concrete operations. Mental action was not to be fully realized until the development of formal operations, with hypothetical reasoning, in adolescence, according to the classical Piagetian formulation. This view of the child's logical development, which relied heavily on truth-table (extensional) logic, underwent a number of changes. First from the addition of other logics: category theory and the theory of functions among them. In his last theory, however, an even more radical change occurred. With the collaboration of R. 
Garcia, he proposed a logic of meanings that would require a recasting of his earlier truth-table-based operatory logic that he claimed explained the development of logical thought and problem solving. The new logic of meanings, influenced by Anderson and Belnap's (1975) logic of entailment, placed new emphasis on inferential processes in the sensorimotor period, introduced protological forms in the actions of the very young child, and proposed that knowledge has an inferential dimension. The consequence was that the late theory shifted emphasis to intentional (qualitative) logic and meaning from the earlier extensional (quantitative) logic and truth testing. The profound changes in Piaget's late theory require a serious reevaluation of Piaget's entire corpus of research and theory; a task which is yet to be done. Seen in a new light, the late theory is much closer to intellectual currents associated with hermeneutic and semiotic traditions in their concern with meaning and interpretation and less, if at all, with truth. This, despite Piaget's couching of the new theory in a logical mode. The late theory added significant new elements to the theory of action and action-implication, and suggests that Piaget's, and his collaborators', new research data, which were interpreted within the new theoretical framework, require corroboration and review. The question as to whether Piaget's assertions are at root metaphorical and lack psychological reality, which has followed his theories from their earliest days, arises as well with the assertions of the late theory. Possibly, even more so, since even a limited historical review of his theories points to a considerable concurrence between changes in the fundamental assumptions of his theories and intellectual currents of the times. In hindsight, Piaget's theories appear as "works in progress," down to his last theory. Yet, even in the end, he charted the direction of possible further progress.
Dynamic Network-Based Epistasis Analysis: Boolean Examples
Azpeitia, Eugenio; Benítez, Mariana; Padilla-Longoria, Pablo; Espinosa-Soto, Carlos; Alvarez-Buylla, Elena R.
2011-01-01
In this article we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the inference of gene regulatory networks. Here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions, however, as originally proposed by Bateson, epistasis is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus (herein, classical epistasis). Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets are becoming available, the analysis of classical epistasis is being complemented with computer science tools and system biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, the access to relevant information and the correct inference of gene interaction topologies is hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, rely on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from the classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis.
Our article complements previous accounts, not only by focusing on the implications of the hierarchical and single-path assumption, but also by demonstrating the importance of considering temporal dynamics, and specifically introducing the usefulness of Boolean network models and also reviewing some key properties of network approaches. PMID:22645556
Semantic Information and the Syntax of Propositional Attitude Verbs
ERIC Educational Resources Information Center
White, Aaron S.; Hacquard, Valentine; Lidz, Jeffrey
2018-01-01
Propositional attitude verbs, such as "think" and "want," have long held interest for both theoretical linguists and language acquisitionists because their syntactic, semantic, and pragmatic properties display complex interactions that have proven difficult to fully capture from either perspective. This paper explores the…
Intra-Organizational Conflict in Schools.
ERIC Educational Resources Information Center
Wynn, Richard
There is no abundance of research on intra-organizational conflict, and there are no simple answers to the tricky business of managing organizational conflicts. This paper states some propositions about conflict and suggests some management stratagems that can be used in sustaining constructive organizational characteristics. The propositions are…
Value and Performance in the IT Society.
ERIC Educational Resources Information Center
Bryson, Jo
This paper discusses valuing information and its supporting technologies in the global environment. Different value propositions are explored from a financial, social, cultural, political, economic, corporate, and personal values perspective. Various means of measuring the relevancy of these value propositions to the individual, organization or…
Evaluation of Career Development Programs from an Action Perspective.
ERIC Educational Resources Information Center
Young, Richard A.; Valach, Ladislav
1994-01-01
Presents action-theoretical approach to evaluation of career development programs based on constructionist epistemology. Propositions from action-theoretical perspective center around career and action as related, interpretative constructs. Propositions give rise to implications for evaluation of career programs that address ongoing nature of…
Hu, Xiaoqing; Gawronski, Bertram; Balas, Robert
2017-01-01
Evaluative conditioning (EC) is defined as the change in the evaluation of a conditioned stimulus (CS) due to its pairing with a valenced unconditioned stimulus (US). According to propositional accounts, EC effects should be qualified by the relation between the CS and the US. Dual-process accounts suggest that relational information should qualify EC effects on explicit evaluations, whereas implicit evaluations should reflect the frequency of CS-US co-occurrences. Experiments 1 and 2 showed that, when relational information was provided before the encoding of CS-US pairings, it moderated EC effects on explicit, but not implicit, evaluations. In Experiment 3, relational information moderated EC effects on both explicit and implicit evaluations when it was provided simultaneously with CS-US pairings. Frequency of CS-US pairings had no effect on implicit evaluations. Although the results can be reconciled with both propositional and dual-process accounts, they are more parsimoniously explained by propositional accounts.
A Bayesian account of quantum histories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marlow, Thomas
2006-05-15
We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities.' The non-standard approach consists of defining multi-time measurements to consist of sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely, the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.
Leveraging the real value of laboratory medicine with the value proposition.
Price, Christopher P; John, Andrew St; Christenson, Robert; Scharnhorst, Volker; Oellerich, Michael; Jones, Patricia; Morris, Howard A
2016-11-01
Improving quality and patient safety, containing costs and delivering value-for-money are the key drivers of change in the delivery of healthcare and have stimulated a shift from an activity-based service to a service based on patient-outcomes. The delivery of an outcomes-based healthcare agenda requires that the real value of laboratory medicine to all stakeholders be understood, effectively defined and communicated. The value proposition of any product or service is the link between the provider and the needs of the customer describing the utility of the product or service in terms of benefit to the customer. The framework of a value proposition for laboratory medicine provides the core business case that drives key activities in the evolution and maintenance of high quality healthcare from research through to adoption and quality improvement in an established service. The framework of a value proposition for laboratory medicine is described. The content is endorsed by IFCC and WASPaLM. Copyright © 2016 Elsevier B.V. All rights reserved.
Modified Dempster-Shafer approach using an expected utility interval decision rule
NASA Astrophysics Data System (ADS)
Cheaito, Ali; Lecours, Michael; Bosse, Eloi
1999-03-01
The combination operation of the conventional Dempster-Shafer algorithm has a tendency to increase exponentially the number of propositions involved in bodies of evidence by creating new ones. The aim of this paper is to explore a 'modified Dempster-Shafer' approach of fusing identity declarations emanating from different sources, which include a number of radars, IFF and ESM systems, in order to limit the explosion of the number of propositions. We use a non-ad hoc decision rule based on the expected utility interval to select the most probable object in a comprehensive Platform Data Base containing all the possible identity values that a potential target may take. We study the effect of the redistribution of the confidence levels of the eliminated propositions which otherwise overload the real-time data fusion system; these eliminated confidence levels can in particular be assigned to ignorance, or uniformly added to the remaining propositions and to ignorance. A scenario has been selected to demonstrate the performance of our modified Dempster-Shafer method of evidential reasoning.
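A minimal sketch of the two operations the abstract describes: Dempster's rule of combination for two mass functions over a frame of discernment, and a pruning step that reassigns the mass of eliminated propositions to ignorance (one of the redistribution options the paper studies). The frame, the source names, and all numbers are illustrative assumptions, not the paper's scenario.

```python
def combine(m1, m2):
    """Dempster's rule: intersect focal elements, renormalize away conflict."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb  # mass assigned to the empty set
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

def prune(m, frame, keep):
    """Drop propositions not contained in `keep`; give their mass to ignorance."""
    pruned = {s: w for s, w in m.items() if s <= keep or s == frame}
    lost = 1.0 - sum(pruned.values())
    pruned[frame] = pruned.get(frame, 0.0) + lost
    return pruned

# Hypothetical identity frame and two sensor reports (radar, ESM).
frame = frozenset({"fighter", "bomber", "airliner"})
m_radar = {frozenset({"fighter", "bomber"}): 0.7, frame: 0.3}
m_esm = {frozenset({"fighter"}): 0.6, frame: 0.4}
fused = combine(m_radar, m_esm)
```

Pruning `fused` down to the single proposition {"fighter"} moves the 0.28 mass of the eliminated {"fighter", "bomber"} proposition onto ignorance, which is the "assign to ignorance" variant discussed in the abstract.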
NASA Astrophysics Data System (ADS)
Gyenis, Balázs
2017-02-01
We investigate Maxwell's attempt to justify the mathematical assumptions behind his 1860 Proposition IV, according to which the velocity components of colliding particles follow the normal distribution. Contrary to the commonly held view, we find that his molecular collision model plays a crucial role in reaching this conclusion, and that his model assumptions also permit inference to equalization of mean kinetic energies (temperatures), which is what he intended to prove in his discredited and widely ignored Proposition VI. If we take a charitable reading of his own proof of Proposition VI, then it was Maxwell, and not Boltzmann, who gave the first proof of a tendency towards equilibrium, a sort of H-theorem. We also call attention to a potential conflation of notions of probabilistic and value independence in relevant prior works of his contemporaries and of his own, and argue that this conflation might have impacted his adoption of the suspect independence assumption of Proposition IV.
Efficient Web Services Policy Combination
NASA Technical Reports Server (NTRS)
Vatan, Farrokh; Harman, Joseph G.
2010-01-01
Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions do not have the ability to communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies to bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults where one organization proposes a rule on a particular feature, and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have inherent inefficiency, which heavily restricts their practical applications. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows an automatic and scalable composition of security policies between multiple organizations. The algorithm relies on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is when policies are represented in defeasible logic, and composition is based on rules for non-monotonic inference. In this system, policy writers construct meta-policies describing both the policy that they wish to enforce and annotations describing their composition preferences.
These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take precedence. Meta-policies are specified in defeasible logic, a computationally efficient non-monotonic logic developed to model human reasoning. One drawback of this method is that at one point the algorithm starts an exhaustive search of all subsets of the set of conclusions of a defeasible theory. Although propositional defeasible logic has linear complexity, the set of conclusions here may be large, especially in real-life practical cases. This phenomenon leads to an inefficient exponential explosion of complexity. The current process of getting a Web security policy from the combination of two meta-policies consists of two steps. The first is generating a new meta-policy that is a composition of the input meta-policies, and the second is mapping the meta-policy onto a security policy. The new algorithm avoids the exhaustive search in the current algorithm, and provides a security policy that matches all requirements of the involved meta-policies.
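The priority-resolution idea underlying this kind of policy composition can be sketched in a few lines. This is an illustrative toy, not the paper's defeasible-logic algorithm: each rule proposes a verdict for an action, the highest-priority matching rule wins when rules from different organizations conflict, and a default applies when no rule fires (the "expresses no rule" case). All rule names and fields are assumptions.

```python
def decide(action, rules, default="deny"):
    """Return the verdict of the highest-priority rule matching `action`,
    or `default` if no organization expressed a rule for it."""
    matching = [r for r in rules if r["action"] == action]
    if not matching:
        return default
    return max(matching, key=lambda r: r["priority"])["verdict"]

# Two hypothetical organizations contribute conflicting rules on "read";
# only one rules on "write"; neither rules on "delete".
rules = [
    {"action": "read", "verdict": "permit", "priority": 1},   # org A
    {"action": "read", "verdict": "deny", "priority": 5},     # org B overrides
    {"action": "write", "verdict": "permit", "priority": 2},  # org A only
]
```

Here `decide("read", rules)` yields "deny" because org B's rule outranks org A's, while "delete" falls through to the default, mirroring how priorities and defaults resolve inter-domain conflicts.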
The Political Spectacle of Arizona's Proposition 203
ERIC Educational Resources Information Center
Wright, Wayne E.
2005-01-01
Arizona's Proposition 203 places restrictions on bilingual and English-as-a-second-language programs and essentially mandates English-only education for English language learners (ELLs). This article provides an analysis of this initiative and the wide variations in its interpretation and implementation. Data sources include official policy and…
ERIC Educational Resources Information Center
Benefield, K. Elaine; Capie, William
1976-01-01
A group of students from grades four through twelve were tested on ten binary operations in four truth conditions. It was found that propositional operations which had greater inclusiveness or breadth of concepts were more difficult to comprehend. (MLH)
ERIC Educational Resources Information Center
Massialas, Byron G.
1975-01-01
This paper identifies some important propositions, which issue from fourteen studies of political socialization, points to research gaps, and draws implications for the planning of political education programs in schools. (Author/RK)
MOOCs: Branding, Enrollment, and Multiple Measures of Success
ERIC Educational Resources Information Center
Leeds, Elke M.; Cope, Jim
2015-01-01
KSU redefined the MOOC value proposition through collaboration of university leadership and faculty. The new proposition shifts measures of success beyond just course completion to include measures that benefit students, faculty, and the institution. Students benefitted through access to open educational resources, the acquisition of professional…
Sex Differences, Positive Feedback and Intrinsic Motivation.
ERIC Educational Resources Information Center
Deci, Edward L.; And Others
The paper presents two experiments which test the "change in feelings of competence and self-determination" proposition of cognitive evaluation theory. This proposition states that when a person receives feedback about his performance on an intrinsically motivated activity, this information will affect his sense of competence and…
Management Planning and Control: Supporting Knowledge-Intensive Organizations
ERIC Educational Resources Information Center
Herremans, Irene M.; Isaac, Robert G.
2005-01-01
Purpose: The purpose of this paper is to develop propositions for empirical validation regarding appropriate management planning and control systems (MPACS) in knowledge-intensive organizations. Design/methodology/approach: The propositions were developed from interviews with members of a knowledge-intensive virtual organization that is known for…
Constructing Matching Texts in Two Languages: The Application of Propositional Analysis.
ERIC Educational Resources Information Center
Valdes, Guadalupe; And Others
1984-01-01
Discusses how current procedures for selecting/constructing equivalent texts may lead to error because of their specific limitations; proposes the utilization of micro-propositional analysis coupled with word-frequency lists and readability formulas for constructing "matching" texts; presents some procedures which researchers working in…
The HPT Value Proposition in the Larger Improvement Arena.
ERIC Educational Resources Information Center
Wallace, Guy W.
2003-01-01
Discussion of human performance technology (HPT) emphasizes the key variable, which is the human variable. Highlights include the Ishikawa Diagram; human performance as one variable of process performance; collaborating with other improvement approaches; value propositions; and benefits to stakeholders, including real return on investments. (LRW)
Bomble, L; Lavorel, B; Remacle, F; Desouter-Lecomte, M
2008-05-21
Following the scheme recently proposed by Remacle and Levine [Phys. Rev. A 73, 033820 (2006)], we investigate the concrete implementation of a classical full adder on two electronic states (X 1A1 and C 1B2) of the SO2 molecule by optical pump-probe laser pulses using intuitive and counterintuitive (stimulated Raman adiabatic passage) excitation schemes. The resources needed for providing the inputs and reading out are discussed, as well as the conditions for achieving robustness in both the intuitive and counterintuitive pump-dump sequences. The fidelity of the scheme is analyzed with respect to experimental noise and two kinds of perturbations: The coupling to the neighboring rovibrational states and a finite rotational temperature that leads to a mixture for the initial state. It is shown that the logic processing of a full addition cycle can be realistically experimentally implemented on a picosecond time scale while the readout takes a few nanoseconds.
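The classical full adder that the paper implements optically has a fixed Boolean truth table; a minimal software rendering of that logic is given below for reference (the molecular pump-probe implementation itself is of course not reproducible in code).

```python
def full_adder(a, b, cin):
    """One-bit full adder: returns (sum, carry_out) for bits a, b, carry_in."""
    s = a ^ b ^ cin                    # sum bit
    cout = (a & b) | (cin & (a ^ b))   # carry-out bit
    return s, cout
```

For every input combination the pair (sum, carry) encodes a + b + cin, which is the full addition cycle the molecular scheme realizes on a picosecond time scale.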
Farber, Barry A
2007-09-01
Carl Rogers' 1957 paper (see record 2007-14639-002) is arguably the most successful of his many attempts to clarify and render testable the ideas behind client-centered therapy. While each of the conditions that Rogers postulated has been linked to positive therapeutic outcome, taken together they have never been conclusively proved (nor disproved) to be either necessary or sufficient for positive outcome. Nevertheless, the overriding "take-home" message in this classic paper--that the therapist's attitude and caring presence is critical for therapeutic success--is one that has had virtually unparalleled influence in every segment of the psychotherapeutic community. Clinical and theoretical innovations in the psychoanalytic community serve as examples of the following proposition: that Rogers' concepts, while accepted more than ever by a remarkably wide variety of psychotherapists, remain essentially unacknowledged as originating with him or in the tradition of humanistic and client-centered therapy. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Automated Proposition Density Analysis for Discourse in Aphasia.
Fromm, Davida; Greenhouse, Joel; Hou, Kaiyue; Russell, G Austin; Cai, Xizhen; Forbes, Margaret; Holland, Audrey; MacWhinney, Brian
2016-10-01
This study evaluates how proposition density can differentiate between persons with aphasia (PWA) and individuals in a control group, as well as among subtypes of aphasia, on the basis of procedural discourse and personal narratives collected from large samples of participants. Participants were 195 PWA and 168 individuals in a control group from the AphasiaBank database. PWA represented 6 aphasia types on the basis of the Western Aphasia Battery-Revised (Kertesz, 2006). Narrative samples were stroke stories for PWA and illness or injury stories for individuals in the control group. Procedural samples were from the peanut-butter-and-jelly-sandwich task. Language samples were transcribed using Codes for the Human Analysis of Transcripts (MacWhinney, 2000) and analyzed using Computerized Language Analysis (MacWhinney, 2000), which automatically computes proposition density (PD) using rules developed for automatic PD measurement by the Computerized Propositional Idea Density Rater program (Brown, Snodgrass, & Covington, 2007; Covington, 2007). Participants in the control group scored significantly higher than PWA on both tasks. PD scores were significantly different among the aphasia types for both tasks. Pairwise comparisons for both discourse tasks revealed that PD scores for the Broca's group were significantly lower than those for all groups except Transcortical Motor. No significant quadratic or linear association between PD and severity was found. Proposition density is differentially sensitive to aphasia type and most clearly differentiates individuals with Broca's aphasia from the other groups.
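The proposition density measure itself is simple: propositions per word, where CPIDR-style tools approximate propositions by counting verbs, adjectives, adverbs, prepositions, and conjunctions from part-of-speech tags. The sketch below is a simplified illustration of that idea, not the CLAN/CPIDR rule set; the tag inventory and the hand-tagged sample are assumptions.

```python
# POS tags counted as propositions (Universal Dependencies-style labels).
PROPOSITION_TAGS = {"VERB", "ADJ", "ADV", "ADP", "CCONJ", "SCONJ"}

def proposition_density(tagged_tokens):
    """tagged_tokens: list of (word, POS) pairs; returns propositions per word."""
    words = [w for w, tag in tagged_tokens if tag != "PUNCT"]
    props = [w for w, tag in tagged_tokens if tag in PROPOSITION_TAGS]
    return len(props) / len(words) if words else 0.0

# Hand-tagged toy fragment in the spirit of the sandwich task.
sample = [("I", "PRON"), ("quickly", "ADV"), ("spread", "VERB"),
          ("peanut", "NOUN"), ("butter", "NOUN"), ("on", "ADP"),
          ("fresh", "ADJ"), ("bread", "NOUN")]
```

On this fragment four of eight words carry propositions, giving a density of 0.5; real tools run a tagger over full transcripts and apply additional adjustment rules.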
Family Size, Interaction, Affect and Stress
ERIC Educational Resources Information Center
Nye, F. Ivan; And Others
1970-01-01
Synthesizes previous research on relationship of family size to attitudes. Reduces findings to four propositions and submits these propositions to additional tests utilizing secondary data from two large surveys. Substantively, families of three or four children rank lower in all of the analyses than do families with one or two children. Presented…
The Majority Rule Act. EdSource Election Brief: Proposition 26.
ERIC Educational Resources Information Center
EdSource, Inc., Palo Alto, CA.
This article summarizes "The Majority Rule Act for Smaller Classes, Safer Schools and Financial Accountability" (Proposition 26). The Majority Rule Act deals with the percentage vote that a school district, county office of education, or community college needs in an election to authorize local general-obligation bonds for school…
Supporting Parents through Parent Education. Building Community Systems for Young Children.
ERIC Educational Resources Information Center
Zepeda, Marlene; Morales, Alex
California's Proposition 10, the "Children and Families Act," has targeted three general areas for improvement in support of families and young children: improved family functioning, improved child development, and improved child health. Proposition 10 views parents as critical to the development of young children. Noting that parent…
Where Does Good Evidence Come from?
ERIC Educational Resources Information Center
Gorard, Stephen; Cook, Thomas
2007-01-01
This article started as a debate between the two authors. Both authors present a series of propositions about quality standards in education research. Cook's propositions, as might be expected, not only concern the importance of experimental trials for establishing the security of causal evidence, but also include some important practical and…
Dismantling Bilingual Education Implementing English Immersion: The California Initiative.
ERIC Educational Resources Information Center
Rossell, Christine H.
This study explored bilingual education in California, analyzing California law on instruction for English Learners before and after Proposition 227. Proposition 227 required that all English Learners (EL) participate in a sheltered English immersion program in which most instruction was in English with curriculum and presentation designed for…
ERIC Educational Resources Information Center
Pike, Gary R.
2006-01-01
Holland's theory of vocational preferences provides a powerful framework for studying students' college experiences. A basic proposition of Holland's theory is that individuals actively seek out and select environments that are congruent with their personality types. Although studies consistently support the self-selection proposition, they have…
Recent Social Movements and Theories of Power in America.
ERIC Educational Resources Information Center
McFarland, Andrew S.
A number of propositions about power in America--taken from the work of Olson, Lowi, McConnell, Schattschneider, and Edelman--are presented and discussed. These propositions comprise an alternative theory to pluralism, which is termed "plural elitism." But neither pluralism nor plural elitism explains the emergence and effects of the…
The Relationship between Mathematical Induction, Proposition Functions, and Implication Functions
ERIC Educational Resources Information Center
Andrew, Lane
2010-01-01
In this study, I explored the relationship between mathematical induction ability and proposition and implication functions through a mixed methods approach. Students from three universities (N = 78) and 6 classrooms completed a written assessment testing their conceptual and procedural capabilities with induction and functions. In addition, I…
Discriminantly Valid Personality Measures: Some Propositions. Research Bulletin No. 339.
ERIC Educational Resources Information Center
Jackson, Douglas N.
Starting with the premise that the construct-oriented approach is the only viable approach to personality assessment, this paper considers five propositions. First, a prerequisite to generalizable and valid psychometric measurement of personality rests on the choice of broad-based constructs with systematic univocal definitions. Next, measures…
Two Propositions on the Application of Point Elasticities to Finite Price Changes.
ERIC Educational Resources Information Center
Daskin, Alan J.
1992-01-01
Considers counterintuitive propositions about using point elasticities to estimate quantity changes in response to price changes. Suggests that elasticity increases with price along a linear demand curve, but the falling quantity demanded offsets it. Argues that point elasticity with finite percentage change in price only approximates percentage change…
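The first proposition can be checked numerically. The sketch below uses an assumed linear demand curve (the parameters are illustrative, not from the article): point elasticity grows in magnitude as price rises, and the arc (midpoint) elasticity over a finite price change lies between the point elasticities at the endpoints, which is why a single point elasticity only approximates the response to a finite move.

```python
def quantity(p):
    return 100.0 - 2.0 * p          # illustrative linear demand Q = 100 - 2P

def point_elasticity(p):
    return -2.0 * p / quantity(p)   # (dQ/dP) * P / Q

def arc_elasticity(p0, p1):
    """Midpoint (arc) formula for a finite move from p0 to p1."""
    q0, q1 = quantity(p0), quantity(p1)
    return ((q1 - q0) / ((q0 + q1) / 2)) / ((p1 - p0) / ((p0 + p1) / 2))

e_low = point_elasticity(20.0)      # -2/3 at the lower price
e_high = point_elasticity(25.0)     # -1.0 at the higher price
e_arc = arc_elasticity(20.0, 25.0)  # about -0.82, between the two
```

Using `e_low` alone to predict the effect of the full move from 20 to 25 therefore understates the responsiveness that actually prevails over the interval.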
Complex Knowledge Mastery: Some Propositions.
ERIC Educational Resources Information Center
Keller, Joyce A.; Schallert, Diane L.
The proposition that the mastery of complex tasks embodies several components was studied for 236 students in an undergraduate introductory financial accounting course. A new curriculum was developed for the course that included in-depth exposure to the actual financial statements of a company and the understanding of the structural relationships…
Cognitive Integrity Predicts Transitive Inference Performance Bias and Success
ERIC Educational Resources Information Center
Moses, Sandra N.; Villate, Christina; Binns, Malcolm A.; Davidson, Patrick S. R.; Ryan, Jennifer D.
2008-01-01
Transitive inference has traditionally been regarded as a relational proposition-based reasoning task; however, recent investigations question the validity of this assumption. Although some results support the use of a relational proposition-based approach, other studies find evidence for the use of associative learning. We examined whether…
Consensus among Economists--An Update
ERIC Educational Resources Information Center
Fuller, Dan; Geide-Stevenson, Doris
2014-01-01
In this article, the authors explore consensus among economists on specific propositions based on a fall 2011 survey of American Economic Association members. Results are based on 568 responses and provide evidence of changes in opinion over time by including propositions from earlier studies in 2000 (Fuller and Geide-Stevenson 2003) and 1992…
Towards bioelectronic logic (Conference Presentation)
NASA Astrophysics Data System (ADS)
Meredith, Paul; Mostert, Bernard; Sheliakina, Margarita; Carrad, Damon J.; Micolich, Adam P.
2016-09-01
One of the critical tasks in realising a bioelectronic interface is the transduction of ion and electron signals at high fidelity, and with appropriate speed, bandwidth and signal-to-noise ratio [1]. This is a challenging task considering ions and electrons (or holes) have drastically different physics. For example, even the lightest ions (protons) have mobilities much smaller than electrons in the best semiconductors, effective masses are quite different, and at the most basic level, ions are 'classical' entities and electrons 'quantum mechanical'. These considerations dictate materials and device strategies for bioelectronic interfaces alongside practical aspects such as integration and biocompatibility [2]. In my talk I will detail these 'differences in physics' that are pertinent to the ion-electron transduction challenge. From this analysis, I will summarise the basic categories of device architecture that are possibilities for transducing elements and give recent examples of their realisation. Ultimately, transducing elements need to be combined to create 'bioelectronic logic' capable of signal processing at the interface level. In this regard, I will extend the discussion past the single element concept, and discuss our recent progress in delivering all-solid-state logic circuits based upon transducing interfaces. [1] "Ion bipolar junction transistors", K. Tybrandt, K.C. Larsson, A. Richter-Dahlfors and M. Berggren, Proc. Natl Acad. Sci., 107, 9929 (2010). [2] "Electronic and optoelectronic materials and devices inspired by nature", P Meredith, C.J. Bettinger, M. Irimia-Vladu, A.B. Mostert and P.E. Schwenn, Reports on Progress in Physics, 76, 034501 (2013).
De Tiège, Alexis; Van de Peer, Yves; Braeckman, Johan; Tanghe, Koen B
2017-11-22
Although classical evolutionary theory, i.e., population genetics and the Modern Synthesis, was already implicitly 'gene-centred', the organism was, in practice, still generally regarded as the individual unit of which a population is composed. The gene-centred approach to evolution only reached a logical conclusion with the advent of the gene-selectionist or gene's eye view in the 1960s and 1970s. Whereas classical evolutionary theory can only work with (genotypically represented) fitness differences between individual organisms, gene-selectionism is capable of working with fitness differences among genes within the same organism and genome. Here, we explore the explanatory potential of 'intra-organismic' and 'intra-genomic' gene-selectionism, i.e., of a behavioural-ecological 'gene's eye view' on genetic, genomic and organismal evolution. First, we give a general outline of the framework and how it complements the (to some extent still) 'organism-centred' approach of classical evolutionary theory. Secondly, we give a more in-depth assessment of its explanatory potential for biological evolution, i.e., for Darwin's 'common descent with modification' or, more specifically, for 'historical continuity or homology with modular evolutionary change' as it has been studied by evolutionary developmental biology (evo-devo) during the last few decades. In contrast with classical evolutionary theory, evo-devo focuses on 'within-organism' developmental processes. Given the capacity of gene-selectionism to adopt an intra-organismal gene's eye view, we outline the relevance of the latter model for evo-devo. Overall, we aim for the conceptual integration between the gene's eye view on the one hand, and more organism-centred evolutionary models (both classical evolutionary theory and evo-devo) on the other.
Analysis of radiology business models.
Enzmann, Dieter R; Schomer, Donald F
2013-03-01
As health care moves to value orientation, radiology's traditional business model faces challenges to adapt. The authors describe a strategic value framework that radiology practices can use to best position themselves in their environments. This simplified construct encourages practices to define their dominant value propositions. There are 3 main value propositions that form a conceptual triangle, whose vertices represent the low-cost provider, the product leader, and the customer intimacy models. Each vertex has been a valid market position, but each demands specific capabilities and trade-offs. The underlying concepts help practices select value propositions they can successfully deliver in their competitive environments. Copyright © 2013 American College of Radiology. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Schwartz, Seth J.; Montgomery, Marilyn J.; Briones, Ervin
2006-01-01
The present paper advances theoretical propositions regarding the relationship between acculturation and identity. The most central thesis argued is that acculturation represents changes in cultural identity and that personal identity has the potential to "anchor" immigrant people during their transition to a new society. The article emphasizes…
Equal Opportunity in Higher Education: The Past and Future of California's Proposition 209
ERIC Educational Resources Information Center
Grodsky, Eric, Ed.; Kurlaender, Michal, Ed.
2010-01-01
This timely book examines issues pertaining to equal opportunity--affirmative action, challenges to it, and alternatives for improving opportunities for underrepresented groups--in higher education today. Its starting point is California's Proposition 209, which ended race-based affirmative action in public education and the workplace in 1996. The…
Language and Ageing--Exploring Propositional Density in Written Language--Stability over Time
ERIC Educational Resources Information Center
Spencer, Elizabeth; Craig, Hugh; Ferguson, Alison; Colyvas, Kim
2012-01-01
This study investigated the stability of propositional density (PD) in written texts, as this aspect of language shows promise as an indicator and as a predictor of language decline with ageing. This descriptive longitudinal study analysed written texts obtained from the Australian Longitudinal Study of Women's Health in which participants were…
ERIC Educational Resources Information Center
Whaley, Shannon; True, Laurie
The federal government's WIC program, the Special Supplemental Nutrition Program for Women, Infants and Children, is designed to improve the health and development of low-income women and young children. California's passage of Proposition 10, the "Children and Families First Act," has created a climate that encourages collaborative…
Idiosyncratic Deals: Testing Propositions on Timing, Content, and the Employment Relationship
ERIC Educational Resources Information Center
Rousseau, Denise M.; Hornung, Severin; Kim, Tai Gyu
2009-01-01
This study tests propositions regarding idiosyncratic deals (i-deals) in a sample of N = 265 hospital employees using structural equation modeling. Timing and content of idiosyncratic employment arrangements are postulated to have differential consequences for the nature of the employment relationship. Results confirm that i-deals made after hire…
ERIC Educational Resources Information Center
Libby, Roger W.; And Others
1978-01-01
Propositions concerned with reference group and role correlates of Ira Reiss' premarital sexual permissiveness theory were tested. Reiss' basic propositions are only partially supported. Closeness to mother's sexual standards is considerably more predictive of self-permissiveness than was obvious in Reiss' theory. Closeness to friends' and peers'…
Learning the Game of Formulating and Testing Hypotheses and Theories
ERIC Educational Resources Information Center
Maloney, David P.; Masters, Mark F.
2010-01-01
Physics is not immune to questioning by supporters of nonscientific propositions such as "intelligent design" and "creationism." The supporters of these propositions use phrases such as "it's just a theory" to influence those unfamiliar with or even fearful of science, making it increasingly important that all students and in particular science…
The Extended Parallel Process Model: Illuminating the Gaps in Research
ERIC Educational Resources Information Center
Popova, Lucy
2012-01-01
This article examines constructs, propositions, and assumptions of the extended parallel process model (EPPM). Review of the EPPM literature reveals that its theoretical concepts are thoroughly developed, but the theory lacks consistency in operational definitions of some of its constructs. Out of the 12 propositions of the EPPM, a few have not…
ERIC Educational Resources Information Center
Yamagami, Mai
2012-01-01
Using the frameworks of critical discourse analysis, representation theory, and legitimization theory, this study examines the political discourse of the campaign for Proposition 227 in California--particularly, the key social representations of languages, their speakers, and the main political actors in the campaign. The analysis examines the…
Online Concept Maps: Enhancing Collaborative Learning by Using Technology with Concept Maps.
ERIC Educational Resources Information Center
Canas, Alberto J.; Ford, Kenneth M.; Novak, Joseph D.; Hayes, Patrick; Reichherzer, Thomas R.; Suri, Niranjan
2001-01-01
Describes a collaborative software system that allows students from distant schools to share claims derived from their concept maps. Sharing takes place by accessing The Knowledge Soup, a repository of propositions submitted by students and stored on a computer server. Students can use propositions from other students to enhance their concept…
Automated Proposition Density Analysis for Discourse in Aphasia
ERIC Educational Resources Information Center
Fromm, Davida; Greenhouse, Joel; Hou, Kaiyue; Russell, G. Austin; Cai, Xizhen; Forbes, Margaret; Holland, Audrey; MacWhinney, Brian
2016-01-01
Purpose: This study evaluates how proposition density can differentiate between persons with aphasia (PWA) and individuals in a control group, as well as among subtypes of aphasia, on the basis of procedural discourse and personal narratives collected from large samples of participants. Method: Participants were 195 PWA and 168 individuals in a…
Control of Prose Processing via Instructional and Typographical Cues.
ERIC Educational Resources Information Center
Glynn, Shawn M.; Di Vesta, Francis J.
1979-01-01
College students studied text about an imaginary solar system. Two cuing systems were manipulated to induce a single or double set of cues consistent with one or two sets of text propositions, or no target propositions were specified. Cuing systems guided construction and implementation of prose-processing decision criteria. (Author/RD)
Dynamic geometry as a context for exploring conjectures
NASA Astrophysics Data System (ADS)
Wares, Arsalan
2018-01-01
The purpose of this paper is to provide examples of 'non-traditional' proof-related activities that can be explored in a dynamic geometry environment by university and high school students of mathematics. The propositions discussed were encountered in a dynamic geometry environment, and the author believes that teachers can ask their students to construct proofs for these propositions.
What Are Data? Museum Data Bank Research Report Number 1.
ERIC Educational Resources Information Center
Vance, David
This paper describes the process of automatic extraction of implicit--global--data from explicit information by file inversion and threading. Each datum is the symbolic representation of a proposition, and as such has a number of movable parts corresponding to the ideal elements of the proposition represented; e.g., subject, predicate. A third…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eyer, James M.; Schoenung, Susan M.
2008-02-01
The work documented in this report represents another step in the ongoing investigation of innovative and potentially attractive value propositions for electricity storage by the United States Department of Energy (DOE) and Sandia National Laboratories (SNL) Energy Storage Systems (ESS) Program. This study uses updated cost and performance information for modular energy storage (MES) developed for this study to evaluate four prospective value propositions for MES. The four potentially attractive value propositions are defined by a combination of well-known benefits that are associated with electricity generation, delivery, and use. The value propositions evaluated are: (1) transportable MES for electric utility transmission and distribution (T&D) equipment upgrade deferral and for improving local power quality, each in alternating years, (2) improving local power quality only, in all years, (3) electric utility T&D deferral in year 1, followed by electricity price arbitrage in following years, plus a generation capacity credit in all years, and (4) electric utility end-user cost management during times when peak and critical peak pricing prevail.
Leadership for primary health care research.
Pendleton, David
2012-10-01
Over the last decade, I have put together a new theory of leadership. This paper describes its four propositions, which are consistent with the research literature but which lead to conclusions that are not commonly held and seldom put into practice. The first proposition is a model describing the territory of leadership that is different from either the Leadership Qualities Framework, 2006 or the Medical Leadership Competency Framework, 2010, both of which have been devised specifically for the NHS (National Health Service). The second proposition concerns the ill-advised attempt of individuals to become expert in all aspects of leadership: complete in themselves. The third suggests how personality and capability are related. The fourth embraces and recommends the notion of complementary differences among leaders. As the NHS seeks increasing leadership effectiveness, these propositions may need to be considered and their implications woven into the fabric of NHS leader selection and development. Primary Health Care research, like all fields of collective human endeavour, is eminently in need of sound leadership, and the same principles that facilitate sound leadership in other fields are likely to be relevant to research teams.
Thou shalt not take sides: Cognition, Logic and the need for changing how we believe
NASA Astrophysics Data System (ADS)
Martins, Andre
2016-03-01
We believe in many different ways. One very common way is by supporting ideas we like: we label them correct and we act to dismiss doubts about them. We take sides about ideas and theories as if that were the right thing to do. And yet, from a rational point of view, this type of support and belief is not justifiable. The best we can hope for when describing the real world, as far as we know today, is probabilistic knowledge. In practice, estimating a real probability can be too hard to achieve, but that just means we have more uncertainty, not less. There are ideas we defend that define, in our minds, our own identity. And recent experiments have been showing that we stop being able to analyze competently those propositions we hold so dearly. In this paper, I gather the evidence we have about taking sides and present the obvious but unseen conclusion that these facts, combined, mean that we should never believe in anything about the real world except in a probabilistic way. We must never take sides, since taking sides compromises our ability to seek the most correct description of the world. That means we need to start reformulating the way we debate ideas, from our teaching to our political debates. Here, I will show the logical and experimental basis of this conclusion. I will also show, by presenting new models for the evolution of opinions, that our desire to have something to believe in is probably behind the emergence of extremism in debates. And we will see how this problem can even have an impact on the reliability of whole scientific fields. The crisis around p-values is discussed and much better understood in the light of this paper's results. Finally, I will debate possible consequences and ideas on how to deal with this problem.
Preservation of propositional speech in a pure anomic: the importance of an abstract vocabulary.
Crutch, Sebastian J; Warrington, Elizabeth K
2003-12-01
We describe a detailed quantitative analysis of the propositional speech of a patient, FAV, who became severely anomic following a left occipito-temporal infarction. FAV showed a selective noun retrieval deficit in naming to confrontation and from verbal description. Nonetheless, his propositional speech was fluent and content-rich. To quantify this observation, three picture-description tasks were designed to elicit spontaneous speech: pictures of professional occupations, real-world scenes and stylised object scenes. FAV's performance was compared and contrasted with that of five age- and sex-matched control subjects on a number of variables, including speech production rate, volume of output, pause frequency and duration, word frequency, word concreteness and diversity of vocabulary used. FAV's propositional speech fell within the range of normal control performance on the majority of measurements of quality, quantity and fluency. Only in the narrative tasks, which relied more heavily upon a concrete vocabulary, did FAV become less voluble and resort to summarising the scenes. This dissociation between virtually intact propositional speech and a severe naming deficit represents the purest case of anomia currently on record. We attribute this dissociation in part to the preservation of his ability to retrieve his abstract word vocabulary. Our account demonstrates that poor performance on standard naming tasks may be indicative of only a narrowly defined word retrieval deficit. However, we also propose the existence of a feedback circuit which guides sentence construction by providing information regarding lexical availability.
Off-line, built-in test techniques for VLSI circuits
NASA Technical Reports Server (NTRS)
Buehler, M. G.; Sievers, M. W.
1982-01-01
It is shown that the use of redundant on-chip circuitry improves the testability of an entire VLSI circuit. In the study described here, five techniques applied to a two-bit ripple carry adder are compared. The techniques considered are self-oscillation, self-comparison, partition, scan path, and built-in logic block observer. It is noted that both classical stuck-at faults and nonclassical faults, such as bridging faults (shorts), stuck-on x faults where x may be 0, 1, or vary between the two, and parasitic flip-flop faults occur in IC structures. To simplify the analysis of the testing techniques, however, a stuck-at fault model is assumed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamble, John King; Nielsen, Erik; Baczewski, Andrew David
This paper describes our work over the past few years to use tools from quantum chemistry to describe the electronic structure of nanoelectronic devices. These devices, dubbed "artificial atoms", comprise a few electrons, confined by semiconductor heterostructures, impurities, and patterned electrodes, and are of intense interest due to potential applications in quantum information processing, quantum sensing, and extreme-scale classical logic. We detail two approaches we have employed: finite-element and Gaussian basis sets, exploring the interesting complications that arise when techniques that were intended to apply to atomic systems are instead used for artificial, solid-state devices.
Space Shuttle Stiffener Ring Foam Failure Analysis, a Non-Conventional Approach
NASA Technical Reports Server (NTRS)
Howard, Philip M.
2015-01-01
The Space Shuttle Program made use of the excellent properties of rigid polyurethane foam for cryogenic tank insulation and as structural protection on the solid rocket boosters. When foam applications de-bonded, classical methods of failure analysis did not provide the root cause of the foam failure. Realizing that foam is the ideal medium to document and preserve its own mode of failure, thin sectioning was seen as a logical approach for foam failure analysis, allowing the three-dimensional morphology of the foam cells to be observed. The foam cell morphology provided a much greater understanding of the failure modes than previously achieved.
Hicks, T; Biedermann, A; de Koeijer, J A; Taroni, F; Champod, C; Evett, I W
2015-12-01
The value of forensic results crucially depends on the propositions and the information under which they are evaluated. For example, if a full single DNA profile for a contemporary marker system matching the profile of Mr A is assessed, given the propositions that the DNA came from Mr A and given it came from an unknown person, the strength of evidence can be overwhelming (e.g., in the order of a billion). In contrast, if we assess the same result given that the DNA came from Mr A and given it came from his twin brother (i.e., a person with the same DNA profile), the strength of evidence will be 1, and therefore neutral, unhelpful and irrelevant [1] to the case at hand. While this understanding is probably uncontroversial and obvious to most, if not all, practitioners dealing with DNA evidence, the practical precept of not specifying an alternative source with the same characteristics as the one considered under the first proposition may be much less clear in other circumstances. During discussions with colleagues and trainees, cases have come to our attention where forensic scientists have difficulty with the formulation of propositions. It is particularly common to observe that results (e.g., observations) are included in the propositions, whereas--as argued throughout this note--they should not be. A typical example could be a case where a shoe-mark with a logo and the general pattern characteristics of a Nike Air Jordan shoe is found at the scene of a crime. A Nike Air Jordan shoe is then seized at Mr A's house and control prints of this shoe compared to the mark. The results (e.g., a trace with this general pattern and acquired characteristics corresponding to the sole of Mr A's shoe) are then evaluated given the propositions 'The mark was left by Mr A's Nike Air Jordan shoe-sole' and 'The mark was left by an unknown Nike Air Jordan shoe'.
As a consequence, the footwear examiner will not evaluate part of the observations (i.e., that the mark presents the general pattern of a Nike Air Jordan), whereas these can be highly informative. Such examples can be found in all forensic disciplines. In this article, we present a few such examples and discuss aspects that will help forensic scientists with the formulation of propositions. In particular, we emphasise the usefulness of notation to distinguish results that forensic scientists should evaluate from case information that the Court will evaluate. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
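The hierarchy-of-propositions point above can be made concrete with a minimal numerical sketch (the probabilities are illustrative placeholders, not figures from the article):

```python
# Strength of evidence as a likelihood ratio: P(results | H1) / P(results | H2).
# All numbers below are invented for illustration.

def likelihood_ratio(p_given_h1: float, p_given_h2: float) -> float:
    """Value of the results given two competing propositions."""
    return p_given_h1 / p_given_h2

# Matching DNA profile; alternative proposition: an unknown person is the
# source (assume a random-match probability of one in a billion).
lr_unknown = likelihood_ratio(1.0, 1e-9)   # overwhelming support

# Same result; alternative proposition: the twin brother (identical profile).
lr_twin = likelihood_ratio(1.0, 1.0)       # 1: neutral and unhelpful
```

Choosing an alternative proposition with the same characteristics as the first drives the ratio to 1, which is why the framing of propositions, not the results alone, determines the value of the evidence.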
NASA Technical Reports Server (NTRS)
Kosko, Bart
1991-01-01
Mappings between fuzzy cubes are discussed. This level of abstraction provides a surprising and fruitful alternative to the propositional and predicate-calculus reasoning techniques used in expert systems. It allows one to reason with sets instead of propositions. Discussed here are fuzzy and neural function estimators, neural vs. fuzzy representation of structured knowledge, fuzzy vector-matrix multiplication, and fuzzy associative memory (FAM) system architecture.
ERIC Educational Resources Information Center
Levy, Mickey D.
1979-01-01
Reports on a socioeconomic analysis of voter behavior on California's Proposition 13 and compares those results with voting on Proposition 1, a 1973 initiative in which the voters rejected a constitutional amendment that would have limited state taxes and expenditures to a percentage of California's net product. Available from NTA-TIA, 21 East…
ERIC Educational Resources Information Center
Zabel, Jeffrey
2014-01-01
I investigate a possible unintended consequence of Proposition 2½ override behavior--that it led to increased segregation in school districts in Massachusetts. This can occur because richer, low-minority towns tend to have more successful override votes that attract similar households with relatively high demands for public services who can afford…
Floyd F. Myron; Kimberly J. Shinew
1999-01-01
Drawing upon structural theory and social group perspectives, this study examined two propositions developed to explain the relationship between interracial contact and leisure preferences among African Americans and Whites. The first proposition stated that as interracial contact increases, so does the probability of observing similarity in the leisure...
ERIC Educational Resources Information Center
Revilla, Anita Tijerina; Asato, Jolynn
2002-01-01
Explored the relationship between race and language as related to bilingual students' educational experiences. Used Latino/a critical theory, Asian American legal scholarship, and critical race theory as frameworks to examine the aftermath of California's Proposition 227. Data from teachers and administrators highlighted significant variance in…
In Search of the Next Value Proposition
ERIC Educational Resources Information Center
Huwe, Terence K.
2012-01-01
Although it is pretty easy to find colleagues who will express fatigue or frustration about the constant need for libraries to prove their value proposition, there is also an upside to the exercise of crafting a message that justifies librarians' mission. The catch is that however good their crafted message may be, they must forget about ever…
The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.
ERIC Educational Resources Information Center
Bontis, Nick; Chung, Honsan
2000-01-01
Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…
The Future of Nuclear Archaeology: Reducing Legacy Risks of Weapons Fissile Material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Thomas W.; Reid, Bruce D.; Toomey, Christopher M.
2014-01-01
This report describes the value proposition for a "nuclear archeological" technical capability and applications program, targeted at resolving uncertainties regarding fissile materials production and use. At its heart, this proposition is that we can never be sure that all fissile material is adequately secure without a clear idea of what "all" means, and that uncertainty in this matter carries risk. We argue that this proposition is as valid today, under emerging state and possible non-state nuclear threats, as it was in an immediate post-Cold-War context, and describe how nuclear archeological methods can be used to verify fissile materials declarations, or estimate and characterize historical fissile materials production independently of declarations.
Domain-specific reasoning: social contracts, cheating, and perspective change.
Gigerenzer, G; Hug, K
1992-05-01
What counts as human rationality: reasoning processes that embody content-independent formal theories, such as propositional logic, or reasoning processes that are well designed for solving important adaptive problems? Most theories of human reasoning have been based on content-independent formal rationality, whereas adaptive reasoning, ecological or evolutionary, has been little explored. We elaborate and test an evolutionary approach, Cosmides' (1989) social contract theory, using the Wason selection task. In the first part, we disentangle the theoretical concept of a "social contract" from that of a "cheater-detection algorithm". We demonstrate that the fact that a rule is perceived as a social contract--or a conditional permission or obligation, as Cheng and Holyoak (1985) proposed--is not sufficient to elicit Cosmides' striking results, which we replicated. The crucial issue is not semantic (the meaning of the rule), but pragmatic: whether a person is cued into the perspective of a party who can be cheated. In the second part, we distinguish between social contracts with bilateral and unilateral cheating options. Perspective change in contracts with bilateral cheating options turns P & not-Q responses into not-P & Q responses. The results strongly support social contract theory, contradict availability theory, and cannot be accounted for by pragmatic reasoning schema theory, which lacks the pragmatic concepts of perspectives and cheating detection.
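As a baseline for the selection task discussed above, the purely formal analysis of the rule "if P then Q" can be sketched as follows (a toy illustration; the study's point is precisely that perspective and cheating cues, not this logic alone, drive people's actual responses):

```python
# Toy propositional-logic analysis of the Wason selection task: which cards
# must be turned to test "if P then Q"? (Standard textbook analysis, not the
# authors' experimental materials.)

# Each card shows one side; the hidden side is unknown (None).
cards = {
    "P":     (True, None),   # P visibly true, Q unknown
    "not-P": (False, None),
    "Q":     (None, True),
    "not-Q": (None, False),
}

def must_turn(p_side, q_side):
    # A card needs turning iff its hidden side could complete the only
    # falsifying case: P true and Q false.
    return p_side is True or q_side is False

print([name for name, (p, q) in cards.items() if must_turn(p, q)])
# → ['P', 'not-Q']: the logically correct selection
```

Perspective change in bilateral social contracts flips which cards look like potential "cheating" cases, producing the not-P & Q selections that this formal baseline alone cannot explain.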
Neurocognitive inefficacy of the strategy process.
Klein, Harold E; D'Esposito, Mark
2007-11-01
The most widely used (and taught) protocols for strategic analysis--Strengths, Weaknesses, Opportunities, and Threats (SWOT) and Porter's (1980) Five Force Framework for industry analysis--have been found to be insufficient as stimuli for strategy creation or even as a basis for further strategy development. We approach this problem from a neurocognitive perspective. We see profound incompatibilities between the cognitive process--deductive reasoning--channeled into the collective mind of strategists within the formal planning process through its tools of strategic analysis (i.e., rational technologies) and the essentially inductive reasoning process actually needed to address ill-defined, complex strategic situations. Thus, strategic analysis protocols that may appear to be and, indeed, are entirely rational and logical are not interpretable as such at the neuronal substrate level where thinking takes place. The analytical structure (or propositional representation) of these tools results in a mental dead end, the phenomenon known in cognitive psychology as functional fixedness. The difficulty lies with the inability of the brain to make out meaningful (i.e., strategy-provoking) stimuli from the mental images (or depictive representations) generated by strategic analysis tools. We propose decreasing dependence on these tools and conducting further research employing brain imaging technology to explore complex data handling protocols with richer mental representation and greater potential for strategy creation.
Pinker, Steven; Nowak, Martin A.; Lee, James J.
2008-01-01
When people speak, they often insinuate their intent indirectly rather than stating it as a bald proposition. Examples include sexual come-ons, veiled threats, polite requests, and concealed bribes. We propose a three-part theory of indirect speech, based on the idea that human communication involves a mixture of cooperation and conflict. First, indirect requests allow for plausible deniability, in which a cooperative listener can accept the request, but an uncooperative one cannot react adversarially to it. This intuition is supported by a game-theoretic model that predicts the costs and benefits to a speaker of direct and indirect requests. Second, language has two functions: to convey information and to negotiate the type of relationship holding between speaker and hearer (in particular, dominance, communality, or reciprocity). The emotional costs of a mismatch in the assumed relationship type can create a need for plausible deniability and, thereby, select for indirectness even when there are no tangible costs. Third, people perceive language as a digital medium, which allows a sentence to generate common knowledge, to propagate a message with high fidelity, and to serve as a reference point in coordination games. This feature makes an indirect request qualitatively different from a direct one even when the speaker and listener can infer each other's intentions with high confidence. PMID:18199841
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
2003-04-01
Why is there Something rather than Nothing? From Pythagoras ("everything is number") to Wheeler ("it from bit"), the theme of ultimate origin stresses the primordiality of the Ideal Platonic World (IPW) of mathematics. Even the popular "quantum tunnelling out of nothing" can specify "nothing" only as (essentially) the IPW. The IPW exists everywhere (but nowhere in particular) and logically precedes space, time, matter or any "physics" in any conceivable universe. This leads to the propositional conjecture (axiom?) that the (meta)physical "Platonic Pressure" of the infinitude of numbers acts as the engine for self-generation of the physical universe directly out of mathematics: cosmogenesis is driven by the very fact of IPW inexhaustibility. While physics in other quantum branches of the inflating universe (Megaverse) can be (arbitrarily) different from ours, number theory (and the rest of the IPW) is not: it is unique, absolute, immutable and infinitely resourceful. Let the (infinite) totality of microstates ("its") of the entire Megaverse form a countable set. Since countable sets are hierarchically inexhaustible (Cantor's "fractal branching"), each single "it" still has an infinite tail of non-overlapping IPW-based "personal labels". Thus, each "bit" ("it") is infinitely and uniquely resourceful: a possible venue for eliminating the ergodicity basis of the eternal-return cosmological argument. Physics (in any subuniverse) may be limited only by inherent impossibilities residing in the IPW; e.g., the insolvability of the Continuum Problem may be the IPW foundation of quantum indeterminacy.
The Livingstone Model of a Main Propulsion System
NASA Technical Reports Server (NTRS)
Bajwa, Anupa; Sweet, Adam; Korsmeyer, David (Technical Monitor)
2003-01-01
Livingstone is a discrete, propositional logic-based inference engine that has been used for diagnosis of physical systems. We present a component-based model of a Main Propulsion System (MPS) and describe how it is used with Livingstone (L2) in order to implement a diagnostic system for integrated vehicle health management (IVHM) for the Propulsion IVHM Technology Experiment (PITEX). We start by discussing the process of conceptualizing such a model. We describe graphical tools that facilitated the generation of the model. The model is composed of components (which map onto physical components), connections between components, and constraints. A component is specified by variables, with a set of discrete, qualitative values for each variable in its local nominal and failure modes. For each mode, the model specifies the component's behavior and transitions. We describe the MPS components' nominal and fault modes and associated Livingstone variables and data structures. Given this model, and observed external commands and observations from the system, Livingstone tracks the state of the MPS over discrete time-steps by choosing trajectories that are consistent with observations. We briefly discuss how the compiled model fits into the overall PITEX architecture. Finally we summarize our modeling experience, discuss advantages and disadvantages of our approach, and suggest enhancements to the modeling process.
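The mode-based, qualitative modelling style described above can be sketched roughly as follows (a hypothetical valve component with invented behaviours; the actual MPS model and the L2 engine's consistency-based search are far richer):

```python
# Minimal sketch of Livingstone-style mode tracking for one hypothetical
# valve component (not the actual PITEX/MPS model or the L2 API).

# Each mode constrains qualitative variables: given the command, which
# (inflow, outflow) observations are consistent with that mode?
MODES = {
    "nominal":      lambda cmd, inflow, outflow: outflow == (inflow if cmd == "open" else "zero"),
    "stuck-closed": lambda cmd, inflow, outflow: outflow == "zero",
    "stuck-open":   lambda cmd, inflow, outflow: outflow == inflow,
}

def consistent_modes(cmd, inflow, outflow):
    """Return the modes whose behaviour model is consistent with the observations."""
    return [m for m, behaviour in MODES.items() if behaviour(cmd, inflow, outflow)]

# Commanded open but no flow observed: the nominal mode is ruled out.
print(consistent_modes("open", "high", "zero"))   # → ['stuck-closed']
```

Livingstone generalizes this idea: it searches for mode assignments across all components and time-steps that keep the whole propositional model consistent with the commands and observations.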
The Quality Teacher and Education Act in San Francisco: Lessons Learned. Policy Brief 09-2
ERIC Educational Resources Information Center
Hough, Heather J.
2009-01-01
This policy brief reviews the recent experience of the San Francisco Unified School District (SFUSD) with the development and approval of Proposition A. Proposition A (also known as the Quality Teacher and Education Act, or QTEA) included a parcel tax mainly dedicated to increasing teachers' salaries, along with a variety of measures introducing…
ERIC Educational Resources Information Center
Smolík, Filip; Stepankova, Hana; Vyhnálek, Martin; Nikolai, Tomáš; Horáková, Karolína; Matejka, Štepán
2016-01-01
Purpose Propositional density (PD) is a measure of content richness in language production that declines in normal aging and more profoundly in dementia. The present study aimed to develop a PD scoring system for Czech and use it to compare PD in language productions of older people with amnestic mild cognitive impairment (aMCI) and control…
ERIC Educational Resources Information Center
Valliani, Nadia
2015-01-01
In 1996, California voters approved Proposition 209--a ban on the consideration of race in the college admissions process at public universities. This policy brief examines the effects of Proposition 209 at the University of California system by analyzing twenty years' of application, admission, and enrollment data. The brief concludes that in…
Why here and not there: The conditional nature of recreation choice
Roger N. Clark; Kent B. Downing
1985-01-01
This paper reports results of several studies to identify the state of the art and direction of research on how recreationists make choices. Findings from the studies have been combined into a list of propositions; the propositions can be considered hypotheses from which future studies can be developed or the effect of management activities on choices can be evaluated...
The Tantric Proposition in Leadership Education: You Make Me Feel Like a Natural Woman.
ERIC Educational Resources Information Center
Wislocki-Goin, Marsha
This paper argues that leadership in higher education should be open to a female leadership model expressed in an Eastern "Tantric" framework. Suggesting that a male leadership model that oppresses and excludes women has been in effect for the past millennium, the proposed Tantric proposition is a step toward a shared model of leadership which will be…
ERIC Educational Resources Information Center
Pirnay-Dummer, Pablo
2015-01-01
A local semantic trace is a certain quasi-propositional structure that can still be reconstructed from written content that is incomplete or does not follow a proper grammar. It can also retrace bits of knowledge from text containing only very few words, making the microstructure of these artifacts of knowledge externalization available for…
What Is yet to Come? Three Propositions on the Future of Educational Research as a Common Good
ERIC Educational Resources Information Center
Decuypere, Mathias
2015-01-01
This paper offers some explorative notes accompanying the issues I addressed in the journal's moot, which took place at the ECER 2014 conference (Porto, September 1-5). The notes that follow are explicitly written through the eyes of an emerging researcher, and offer three propositions regarding the future of educational research. These three…
Propositions Toward the Survival of a Self-Endangered Species: The Fundamental Question.
ERIC Educational Resources Information Center
Brandwein, Paul F.
The author considers the search for a better, pollution-free environment and the political processes by which various groups attempt to influence or control the actions of others in the context. He defines the goal of conservation as a recognition of the interdependence of man and his environment and lists 13 propositions which indicate the urgent…
Predictors of short-term treatment outcomes among California's Proposition 36 participants.
Hser, Yih-Ing; Evans, Elizabeth; Teruya, Cheryl; Huang, David; Anglin, M Douglas
2007-05-01
California's voter-initiated Proposition 36 offers non-violent drug offenders community-based treatment as an alternative to incarceration or probation without treatment. This article reports short-term treatment outcomes subsequent to this major shift in drug policy. Data are from 1104 individuals randomly selected from all Proposition 36 participants assessed for treatment in five California counties during 2004. The overall study sample was 30% female, 51% white, 18% Black, 24% Hispanic, and 7% other racial/ethnic groups. The mean ± SD age was 37 ± 10 years. Counties varied considerably in participant characteristics, treatment service intensity, treatment duration, urine testing, and employment and recidivism outcomes, but not in drug use at 3-month follow-up. Controlling for county, logistic regression analysis showed that drug abstinence was predicted by gender (female), employment at baseline (full or part-time), residential (vs. outpatient) stay, low psychiatric severity, frequent urine testing by treatment facility, and more days in treatment. Recidivism was predicted only by shorter treatment duration. Employment predictors included age (younger), gender (male), baseline employment, and lower psychiatric severity. The study findings support drug testing to monitor abstinence and highlight the need to address employment and psychiatric problems among Proposition 36 participants.
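The logistic-regression analysis reported above has the following general shape (a sketch with invented coefficients whose signs merely follow the reported directions of the predictors; the study's actual estimates are not given in the abstract):

```python
# Hypothetical logistic model of drug abstinence at follow-up. Coefficients
# are placeholders chosen only to match the reported directions: female,
# employed, residential stay, low psychiatric severity, frequent urine
# testing, and longer treatment all raise the predicted odds.
import math

def p_abstinent(female, employed, residential, psych_severity,
                urine_tests, days_in_treatment):
    # Linear predictor (log-odds), then logistic transform to a probability.
    z = (-2.0 + 0.4 * female + 0.5 * employed + 0.6 * residential
         - 0.8 * psych_severity + 0.05 * urine_tests
         + 0.01 * days_in_treatment)
    return 1 / (1 + math.exp(-z))
```

In the study the model is fit to the observed outcomes while controlling for county; the sketch only shows how the named predictors enter a logistic specification.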
Della, Lindsay J; Eroglu, Dogan; Bernhardt, Jay M; Edgerton, Erin; Nall, Janice
2008-01-01
Market trend data show that the media marketplace continues to rapidly evolve. Recent research shows that substantial portions of the U.S. media population are "new media" users. Today, more than ever before, media consumers are exposed to multiple media at the same point in time, encouraged to participate in media content generation, and challenged to learn, access, and use the new media that are continually entering the market. These media trends have strong implications for how consumers of health information access, process, and retain health-related knowledge. In this article we review traditional information processing models and theories of interpersonal and mass media access and consumption. We make several theory-based propositions for how traditional information processing and media consumption concepts will function as new media usage continues to increase. These propositions are supported by new media usage data from the Centers for Disease Control and Prevention's entry into the new media market (e.g., podcasting, virtual events, blogging, and webinars). Based on these propositions, we conclude by presenting both opportunities and challenges that public health communicators and marketers will face in the future.
Evaluating forensic biology results given source level propositions.
Taylor, Duncan; Abarno, Damien; Hicks, Tacha; Champod, Christophe
2016-03-01
The evaluation of forensic evidence can occur at any level within the hierarchy of propositions depending on the question being asked and the amount and type of information that is taken into account within the evaluation. Commonly DNA evidence is reported given propositions that deal with the sub-source level in the hierarchy, which deals only with the possibility that a nominated individual is a source of DNA in a trace (or contributor to the DNA in the case of a mixed DNA trace). We explore the use of information obtained from examinations, presumptive and discriminating tests for body fluids, DNA concentrations and some case circumstances within a Bayesian network in order to provide assistance to the Courts that have to consider propositions at source level. We use a scenario in which the presence of blood is of interest as an exemplar and consider how DNA profiling results and the potential for laboratory error can be taken into account. We finish with examples of how the results of these reports could be presented in court using either numerical values or verbal descriptions of the results. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
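A heavily simplified sketch of moving from the sub-source to the source level (invented probabilities, and a naive conditional-independence assumption that a real Bayesian network would replace with an explicit dependency structure):

```python
# Combine a sub-source DNA likelihood ratio with a body-fluid test result to
# address source-level propositions ("the blood came from Mr A" vs. "the
# blood came from an unknown person"). All numbers are illustrative only.

def source_level_lr(lr_dna_subsource, p_blood_given_source, p_blood_given_not_source):
    # Naive combination assuming the body-fluid result is independent of the
    # DNA result under each proposition; a Bayesian network relaxes this.
    return lr_dna_subsource * (p_blood_given_source / p_blood_given_not_source)

# DNA strongly supports the nominated donor at sub-source level (LR 1e6);
# the presumptive blood test is far more probable if the trace is blood.
print(source_level_lr(1e6, 0.95, 0.05))  # ≈ 1.9e7
```

The article's Bayesian network additionally folds in examination findings, DNA concentration, laboratory error, and case circumstances, which this two-factor product cannot capture.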
NASA Astrophysics Data System (ADS)
Fatiha, M.; Rahmat, A.; Solihat, R.
2017-09-01
Concepts in Biology are often presented through diagrams to help students understand the material. One way of assessing students' understanding of a diagram is to examine the causal relationships they construct, expressed in the form of a propositional network representation. This study describes the patterns of propositional network representation that students produce when confronted with a conventional diagram. This descriptive study involved 32 students at a senior high school in Bandung. Data were collected using a diagram-based worksheet developed according to information-processing standards. Three patterns of propositional network representation emerged: linear relationships, simple reciprocal relationships, and complex reciprocal relationships. The dominant pattern was the linear form, in which information components in the diagram are simply connected in sequence (59.4% of students); 28.1% of students produced simple reciprocal relationships, only 3.1% produced complex reciprocal relationships, and the remaining 9.4% failed to connect the information components at all. These results indicate that most students were only able to connect the information components of the diagram linearly, and that few constructed reciprocal relationships between them.
Quantum Error Correction for Minor Embedded Quantum Annealing
NASA Astrophysics Data System (ADS)
Vinci, Walter; Paz Silva, Gerardo; Mishra, Anurag; Albash, Tameem; Lidar, Daniel
2015-03-01
While quantum annealing can take advantage of the intrinsic robustness of adiabatic dynamics, some form of quantum error correction (QEC) is necessary in order to preserve its advantages over classical computation. Moreover, realistic quantum annealers are subject to a restricted connectivity between qubits. Minor embedding techniques use several physical qubits to represent a single logical qubit with a larger set of interactions, but necessarily introduce new types of errors (whenever the physical qubits corresponding to the same logical qubit disagree). We present a QEC scheme where a minor embedding is used to generate an 8 × 8 × 2 cubic connectivity out of the native one and perform experiments on a D-Wave quantum annealer. Using a combination of optimized encoding and decoding techniques, our scheme enables the D-Wave device to solve minor embedded hard instances at least as well as it would on a native implementation. Our work is a proof-of-concept that minor embedding can be advantageously implemented in order to increase both the robustness and the connectivity of a programmable quantum annealer. Applied in conjunction with decoding techniques, this paves the way toward scalable quantum annealing with applications to hard optimization problems.
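A minimal sketch of one decoding idea for minor embedding: when several physical qubits represent one logical qubit, a disagreement among them is an embedding error that can be repaired by majority vote. This is an illustrative decoder under simplifying assumptions, not the authors' optimized scheme.

```python
# Illustrative majority-vote decoder for minor-embedded annealing samples.
# Each logical spin (+1/-1) is represented by several physical spins; a
# disagreement among the copies is decoded by majority vote (assumption:
# ties break to +1, a convention chosen here for definiteness).

def majority_vote(physical_spins):
    """Decode one logical spin from its physical copies."""
    total = sum(physical_spins)
    if total > 0:
        return +1
    if total < 0:
        return -1
    return +1  # tie-breaking convention (an assumption of this sketch)

def decode(sample, qubits_per_logical):
    """Split a flat list of physical spins into decoded logical spins."""
    k = qubits_per_logical
    return [majority_vote(sample[i:i + k]) for i in range(0, len(sample), k)]

# Three physical qubits per logical qubit; one physical copy has flipped.
print(decode([+1, +1, -1, -1, -1, -1], qubits_per_logical=3))
```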
Frequency-Dependent Enhancement of Fluid Intelligence Induced by Transcranial Oscillatory Potentials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santarnecchi, Emiliano; Polizzotto, Nicola Riccardo; Godone, Marco
Everyday problem solving requires the ability to go beyond experience by efficiently encoding and manipulating new information, i.e., fluid intelligence (Gf) [1]. Performance in tasks involving Gf, such as logical and abstract reasoning, has been shown to rely on distributed neural networks, with a crucial role played by prefrontal regions [2]. Synchronization of neuronal activity in the gamma band is a ubiquitous phenomenon within the brain; however, no evidence of its causal involvement in cognition exists to date [3]. Here, we show an enhancement of Gf ability in a cognitive task induced by exogenous rhythmic stimulation within the gamma band. Imperceptible alternating current [4] delivered through the scalp over the left middle frontal gyrus resulted in a frequency-specific shortening of the time required to find the correct solution in a visuospatial abstract reasoning task classically employed to measure Gf abilities (i.e., Raven’s matrices) [5]. Crucially, gamma-band stimulation (γ-tACS) selectively enhanced performance only on more complex trials involving conditional/logical reasoning. The finding presented here supports a direct involvement of gamma oscillatory activity in the mechanisms underlying higher-order human cognition.
Magnetic induction of hyperthermia by a modified self-learning fuzzy temperature controller
NASA Astrophysics Data System (ADS)
Wang, Wei-Cheng; Tai, Cheng-Chi
2017-07-01
The aim of this study was to develop a temperature controller for magnetic induction hyperthermia (MIH). A closed-loop controller was applied to track a reference model to guarantee a desired temperature response. The MIH system generated an alternating magnetic field to heat a material of high magnetic permeability. This wireless induction heating has few side effects when applied to cancer treatment. The effects of hyperthermia depend strongly on the precise control of temperature. However, during the treatment process, control performance is degraded by severe perturbations and parameter variations. In this study, a modified self-learning fuzzy logic controller (SLFLC) with a gain-tuning mechanism was implemented to obtain high control performance over a wide range of treatment situations. This was achieved by appropriately altering the output scaling factor of a fuzzy inverse model to adjust the control rules. The proposed SLFLC was compared to the classical self-tuning fuzzy logic controller and to fuzzy model reference learning control, and was verified by conducting in vitro experiments with porcine liver. The experimental results indicated that the proposed controller shows greater robustness and excellent adaptability with respect to the temperature control of the MIH system.
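The self-learning idea above can be sketched in a toy loop: the controller's output scaling factor is adjusted from the mismatch between the desired temperature and the measured temperature. The plant model, rules, and gains below are hypothetical, chosen only to make the mechanism concrete; they are not the paper's MIH system.

```python
# Toy sketch of a self-tuning fuzzy-style temperature loop (all dynamics,
# rules, and constants are hypothetical illustrations, not the paper's).

def fuzzy_rule(error):
    """Crude three-rule fuzzy mapping from temperature error to action."""
    if error > 1.0:
        return 1.0    # "too cold"  -> heat strongly
    if error < -1.0:
        return -1.0   # "too hot"   -> reduce heating
    return error      # "near setpoint" -> proportional action

def simulate(setpoint=43.0, start=37.0, steps=60):
    """Track a hyperthermia setpoint with a self-adjusted output gain."""
    temp, gain = start, 0.5
    for _ in range(steps):
        error = setpoint - temp
        # Self-learning step (assumption): grow the output scaling factor
        # while the response lags, shrink it gently near the setpoint.
        gain += 0.01 * error if abs(error) > 0.5 else -0.01 * gain
        # First-order plant with heat losses toward 37 °C (assumption).
        temp += gain * fuzzy_rule(error) - 0.05 * (temp - 37.0)
    return temp

print(round(simulate(), 1))  # settles near the 43 °C setpoint
```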
Quantum error correction in crossbar architectures
NASA Astrophysics Data System (ADS)
Helsen, Jonas; Steudtner, Mark; Veldhorst, Menno; Wehner, Stephanie
2018-07-01
A central challenge for the scaling of quantum computing systems is the need to control all qubits in the system without a large overhead. A solution for this problem in classical computing comes in the form of so-called crossbar architectures. Recently we made a proposal for a large-scale quantum processor (Li et al arXiv:1711.03807 (2017)) to be implemented in silicon quantum dots. This system features a crossbar control architecture which limits parallel single-qubit control, but allows the scheme to overcome control scaling issues that form a major hurdle to large-scale quantum computing systems. In this work, we develop a language that makes it possible to easily map quantum circuits to crossbar systems, taking into account their architecture and control limitations. Using this language we show how to map well known quantum error correction codes such as the planar surface and color codes in this limited control setting with only a small overhead in time. We analyze the logical error behavior of this surface code mapping for estimated experimental parameters of the crossbar system and conclude that logical error suppression to a level useful for real quantum computation is feasible.
NASA Astrophysics Data System (ADS)
Chen, Su Shing; Caulfield, H. John
1994-03-01
Adaptive Computing, as distinct from Classical Computing, is emerging as a field that represents the culmination of more than 40 years of work in various scientific and technological areas, including cybernetics, neural networks, pattern recognition networks, learning machines, self-reproducing automata, genetic algorithms, fuzzy logics, probabilistic logics, chaos, electronics, optics, and quantum devices. This volume of "Critical Reviews on Adaptive Computing: Mathematics, Electronics, and Optics" is intended as a synergistic approach to this emerging field. There are many researchers in these areas working on important results. However, we have not seen a general effort to summarize and synthesize these results in theory as well as in implementation. In order to reach a higher level of synergism, we propose Adaptive Computing as the field which comprises the above-mentioned computational paradigms and their various realizations. The field should include both the Theory (or Mathematics) and the Implementation. Our emphasis is on the interplay of Theory and Implementation. This interplay, an adaptive process itself, is the only "holistic" way to advance our understanding and realization of brain-like computation. We feel that a theory without implementation tends to become unrealistic and "out of touch" with reality, while an implementation without theory runs the risk of being superficial and obsolete.
Further results from PIXE analysis of inks in Galileo's notes on motion
NASA Astrophysics Data System (ADS)
Del Carmine, P.; Giuntini, L.; Hooper, W.; Lucarelli, F.; Mandò, P. A.
1996-06-01
We have recently analysed the inks in some of the folios of Vol. 72 of the Manoscritti galileiani, kept at the Biblioteca Nazionale Centrale di Firenze, which contains a collection of loose handwritten sheets with undated notes, experimental data, and propositions on the problems of motion from different periods of Galileo's life. This paper reports specific results obtained from the analysis of some of these propositions, which allowed us to contribute to their chronological attribution and therefore to the resolution of some historical controversies. Even in cases where "absolute" chronological attributions could not be made by comparison with dated documents, the PIXE results provided useful information to confirm or refute the hypothesis that different propositions were written in the same or in different periods.
NASA Astrophysics Data System (ADS)
Böhi, P.; Prevedel, R.; Jennewein, T.; Stefanov, A.; Tiefenbacher, F.; Zeilinger, A.
2007-12-01
In general, quantum computer architectures which are based on the dynamical evolution of quantum states also require the processing of classical information, obtained by measurements of the actual qubits that make up the computer. This classical processing involves fast, active adaptation of subsequent measurements and real-time error correction (feed-forward), so that quantum gates and algorithms can be executed in a deterministic and hence error-free fashion. This is also true in the linear optical regime, where the quantum information is stored in the polarization state of photons. The adaptation of the photon’s polarization can be achieved very quickly by employing electro-optical modulators (EOMs), which change the polarization of a passing photon upon application of a high voltage. In this paper we discuss techniques for implementing fast, active feed-forward at the single-photon level and present their application in the context of photonic quantum computing. This includes the working principles and characterization of the EOMs as well as a description of the switching logic, both of which allow quantum computation at an unprecedented speed.
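The feed-forward logic described above can be sketched abstractly: a measurement outcome on one photon determines, in real time, whether a Pauli-X correction (physically, an EOM-driven polarization flip) is applied to a subsequent photon. States are represented here as two-component amplitude vectors; the representation is a simplified illustration, not the experimental apparatus.

```python
# Illustrative sketch of feed-forward in photonic quantum computing:
# a classical measurement outcome conditions a Pauli-X correction.
# Polarization states are 2-component vectors [horizontal, vertical].

def apply_x(state):
    """Pauli-X: swap the horizontal/vertical polarization amplitudes."""
    a, b = state
    return [b, a]

def feed_forward(measurement_outcome, state):
    """Switch the corrective element (apply X) only when the classical
    outcome demands a correction; otherwise pass the state unchanged."""
    return apply_x(state) if measurement_outcome == 1 else state

print(feed_forward(1, [1.0, 0.0]))  # outcome 1 -> corrected to [0.0, 1.0]
print(feed_forward(0, [1.0, 0.0]))  # outcome 0 -> left unchanged
```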
NASA Astrophysics Data System (ADS)
Ellerman, David
2014-03-01
In models of QM over finite fields (e.g., Schumacher's ``modal quantum theory'' MQT), one finite field stands out, Z2, since Z2 vectors represent sets. QM (finite-dimensional) mathematics can be transported to sets, resulting in quantum mechanics over sets, or QM/sets. This gives a full probability calculus (unlike MQT, with only zero-one modalities) that leads to a full theory of QM/sets including ``logical'' models of the double-slit experiment, Bell's Theorem, QIT, and QC. In QC over Z2 (where gates are non-singular matrices, as in MQT), a simple quantum algorithm (one gate plus one function evaluation) solves the Parity SAT problem (finding the parity of the sum of all values of an n-ary Boolean function). Classically, the Parity SAT problem requires 2^n function evaluations, in contrast to the one function evaluation required in the quantum algorithm. This is quantum speedup, but with all the calculations over Z2, just like classical computing. This shows definitively that the source of quantum speedup is not in the greater power of computing over the complex numbers, and confirms the idea that the source is in superposition.
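The classical side of the comparison can be made concrete: computing the parity of the sum of all values of an n-ary Boolean function by brute force requires all 2^n evaluations, which is exactly what the one-evaluation quantum algorithm over Z2 avoids. The example function below is an arbitrary illustration.

```python
# Classical brute-force Parity SAT: the parity (in Z2) of the sum of an
# n-ary Boolean function over all 2**n inputs, requiring 2**n evaluations.
from itertools import product

def parity_sat(f, n):
    """Parity of the sum of f over all 2**n Boolean inputs."""
    total = 0
    for bits in product((0, 1), repeat=n):
        total ^= f(*bits)  # XOR accumulates the sum modulo 2
    return total

# Example (arbitrary choice): the 3-input majority function. It is 1 on
# exactly 4 of the 8 inputs, so the parity of the sum is 0.
maj = lambda a, b, c: (a & b) | (b & c) | (a & c)
print(parity_sat(maj, 3))  # 0
```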
ERIC Educational Resources Information Center
Leadership Education for Asian Pacifics (LEAP) Asian Pacific American Policy Inst.
Proposition 209 is a statewide constitutional amendment initiative in California, which, if passed in November 1996, will eliminate all statewide affirmative action programs. It is argued that, contrary to its title, this amendment is an extreme and unnecessary measure that will actually undermine further advances in civil rights. There are…
Mi-Hyun Park; Michael Stenstrom; Stephanie Pincetl
2009-01-01
This article evaluates the implementation of Proposition O, a stormwater cleanup measure, in Los Angeles, California. The measure was intended to create new funding to help the city comply with the Total Maximum Daily Load requirements under the federal Clean Water Act. Funding water quality objectives through a bond measure was necessary because the city had...
The Linguistic Discourse Model: Towards a Formal Theory of Discourse Structure.
1986-11-01
storyteller should encode propositions with scope outside the storyworld before propositions with scope exclusively internal to the storyworld... recovered from storytelling disorder. It is important to point out that treating all disruptions uniformly as embedded relative to the narrative main... therefore, to a brief presentation of one reasonably pervasive storytelling deviation phenomenon, the True Start, analyzed informally elsewhere. [48] [55]
Education and Tax Limitations: Evidence from Massachusetts' Proposition 2 1/2.
ERIC Educational Resources Information Center
Ladd, Helen F.; Wilson, Julie Boatright
This paper uses survey data collected during the 2 weeks following the November 4, 1980, election to answer questions concerning how local public education should be funded in the wake of the passing of Proposition 2 1/2, a measure that requires high tax rate cities and towns to reduce property tax levies by at least 15 percent per year until they…
Quantum key distribution without the wavefunction
NASA Astrophysics Data System (ADS)
Niestegge, Gerd
A well-known feature of quantum mechanics is the secure exchange of secret bit strings which can then be used as keys to encrypt messages transmitted over any classical communication channel. It is demonstrated that this quantum key distribution allows a much more general and abstract access than commonly thought. The results include some generalizations of the Hilbert space version of quantum key distribution, but are based upon a general nonclassical extension of conditional probability. A special state-independent conditional probability is identified as origin of the superior security of quantum key distribution; this is a purely algebraic property of the quantum logic and represents the transition probability between the outcomes of two consecutive quantum measurements.
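The abstract generalizes the standard Hilbert-space key distribution protocols; below is a classical simulation of the basis-sifting step of a BB84-style protocol as one concrete instance of the scheme being generalized. No eavesdropper or error estimation is modelled, and the protocol details are a simplified illustration.

```python
# Simplified classical simulation of BB84-style sifting: in rounds where
# Alice's and Bob's randomly chosen bases agree (and absent disturbance),
# Bob's result equals Alice's bit; mismatched-basis rounds are discarded.
import random

def sift(n, seed=7):
    rng = random.Random(seed)  # fixed seed for reproducibility
    alice_bits = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases = [rng.randint(0, 1) for _ in range(n)]
    key = [bit for bit, ba, bb in zip(alice_bits, alice_bases, bob_bases)
           if ba == bb]
    return key

key = sift(16)
print(len(key), key)  # roughly half the rounds survive sifting
```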
Java PathExplorer: A Runtime Verification Tool
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2001-01-01
We describe recent work on designing an environment called Java PathExplorer for monitoring the execution of Java programs. This environment facilitates the testing of execution traces against high level specifications, including temporal logic formulae. In addition, it contains algorithms for detecting classical error patterns in concurrent programs, such as deadlocks and data races. An initial prototype of the tool has been applied to the executive module of the planetary Rover K9, developed at NASA Ames. In this paper we describe the background and motivation for the development of this tool, including comments on how it relates to formal methods tools as well as to traditional testing, and we then present the tool itself.
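The kind of trace checking described above can be illustrated with a tiny monitor for a temporal property such as "every request is eventually followed by a response" over a finite execution trace. The event names and the monitor itself are illustrative; they are not JPaX's actual specification language or implementation.

```python
# Minimal runtime-verification sketch: check a response-follows-request
# temporal property over a recorded execution trace (finite-trace view).

def eventually_follows(trace, trigger, required):
    """True if every `trigger` event is later matched by a `required` event."""
    pending = 0
    for event in trace:
        if event == trigger:
            pending += 1
        elif event == required and pending > 0:
            pending -= 1
    return pending == 0  # no trigger left unanswered at end of trace

good = ["request", "work", "response", "request", "response"]
bad = ["request", "response", "request", "work"]
print(eventually_follows(good, "request", "response"))  # True
print(eventually_follows(bad, "request", "response"))   # False
```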
Splash, pop, sizzle: Information processing with phononic computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sklan, Sophia R.
2015-05-15
Phonons, the quanta of mechanical vibration, are important to the transport of heat and sound in solid materials. Recent advances in the fundamental control of phonons (phononics) have brought into prominence the potential role of phonons in information processing. In this review, the many directions of realizing phononic computing and information processing are examined. Given the relative similarity of vibrational transport at different length scales, the related fields of acoustic, phononic, and thermal information processing are all included, as are quantum and classical computer implementations. Connections are made between the fundamental questions in phonon transport and phononic control and the device-level approach to diodes, transistors, memory, and logic.
Experimental quantum information processing with the Talbot effect
NASA Astrophysics Data System (ADS)
Sawada, K.; Walborn, S. P.
2018-07-01
We report a proof-of-concept experiment illustrating the implementation of several simple quantum logic gates on D-level quantum systems (quDits) using the Talbot effect. A number of quDit states are encoded into the transverse profile of a paraxial laser beam using a spatial light modulator. These states are transformed through a diagonal phase element and then free propagation via the fractional Talbot effect, demonstrating the realization of some well-known single-quDit gates in quantum computation. Our classical optics experiment allows us to identify several important technical details, and serves as a first experimental step toward performing D-dimensional quantum operations with single photons or other quantum systems using this scheme.
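The diagonal phase element mentioned above corresponds mathematically to a diagonal unitary acting on a D-level state vector. The sketch below applies such a gate to a uniform superposition; the choice D = 4 and the Z-like phase pattern are illustrative assumptions, and a plain Python list stands in for the transverse beam profile.

```python
# Illustrative diagonal phase gate on a D-level (quDit) state vector.
import cmath

def diagonal_phase_gate(state, phases):
    """Multiply each amplitude by exp(i * phase_k): a diagonal unitary."""
    return [a * cmath.exp(1j * p) for a, p in zip(state, phases)]

# Uniform superposition over D = 4 levels (illustrative choice).
D = 4
state = [1 / D**0.5] * D
# Z-like gate: phase 2*pi*k/D on level k (an assumed phase pattern).
phases = [2 * cmath.pi * k / D for k in range(D)]
out = diagonal_phase_gate(state, phases)
# A diagonal unitary preserves the norm of the state.
print(round(sum(abs(a)**2 for a in out), 6))  # 1.0
```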
Space Shuttle Stiffener Ring Foam Failure, a Non-Conventional Approach
NASA Technical Reports Server (NTRS)
Howard, Philip M.
2007-01-01
The Space Shuttle makes use of the excellent properties of rigid polyurethane foam for cryogenic tank insulation and as structural protection on the solid rocket boosters. When foam applications debond, classical methods of analysis do not always reveal the root cause of the foam failure. Realizing that foam is the ideal medium to document and preserve its own mode of failure, thin sectioning was seen as a logical approach for foam failure analysis. Thin sectioning in two directions, both horizontal and vertical to the application, was chosen to observe the three-dimensional morphology of the foam cells. The foam cell morphology provided a much greater understanding of the failure modes than previously achieved.