Sample records for logical starting points

  1. Making Temporal Logic Calculational: A Tool for Unification and Discovery

    NASA Astrophysics Data System (ADS)

    Boute, Raymond

    In temporal logic, calculational proofs beyond simple cases are often seen as challenging. The situation is reversed by making temporal logic calculational, yielding shorter and clearer proofs than traditional ones and serving as a (mental) tool for unification and discovery. A side effect of unifying theories is easier access for practitioners. The starting point is a simple generic (software-tool-independent) Functional Temporal Calculus (FTC). Specific temporal logics are then captured via endosemantic functions. This concept reflects tacit conventions found throughout mathematics and, once identified, is general and useful. FTC also yields a reasoning style that helps in discovering theorems by calculation rather than just proving given facts. This is illustrated by deriving various theorems, most related to liveness issues in TLA+, and by finding strengthenings of known results. Educational issues are addressed in passing.

  2. Tech Support

    ERIC Educational Resources Information Center

    Hanstein, Andrea

    2013-01-01

    For nearly as long as students, teachers, and community members have been adopting social media--more than a decade--community college leaders have sought ways to harness the power of online applications to improve fundraising and advocacy efforts. Dedicated Facebook pages and Twitter feeds were among the most logical starting points--if a college…

  3. The utility of human ADME data for prioritizing the evaluation of pharmaceuticals in the environment.

    EPA Science Inventory

    To proceed in the investigation of potential effects of pharmaceuticals in the environment, a cohesive data collection strategy is paramount. Given the lack of data for aquatic species, prioritization seems a logical starting point. Several methods have been put forward, for exam...

  4. Grade School Philosophy: How Come and Where To?

    ERIC Educational Resources Information Center

    Lipman, Matthew

    The inclusion of philosophy as part of the elementary school curriculum is discussed in this paper. A definite trend toward specifically including ethics and logic offers a starting point for a philosophy course as part of the general curriculum or as a separate course of study. The author begins by presenting a general analysis of the recent…

  5. Problem Behaviour at Early Age--Basis for Prediction of Asocial Behaviour

    ERIC Educational Resources Information Center

    Krneta, Dragoljub; Ševic, Aleksandra

    2015-01-01

    This paper analyzes the results of the study of prevalence of problem behaviour of students in primary and secondary schools. The starting point is that it is methodologically and logically justified to look for early forms of problem behaviour of students, because it is likely that adult convicted offenders at an early school age manifested forms…

  6. Learning Computer Programming: Implementing a Fractal in a Turing Machine

    ERIC Educational Resources Information Center

    Pereira, Hernane B. de B.; Zebende, Gilney F.; Moret, Marcelo A.

    2010-01-01

    It is common to start a course on computer programming logic by teaching the algorithm concept from the point of view of natural languages, but in a schematic way. In this sense we note that students have difficulties in understanding and implementing the problems proposed by the teacher. The main idea of this paper is to show that the…

  7. Multi-input and binary reproducible, high bandwidth floating point adder in a collective network

    DOEpatents

    Chen, Dong; Eisley, Noel A.; Heidelberger, Philip; Steinmacher-Burow, Burkhard

    2016-11-15

    To add floating point numbers in a parallel computing system, a collective logic device receives the floating point numbers from computing nodes. The collective logic device converts the floating point numbers to integer numbers, adds the integer numbers, and generates a summation of the integer numbers. The collective logic device then converts the summation back to a floating point number. The collective logic device performs the receiving, the converting of the floating point numbers, the adding, the generating, and the converting of the summation in one pass. One pass indicates that the computing nodes send inputs only once to the collective logic device and receive outputs only once from the collective logic device.
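The conversion scheme described in the abstract can be sketched in a few lines: scale each float to a fixed-point integer, sum the integers (integer addition is exact and order-independent, which is what makes the result binary reproducible), then scale back. The scaling factor below is illustrative; the actual hardware chooses a shared exponent from the inputs.

```python
# Sketch of the reproducible reduction described above (hypothetical fixed
# scaling factor; the patented device derives a shared exponent at run time).
SCALE = 1 << 32  # fixed-point scale: 32 fractional bits

def reproducible_sum(values):
    """Sum floats via fixed-point integers so the result does not depend
    on the order in which contributions arrive."""
    total = 0  # Python ints are arbitrary precision, so no overflow
    for v in values:
        total += int(round(v * SCALE))  # float -> integer
    return total / SCALE                # exact integer sum -> float

inputs = [0.1, 0.2, 0.3]
# Naive float addition is order-dependent; the fixed-point sum is not.
assert reproducible_sum(inputs) == reproducible_sum(list(reversed(inputs)))
```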

  8. Multi-input and binary reproducible, high bandwidth floating point adder in a collective network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Dong; Eisley, Noel A; Heidelberger, Philip

    To add floating point numbers in a parallel computing system, a collective logic device receives the floating point numbers from computing nodes. The collective logic device converts the floating point numbers to integer numbers, adds the integer numbers, and generates a summation of the integer numbers. The collective logic device then converts the summation back to a floating point number. The collective logic device performs the receiving, the converting of the floating point numbers, the adding, the generating, and the converting of the summation in one pass. One pass indicates that the computing nodes send inputs only once to the collective logic device and receive outputs only once from the collective logic device.

  9. Numerical aerodynamic simulation facility preliminary study, volume 2 and appendices

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Data to support results obtained in technology assessment studies are presented. Objectives, starting points, and future study tasks are outlined. Key design issues discussed in appendices include: data allocation, transposition network design, fault tolerance and trustworthiness, logic design, processing element of existing components, number of processors, the host system, alternate data base memory designs, number representation, fast div 521 instruction, architectures, and lockstep array versus synchronizable array machine comparison.

  10. Dynamic Order Algebras as an Axiomatization of Modal and Tense Logics

    NASA Astrophysics Data System (ADS)

    Chajda, Ivan; Paseka, Jan

    2015-12-01

    The aim of the paper is to introduce and describe tense operators in every propositional logic which is axiomatized by means of an algebra whose underlying structure is a bounded poset or even a lattice. We introduce the operators G, H, P and F without regard to which propositional connectives the logic includes. For this we use the axiomatization of universal quantifiers as a starting point and modify these axioms for our purposes. First, we show that the operators can be recognized as modal operators, and we study the pairs (P, G) as so-called dynamic order pairs. Further, we obtain constructions of these operators in the corresponding algebra provided a time frame is given. Moreover, we solve the problem of finding a time frame in the case when the tense operators are given. In particular, any tense algebra is representable in its Dedekind-MacNeille completion. Our approach is fully general: we do not rely on the particular logic under consideration, and hence it is applicable in all cases known up to now.
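As a reminder of the standard tense-logic conventions the abstract builds on (a textbook presentation, not taken from the paper itself): G and H are the "always in the future/past" operators, F and P their existential duals, and a pair (P, G) forms an adjunction on the underlying order.

```latex
% Future/past duality of the tense operators
F\varphi \;\equiv\; \lnot G \lnot\varphi, \qquad
P\varphi \;\equiv\; \lnot H \lnot\varphi
% Adjunction characterizing a dynamic order pair (P, G) on a bounded poset:
P x \le y \iff x \le G y
```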

  11. The design of digital-adaptive controllers for VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Stengel, R. F.; Broussard, J. R.; Berry, P. W.

    1976-01-01

    Design procedures for VTOL automatic control systems have been developed and are presented. Using linear-optimal estimation and control techniques as a starting point, digital-adaptive control laws have been designed for the VALT Research Aircraft, a tandem-rotor helicopter which is equipped for fully automatic flight in terminal area operations. These control laws are designed to interface with velocity-command and attitude-command guidance logic, which could be used in short-haul VTOL operations. Developments reported here include new algorithms for designing non-zero-set-point digital regulators, design procedures for rate-limited systems, and algorithms for dynamic control trim setting.

  12. On Mathematical Proving

    NASA Astrophysics Data System (ADS)

    Stefaneas, Petros; Vandoulakis, Ioannis M.

    2015-12-01

    This paper outlines a logical representation of certain aspects of the process of mathematical proving that are important from the point of view of Artificial Intelligence. Our starting-point is the concept of proof-event or proving, introduced by Goguen, instead of the traditional concept of mathematical proof. The reason behind this choice is that, in contrast to the traditional static concept of mathematical proof, proof-events are understood as processes, which enables their use in Artificial Intelligence in contexts in which problem-solving procedures and strategies are studied. We represent proof-events as problem-centered spatio-temporal processes by means of the language of the calculus of events, which adequately captures certain temporal aspects of proof-events (i.e. that they have history and form sequences of proof-events evolving in time). Further, we suggest a "loose" semantics for proof-events by means of Kolmogorov's calculus of problems. Finally, we present the intended interpretations of our logical model from the fields of automated theorem-proving and Web-based collective proving.

  13. Starting Circuit For Erasable Programmable Logic Device

    NASA Technical Reports Server (NTRS)

    Cole, Steven W.

    1990-01-01

    Voltage regulator bypassed to supply starting current. Starting or "pullup" circuit supplies large inrush of current required by erasable programmable logic device (EPLD) while being turned on. Operates only during such intervals of high demand for current and has little effect any other time. Performs needed bypass, acting as current-dependent shunt connecting battery or other source of power more nearly directly to EPLD. Input capacitor of regulator removed when starting circuit installed, reducing probability of damage to transistor in event of short circuit in or across load.

  14. Logic-Based Retrieval: Technology for Content-Oriented and Analytical Querying of Patent Data

    NASA Astrophysics Data System (ADS)

    Klampanos, Iraklis Angelos; Wu, Hengzhi; Roelleke, Thomas; Azzam, Hany

    Patent searching is a complex retrieval task. An initial document search is only the starting point of a chain of searches and decisions that need to be made by patent searchers. Keyword-based retrieval is adequate for document searching, but it is not suitable for modelling comprehensive retrieval strategies. DB-like and logical approaches are the state-of-the-art techniques for modelling strategies, reasoning, and decision making. In this paper we present the application of logical retrieval to patent searching. The two grand challenges are expressiveness and scalability, where a high degree of expressiveness usually means a loss in scalability. We report how to maintain scalability while offering the expressiveness of logical retrieval required for solving patent search tasks. We present background on logical retrieval and show how to model data-source selection and result fusion. Moreover, we demonstrate the modelling of a retrieval strategy, a technique by which patent professionals are able to express, store, and exchange their strategies and rationales when searching patents or when making decisions. An overview of the architecture and technical details complements the paper, while the evaluation reports preliminary results on how query processing times can be guaranteed and how quality is affected by trading off responsiveness.

  15. R-189 (C-620) air compressor control logic software documentation. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, K.E.

    1995-06-08

    This document relates to the FFTF plant air compressors. Its purpose is to provide an updated Computer Software Description for the software to be used on the R-189 (C-620-C) air compressor programmable controllers. Logic software design changes were required to allow automatic starting of a compressor that had not been previously started.

  16. [Two traditions in the scientific learning of the world. A case study of creation and reception of quantum mechanics over the period 1925-1927, on the bases of discussion between Werner Heisenberg and Albert Einstein].

    PubMed

    Krajniak, Wiktor

    2014-01-01

    The purpose of this article is to analyze the discussion between Albert Einstein and Werner Heisenberg in the period 1925-1927. Their disputes, relating to the sources of scientific knowledge, its methods, and the value of knowledge acquired in this way, are part of the discourse between rationalism and empiricism characteristic of European science. On the basis of sources and literature on the subject, the epistemological positions of both scholars in that period are reconstructed. This episode, as yet poorly known, is a unique example of a scientific dispute whose range covers a broad spectrum of methodological problems associated with the historical development of science. The analysis sheds some light on the source of the popularity of logical empiricism in the first half of the 20th century. Particular emphasis is placed on the impact of the neopositivist ideas reflected in Heisenberg's research program, which was the starting point for the Copenhagen interpretation of quantum mechanics. The main assumption of logical empiricism, that scientific knowledge is acquired only by means of empirical procedures and logical analysis of the language of science, bears, in view of the arguments voiced by Einstein, little relationship to actual research practice in the historical development of science. Einstein's criticism of Heisenberg's program provided arguments for the main critics of the neopositivist ideal and contributed to the bankruptcy of the idea of logical empiricism, thereby opening a period in which critical rationalism, arising from criticism of neopositivism and alluding to Einstein's ideas, prospered.

  17. Airstart performance of a digital electronic engine control system on an F100 engine

    NASA Technical Reports Server (NTRS)

    Burcham, F. W., Jr.

    1984-01-01

    The digital electronic engine control (DEEC) system installed on an F100 engine in an F-15 aircraft was tested. The DEEC system incorporates a closed-loop air start feature in which the fuel flow is modulated to achieve the desired rate of compressor acceleration. With this logic, the DEEC-equipped F100 engine can achieve air starts over a larger envelope. The DEEC air start logic, the test program conducted on the F-15, and its results are described.

  18. Where Are the Logical Errors in the Theory of Big Bang?

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2015-04-01

    A critical analysis of the foundations of the theory of the Big Bang is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. It is argued that the starting point of the theory of the Big Bang contains three fundamental logical errors. The first error is the assumption that a macroscopic object (having qualitative determinacy) can have an arbitrarily small size and can be in the singular state (i.e., in a state that has no qualitative determinacy). This assumption implies that the transition, (macroscopic object having qualitative determinacy) --> (singular state of matter that has no qualitative determinacy), leads to loss of the information contained in the macroscopic object. The second error is the assumption that there exist the void and a boundary between matter and void; if such a boundary existed, then the void would have dimensions and could be measured. The third error is the assumption that the singular state of matter can make a transition into the normal state without the existence of a program of qualitative and quantitative development of the matter and without the controlling influence of another (independent) object. These assumptions conflict with practice and, consequently, with formal logic, rational dialectics, and cybernetics. Indeed, from the point of view of cybernetics, the transition, (singular state of the Universe) --> (normal state of the Universe), would be possible only if there existed a Managing Object that is outside the Universe and has full, complete, and detailed information about the Universe. Thus, the theory of the Big Bang is a scientific fiction.

  19. How Do Higher-Education Students Use Their Initial Understanding to Deal with Contextual Logic-Based Problems in Discrete Mathematics?

    ERIC Educational Resources Information Center

    Lubis, Asrin; Nasution, Andrea Arifsyah

    2017-01-01

    Mathematical reasoning in a logical context has now received much attention in the mathematics curriculum documents of many countries, including Indonesia. In Indonesia, students start formally learning about logic when they proceed to senior high school. Before that, they already have many experiences of dealing with logic, but the earlier assignments do…

  20. Japanese Logic Puzzles and Proof

    ERIC Educational Resources Information Center

    Wanko, Jeffrey J.

    2009-01-01

    An understanding of proof does not start in a high school geometry course. Rather, attention to logical reasoning throughout a student's school experience can help the development of proof readiness. In the spirit of problem solving, the author has begun to use some Japanese logic puzzles other than sudoku to help students develop additional…

  1. What It Is, What It's Not, and What's Related: Exploring Plato's "Meno"

    ERIC Educational Resources Information Center

    Heller, Stephen

    2010-01-01

    Teaching logic typically falls under the areas of argumentation and research, as students are taught the importance of "logos," or logical appeals, in their pursuit of an original point. Cohesive, cogent arguments--devoid of logical fallacy--produce more compelling points, and teachers take great strides in pointing to the problems of…

  2. Protecting quantum information in superconducting circuits

    NASA Astrophysics Data System (ADS)

    Devoret, Michel

    Can we prolong the coherence of a two-state manifold in a complex quantum system beyond the coherence of its longest-lived component? This question is the starting point in the construction of a scalable quantum computer. It translates into the search for processes that operate as some sort of Maxwell's demon and reliably correct the errors resulting from the coupling between qubits and their environment. The presentation will review recent experiments that test the dynamical protection by Josephson circuits of a logical qubit memory based on superpositions of particular coherent states of a superconducting resonator.

  3. Methodology for the systems engineering process. Volume 2: Technical parameters

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A scheme based on starting the logic networks from the development and mission factors that are of primary concern in an aerospace system is described. This approach required identifying the primary states (design, design verification, premission, mission, postmission), identifying the attributes within each state (performance capability, survival, evaluation, operation, etc), and then developing the generic relationships of variables for each branch. To illustrate this concept, a system was used that involved a launch vehicle and payload for an earth orbit mission. Examination showed that this example was sufficient to illustrate the concept. A more complicated mission would follow the same basic approach, but would have more extensive sets of generic trees and more correlation points between branches. It has been shown that in each system state (production, test, and use), a logic could be developed to order and classify the parameters involved in the translation from general requirements to specific requirements for system elements.

  4. The mandibular symphysis as a starting point for the occlusal-level reconstruction of panfacial fractures with bicondylar fractures and interruption of the maxillary and mandibular arches: report of two cases.

    PubMed

    Pau, Mauro; Reinbacher, Knut Ernst; Feichtinger, Matthias; Navysany, Kawe; Kärcher, Hans

    2014-06-01

    Panfacial fractures represent a challenge, even for experienced maxillofacial surgeons, because all references for reconstructing the facial skeleton are missing. Logical reconstructive sequencing based on a clear understanding of the correlation between projection and the widths and lengths of facial subunits should enable the surgeon to achieve correct realignment of the bony framework of the face and to prevent late deformity and functional impairment. Reconstruction is particularly challenging in patients presenting with concomitant fractures at the Le Fort I level and affecting the palate, condyles, and mandibular symphysis. In cases without bony loss and sufficient dentition, we believe that accurate fixation of the mandibular symphysis can represent the starting point of a reconstructive sequence that allows successful reconstruction at the Le Fort I level. Two patients were treated in our department by reconstruction starting in the occlusal area through repair of the mandibular symphysis. Both patients considered the postoperative facial shape and profile to be satisfactory and comparable to the pre-injury situation. Copyright © 2013 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  5. Instantons in Self-Organizing Logic Gates

    NASA Astrophysics Data System (ADS)

    Bearden, Sean R. B.; Manukian, Haik; Traversa, Fabio L.; Di Ventra, Massimiliano

    2018-03-01

    Self-organizing logic is a recently suggested framework that allows the solution of Boolean truth tables "in reverse"; i.e., it is able to satisfy the logical proposition of gates regardless of which terminal(s) the truth value is assigned ("terminal-agnostic logic"). It can be realized if time nonlocality (memory) is present. A practical realization of self-organizing logic gates (SOLGs) can be done by combining circuit elements with and without memory. By employing one such realization, we show, numerically, that SOLGs exploit elementary instantons to reach equilibrium points. Instantons are classical trajectories of the nonlinear equations of motion describing SOLGs and connect topologically distinct critical points in the phase space. By linear analysis at those points, we show that these instantons connect the initial critical point of the dynamics, with at least one unstable direction, directly to the final fixed point. We also show that the memory content of these gates affects only the relaxation time to reach the logically consistent solution. Finally, we demonstrate, by solving the corresponding stochastic differential equations, that, since instantons connect critical points, noise and perturbations may change the instanton trajectory in the phase space but not the initial and final critical points. Therefore, even for extremely large noise levels, the gates self-organize to the correct solution. Our work provides a physical understanding of, and can serve as an inspiration for, models of bidirectional logic gates that are emerging as important tools in physics-inspired, unconventional computing.

  6. Intelligent neural network and fuzzy logic control of industrial and power systems

    NASA Astrophysics Data System (ADS)

    Kuljaca, Ognjen

    The main role played by neural network and fuzzy logic intelligent control algorithms today is to identify and compensate for unknown nonlinear system dynamics. A number of methods have been developed, but stability analysis of neural network and fuzzy control systems was often not provided. This work addresses those problems for several algorithms. More complicated control algorithms, including backstepping and adaptive critics, are also designed. Nonlinear fuzzy control with nonadaptive fuzzy controllers is also analyzed, and an experimental method for determining the describing function of a SISO fuzzy controller is given. The adaptive neural network tracking controller for an autonomous underwater vehicle is analyzed, and a novel stability proof is provided. The implementation of the backstepping neural network controller for coupled motor drives is described. Analysis and synthesis of adaptive critic neural network control are also provided, with novel tuning laws for the system with an action-generating neural network and an adaptive fuzzy critic. Stability proofs are derived for all of these control methods, and it is shown how the algorithms and approaches can be used in practical engineering control. Adaptive fuzzy logic control is analyzed, and a simulation study is conducted to analyze the behavior of the adaptive fuzzy system under different environment changes. A novel stability proof for adaptive fuzzy logic systems is given. In addition, an adaptive elastic fuzzy logic control architecture, using a novel membership function, is described and analyzed, its stability proof is provided, and it is compared with adaptive nonelastic fuzzy logic control. The work described in this dissertation serves as a foundation on which analysis of particular representative industrial systems can be conducted. It also provides a good starting point for analysis of the learning abilities of adaptive and neural network control systems, as well as for the analysis of different algorithms such as elastic fuzzy systems.

  7. On Logic and Standards for Structuring Documents

    NASA Astrophysics Data System (ADS)

    Eyers, David M.; Jones, Andrew J. I.; Kimbrough, Steven O.

    The advent of XML has been widely seized upon as an opportunity to develop document representation standards that lend themselves to automated processing. This is a welcome development and much good has come of it. That said, present standardization efforts may be criticized on a number of counts. We explore two issues associated with document XML standardization efforts. We label them (i) the dynamic point and (ii) the logical point. Our dynamic point is that in many cases experience has shown that the search for a final, or even reasonably permanent, document representation standard is futile. The case is especially strong for electronic data interchange (EDI). Our logical point is that formalization into symbolic logic is materially helpful for understanding and designing dynamic document standards.

  8. ART/Ada design project, phase 1. Task 2 report: Detailed design

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    Various issues are studied in the context of the design of an Ada-based expert system building tool. Using an existing successful design as a starting point, the impact of the Ada language and Ada development methodologies on that design is analyzed, the system is redesigned in Ada, and its performance is analyzed using both complexity-theoretic and empirical techniques. The algorithms specified in the overall design are refined, resolving and documenting any open design issues, identifying each system module, documenting the internal architecture and control logic, and describing the primary data structures involved in each module.

  9. Citation analysis of faculty publication: beyond Science Citation Index and Social Science Citation Index.

    PubMed Central

    Reed, K L

    1995-01-01

    When evaluated for promotion or tenure, faculty members are increasingly judged more on the quality than on the quantity of their scholarly publications. As a result, they want help from librarians in locating all citations to their published works for documentation in their curriculum vitae. Citation analysis using Science Citation Index and Social Science Citation Index provides a logical starting point in measuring quality, but the limitations of these sources leave a void in coverage of citations to an author's work. This article discusses alternative and additional methods of locating citations to published works. PMID:8547915

  10. Citation analysis of faculty publication: beyond Science Citation Index and Social Science Citation Index.

    PubMed

    Reed, K L

    1995-10-01

    When evaluated for promotion or tenure, faculty members are increasingly judged more on the quality than on the quantity of their scholarly publications. As a result, they want help from librarians in locating all citations to their published works for documentation in their curriculum vitae. Citation analysis using Science Citation Index and Social Science Citation Index provides a logical starting point in measuring quality, but the limitations of these sources leave a void in coverage of citations to an author's work. This article discusses alternative and additional methods of locating citations to published works.

  11. Programmable Logic Application Notes

    NASA Technical Reports Server (NTRS)

    Katz, Richard

    2000-01-01

    This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter starts a series of notes concentrating on analysis techniques, with this issue's section discussing worst-case analysis requirements.

  12. Physicians' entrepreneurship explained: a case study of intra-organizational dynamics in Dutch hospitals and specialty clinics

    PubMed Central

    2014-01-01

    Background Challenges brought about by developments such as continuing market reforms and budget reductions have strained the relation between managers and physicians in hospitals. By applying neo-institutional theory, we research how intra-organizational dynamics between physicians and managers induce physicians to become entrepreneurs by starting a specialty clinic. In addition, we determine the nature of this change by analyzing the intra-organizational dynamics in both hospitals and clinics. Methods For our research, we interviewed a total of fifteen physicians and eight managers in four hospitals and twelve physicians and seven managers in twelve specialty clinics. Results We found evidence that in becoming entrepreneurs, physicians are influenced by intra-organizational dynamics, including power dependence, interest dissatisfaction, and value commitments, between physicians and managers as well as among physicians’ groups. The precise motivation for starting a new clinic can vary depending on the medical or business logic in which the entrepreneurs are embedded, but also the presence of an entrepreneurial nature or nurture. Finally we found that the entrepreneurial process of starting a specialty clinic is a process of sedimented change or hybridized professionalism in which elements of the business logic are added to the existing logic of medical professionalism, leading to a hybrid logic. Conclusions These findings have implications for policy at both the national and hospital level. Shared ownership and aligned incentives may provide the additional cement in which the developing entrepreneurial values are ‘glued’ to the central medical logic. PMID:24885912

  13. Physicians' entrepreneurship explained: a case study of intra-organizational dynamics in Dutch hospitals and specialty clinics.

    PubMed

    Koelewijn, Wout T; de Rover, Matthijs; Ehrenhard, Michel L; van Harten, Wim H

    2014-05-19

    Challenges brought about by developments such as continuing market reforms and budget reductions have strained the relation between managers and physicians in hospitals. By applying neo-institutional theory, we research how intra-organizational dynamics between physicians and managers induce physicians to become entrepreneurs by starting a specialty clinic. In addition, we determine the nature of this change by analyzing the intra-organizational dynamics in both hospitals and clinics. For our research, we interviewed a total of fifteen physicians and eight managers in four hospitals and twelve physicians and seven managers in twelve specialty clinics. We found evidence that in becoming entrepreneurs, physicians are influenced by intra-organizational dynamics, including power dependence, interest dissatisfaction, and value commitments, between physicians and managers as well as among physicians' groups. The precise motivation for starting a new clinic can vary depending on the medical or business logic in which the entrepreneurs are embedded, but also on the presence of an entrepreneurial nature or nurture. Finally, we found that the entrepreneurial process of starting a specialty clinic is a process of sedimented change or hybridized professionalism, in which elements of the business logic are added to the existing logic of medical professionalism, leading to a hybrid logic. These findings have implications for policy at both the national and hospital level. Shared ownership and aligned incentives may provide the additional cement in which the developing entrepreneurial values are 'glued' to the central medical logic.

  14. Fuzzy and neural control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1992-01-01

    Fuzzy logic and neural networks provide new methods for designing control systems. Fuzzy logic controllers do not require a complete analytical model of a dynamic system and can provide knowledge-based heuristic controllers for ill-defined and complex systems. Neural networks can be used for learning control. In this chapter, we discuss hybrid methods using fuzzy logic and neural networks which can start with an approximate control knowledge base and refine it through reinforcement learning.
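    As an illustrative sketch of the hybrid idea above, the fragment below shows one way rule strengths in an approximate fuzzy knowledge base might be refined by a reinforcement signal. The weight-vector representation and learning rate are assumptions for illustration, not the chapter's actual architecture.

```python
def reinforce(rule_weights, fired, reward, lr=0.1):
    """Strengthen rules that fired before a reward and weaken those that
    fired before a penalty (reward < 0) -- a toy stand-in for
    reinforcement-based refinement of a fuzzy rule base."""
    return [w + lr * reward * f for w, f in zip(rule_weights, fired)]
```

    A positive reward nudges the weights of the rules that fired upward; a negative one nudges them downward, leaving unfired rules untouched.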

  15. Current Source Logic Gate

    NASA Technical Reports Server (NTRS)

    Krasowski, Michael J. (Inventor); Prokop, Norman F. (Inventor)

    2017-01-01

    A current source logic gate with depletion mode field effect transistors ("FETs") and resistors may include a current source, a current steering switch input stage, and a resistor divider level shifting output stage. The current source may include a transistor and a current source resistor. The current steering switch input stage may include a transistor to steer current to set an output stage bias point depending on an input logic signal state. The resistor divider level shifting output stage may include a first resistor and a second resistor to set the output stage bias point and produce valid output logic signal states. The transistor of the current steering switch input stage may function as a switch to provide at least two operating points.

  16. NULL Convention Floating Point Multiplier

    PubMed Central

    Ramachandran, Seshasayanan

    2015-01-01

    Floating point multiplication is a critical part in high dynamic range and computationally intensive digital signal processing applications which require high precision and low power. This paper presents the design of an IEEE 754 single precision floating point multiplier using the asynchronous NULL convention logic paradigm. Rounding has not been implemented, to suit high precision applications. The novelty of the research is that it is the first ever NULL convention logic multiplier designed to perform floating point multiplication. The proposed multiplier offers a substantial decrease in power consumption when compared with its synchronous version. Performance attributes of the NULL convention logic floating point multiplier, obtained from Xilinx simulation and Cadence, are compared with its equivalent synchronous implementation. PMID:25879069

  17. NULL convention floating point multiplier.

    PubMed

    Albert, Anitha Juliette; Ramachandran, Seshasayanan

    2015-01-01

    Floating point multiplication is a critical part in high dynamic range and computationally intensive digital signal processing applications which require high precision and low power. This paper presents the design of an IEEE 754 single precision floating point multiplier using the asynchronous NULL convention logic paradigm. Rounding has not been implemented, to suit high precision applications. The novelty of the research is that it is the first ever NULL convention logic multiplier designed to perform floating point multiplication. The proposed multiplier offers a substantial decrease in power consumption when compared with its synchronous version. Performance attributes of the NULL convention logic floating point multiplier, obtained from Xilinx simulation and Cadence, are compared with its equivalent synchronous implementation.
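    To make the arithmetic concrete, here is a minimal software model of an IEEE 754 single-precision multiply that, like the design above, truncates the extra mantissa bits instead of rounding. It handles only normal numbers and ignores overflow and special values; it sketches the arithmetic only, not the NULL convention logic circuit.

```python
import struct

def f32_bits(x):
    """Raw 32-bit pattern of a single-precision float."""
    return struct.unpack(">I", struct.pack(">f", x))[0]

def bits_f32(b):
    """Single-precision float from a raw 32-bit pattern."""
    return struct.unpack(">f", struct.pack(">I", b))[0]

def fp32_mul_truncating(a, b):
    """Multiply two normal binary32 floats, truncating (not rounding)."""
    ba, bb = f32_bits(a), f32_bits(b)
    sign = (ba >> 31) ^ (bb >> 31)
    ea, eb = (ba >> 23) & 0xFF, (bb >> 23) & 0xFF
    ma = (ba & 0x7FFFFF) | 0x800000        # restore implicit leading 1
    mb = (bb & 0x7FFFFF) | 0x800000
    prod = ma * mb                         # up-to-48-bit significand product
    exp = ea + eb - 127                    # remove one bias
    if prod & (1 << 47):                   # product in [2, 4): normalise
        prod >>= 1
        exp += 1
    mant = (prod >> 23) & 0x7FFFFF         # drop (truncate) the low bits
    return bits_f32((sign << 31) | (exp << 23) | mant)
```

    For products that are exactly representable, truncation and rounding agree, so the model reproduces ordinary float multiplication on such inputs.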

  18. Wi-Fi location fingerprinting using an intelligent checkpoint sequence

    NASA Astrophysics Data System (ADS)

    Retscher, Günther; Hofer, Hannes

    2017-09-01

    For Wi-Fi positioning, location fingerprinting is very common but has the disadvantage of being highly labour-intensive: a database (DB) of received signal strength (RSS) scans must be measured on a large number of known reference points (RPs). To overcome this drawback, a novel approach is developed which uses a logical sequence of intelligent checkpoints (iCPs) instead of RPs distributed in a regular grid. The iCPs are the selected RPs which have to be passed along the way when navigating from a start point A to the destination B. They are intelligent in a twofold sense: they are meaningfully selected, and they follow a logical sequence in a fixed order. Thus, the following iCP is always known from a vector graph allocation in the DB, and only a small, limited number of iCPs needs to be tested when matching the current RSS scans. This reduces the required processing time significantly. It is proven that the iCP approach achieves a higher success rate than conventional approaches. On average, correct matching rates of 90.0% were achieved using a joint DB including RSS scans of all employed smartphones. An even higher success rate is achieved if the same mobile device is used in both the training and positioning phases.
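    The candidate-restriction idea can be sketched as follows; the fingerprint database, access-point count, and checkpoint names are made-up illustration values, not the paper's data.

```python
import math

# Hypothetical fingerprint DB: checkpoint id -> mean RSS vector (dBm per AP)
DB = {
    "A":    [-40, -70, -80],
    "iCP1": [-55, -60, -75],
    "iCP2": [-70, -50, -65],
    "B":    [-80, -45, -50],
}
# Vector graph: each checkpoint lists its possible successors on route A -> B
NEXT = {"A": ["iCP1"], "iCP1": ["iCP2"], "iCP2": ["B"], "B": []}

def match_next(current, scan):
    """Match a live RSS scan only against the successors of the current
    checkpoint (plus the current one), instead of the whole DB -- the
    ordering idea behind the iCP approach."""
    candidates = NEXT[current] + [current]
    return min(candidates, key=lambda c: math.dist(scan, DB[c]))
```

    Because only the successors of the current checkpoint are tested, matching cost stays constant regardless of how large the full database grows.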

  19. Constraints on signaling network logic reveal functional subgraphs on Multiple Myeloma OMIC data.

    PubMed

    Miannay, Bertrand; Minvielle, Stéphane; Magrangeas, Florence; Guziolowski, Carito

    2018-03-21

    The integration of gene expression profiles (GEPs) and large-scale biological networks derived from pathway databases is a subject which is being widely explored. Existing methods are based on network distance measures among significantly measured species. Only a small number of them include the directionality and underlying logic existing in biological networks. In this study we approach the GEP-networks integration problem by considering the network logic; however, our approach does not require a prior species selection according to their gene expression level. We start by modeling the biological network, representing its underlying logic using Logic Programming. This model points to reachable network discrete states that maximize a notion of harmony between the molecular species' active or inactive possible states and the directionality of the pathway reactions according to their activator or inhibitor control role. Only then do we confront these network states with the GEP. From this confrontation, independent graph components are derived, each of them related to a fixed and optimal assignment of active or inactive states. These components allow us to decompose a large-scale network into subgraphs, and their molecular species state assignments have different degrees of similarity when compared to the same GEP. We apply our method to study the set of possible states derived from a subgraph from the NCI-PID Pathway Interaction Database. This graph links Multiple Myeloma (MM) genes to known receptors for this blood cancer. We discover that the NCI-PID MM graph has 15 independent components, and when confronted with 611 MM GEPs, we find one component to be more specific in representing the difference between cancer and healthy profiles.
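    The authors encode the network in Logic Programming; the brute-force sketch below only illustrates the underlying notion of discrete network states maximizing a harmony with signed, directed reactions, on a made-up three-edge toy network (node names and edges are assumptions).

```python
from itertools import product

# Toy signed network: (source, target, sign), +1 = activation, -1 = inhibition
EDGES = [("R", "X", +1), ("X", "Y", +1), ("X", "Z", -1)]
NODES = ["R", "X", "Y", "Z"]

def harmony(state):
    """Count edges consistent with a 0/1 state assignment: an active
    activator supports an active target, an active inhibitor an inactive
    one; an inactive source imposes no constraint."""
    s = dict(zip(NODES, state))
    ok = 0
    for src, dst, sign in EDGES:
        if s[src] == 0 or s[dst] == (1 if sign > 0 else 0):
            ok += 1
    return ok

def optimal_states():
    """Enumerate all states and keep those with maximal harmony."""
    states = list(product([0, 1], repeat=len(NODES)))
    best = max(harmony(s) for s in states)
    return [dict(zip(NODES, s)) for s in states if harmony(s) == best]
```

    On realistic network sizes this enumeration is infeasible, which is precisely why the paper resorts to Logic Programming solvers instead of brute force.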

  20. No-Fail Software Gifts for Kids.

    ERIC Educational Resources Information Center

    Buckleitner, Warren

    1996-01-01

    Reviews children's software packages: (1) "Fun 'N Games"--nonviolent games and activities; (2) "Putt-Putt Saves the Zoo"--matching, logic games, and animal facts; (3) "Big Job"--12 logic games with video from job sites; (4) "JumpStart First Grade"--15 activities introducing typical school lessons; and (5) "Read, Write, & Type!"--progressively…

  1. Advancements in Automated Circuit Grouping for Intellectual Property Trust Analysis

    DTIC Science & Technology

    2017-03-20

    operation What had often taken weeks of manual effort has now been reduced to an overnight process or just a matter of hours. This new starting...between the flops and the major macros is added to that hierarchy Rule 4. Next any flops between hierarchies, or boundary flops, are assigned to a...COMB. LOGIC 4. Next assign any combinatorial logic between hierarchical blocks, or boundary logic, to a hierarchy using the rule: If-and-only-if

  2. Data cleaning methodology for monthly water-to-oil and water-to-gas production ratios in continuous resource assessments

    USGS Publications Warehouse

    Varela, Brian A.; Haines, Seth S.; Gianoutsos, Nicholas J.

    2017-01-19

    Petroleum production data are usually stored in a format that makes it easy to determine the year and month production started, whether there are any breaks, and when production ends. However, in some cases it is useful to compare production runs with all wells aligned so that production begins at month one, regardless of the calendar year in which each well started producing. This report describes the Java program the U.S. Geological Survey developed to examine water-to-oil and water-to-gas ratios in the form of month 1, month 2, and so on, with the objective of estimating quantities of water and proppant used in low-permeability petroleum production. The text covers the data used by the program, the challenges with production data, the program logic for checking the quality of the production data, and the program logic for checking the completeness of the data.
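    The month-alignment step described above can be sketched as follows; the function and its input format are illustrative assumptions, not the USGS program's actual code.

```python
def align_to_month_one(volumes):
    """Renumber a well's chronological monthly volumes so the first
    producing month becomes month 1. Leading zero months (before
    start-up) are dropped; gaps after start-up are kept, so ratios
    line up across wells that started in different calendar years."""
    try:
        start = next(i for i, v in enumerate(volumes) if v > 0)
    except StopIteration:
        return []                      # well never produced
    return [(i + 1, v) for i, v in enumerate(volumes[start:])]
```

    Two wells that began producing in different years then share a common month axis, so their water-to-oil ratios can be compared month by month.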

  3. Track and mode controller (TMC): a software executive for a high-altitude pointing and tracking experiment

    NASA Astrophysics Data System (ADS)

    Michnovicz, Michael R.

    1997-06-01

    A real-time executive has been implemented to control a high-altitude pointing and tracking experiment. The track and mode controller (TMC) implements a table-driven design, in which the track mode logic for a tracking mission is defined within a state transition diagram (STD). The STD is implemented as a state transition table in the TMC software. Status events trigger the state transitions in the STD. Each state, as it is entered, causes a number of processes to be activated within the system. As these processes propagate through the system, the status of key processes is monitored by the TMC, allowing further transitions within the STD. This architecture is implemented in real time, using the VxWorks operating system. VxWorks message queues allow communication of status events from the Event Monitor task to the STD task. Process commands are propagated to the rest of the system processors by means of the SCRAMNet shared memory network. The system mode logic contained in the STD will autonomously sequence an acquisition, tracking, and pointing system through an entire engagement sequence, starting with target detection and ending with aimpoint maintenance. Simulation results and lab test results will be presented to verify the mode controller. In addition to implementing the system mode logic with the STD, the TMC can process prerecorded time sequences of commands required during startup operations. It can also process single commands from the system operator.
In this paper, the author presents (1) an overview, in which he describes the TMC architecture, the relationship of an end-to-end simulation to the flight software and the laboratory testing environment, (2) implementation details, including information on the vxWorks message queues and the SCRAMNet shared memory network, (3) simulation results and lab test results which verify the mode controller, and (4) plans for the future, specifically as to how this executive will expedite transition to a fully functional system.
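    The table-driven design can be sketched in a few lines; the state names, events, and activated processes below are hypothetical illustrations, not the TMC's actual tables.

```python
# Hypothetical state transition table in the spirit of the TMC: each
# (state, event) pair maps to (next_state, processes to activate).
STD = {
    ("IDLE",    "TARGET_DETECTED"): ("ACQUIRE", ["coarse_track"]),
    ("ACQUIRE", "TRACK_LOCK"):      ("TRACK",   ["fine_track"]),
    ("TRACK",   "AIMPOINT_OK"):     ("POINT",   ["aimpoint_maint"]),
    ("POINT",   "ABORT"):           ("IDLE",    []),
}

def step(state, event):
    """Advance the mode controller on a status event; events with no
    entry in the table leave the state unchanged."""
    return STD.get((state, event), (state, []))
```

    Because the mode logic lives entirely in the table, a new engagement sequence is a data change, not a code change, which is the appeal of the table-driven executive.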

  4. Using Temporal Logic to Specify and Verify Cryptographic Protocols (Progress Report)

    DTIC Science & Technology

    1995-01-01

    know, Meadows’ 1Supported by grant HKUST 608/94E from the Hong Kong Research Grants Council. ... 1 Introduction We have started work on a project to apply temporal logic to reason about cryptographic protocols. Some of the goals of the project...are as follows. 1. Allow the user to state and prove that the penetrator cannot use logical or algebraic techniques (e.g., we are disregarding

  5. A new systematic and quantitative approach to characterization of surface nanostructures using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Al-Mousa, Amjed A.

    Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the device characteristics where these films are used has generated substantial attention to advanced film characterization techniques. In this work, we present a new approach to characterize surface nanostructures of thin films by focusing on isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as Atomic Force Microscopy (AFM) data which we are presenting here. The methodology starts by compensating the AFM data for some specific classes of measurement artifacts. After that, the methodology employs two distinct techniques. The first, which we call the overlay technique, proceeds by systematically processing the raster data that constitute the scanning probe image in both vertical and horizontal directions. It then proceeds by classifying points in each direction separately. Finally, the results from both the horizontal and the vertical subsets are overlaid, where a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy technique system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. Also, we present a case study to demonstrate the effectiveness of our methodology to identify quantitatively particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface. 
A comparison with other techniques, such as thresholding, watershed, and edge detection, is presented next. Finally, we present a systematic study of the fuzzy logic technique by experimenting with synthetic data. These results are discussed and compared, along with the challenges of the two techniques.
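    A fuzzy classification of surface points can be sketched with triangular membership functions; the feature choices (height, local slope) and all threshold values below are made-up illustration parameters, not those of the dissertation's inference engine.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_point(height, slope):
    """Tiny fuzzy-inference sketch: a surface point is 'structure' when
    it is high OR on a steep flank; otherwise 'background'.
    Heights/slopes are in arbitrary illustrative units."""
    high  = tri(height, 1.0, 3.0, 5.0)
    steep = tri(slope, 0.2, 0.6, 1.0)
    low   = tri(height, -1.0, 0.0, 1.0)
    structure = max(high, steep)      # fuzzy OR of the two rules
    return "structure" if structure > low else "background"
```

    Classified points would then be clustered into nanostructures, the step at which sizes and shapes are measured.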

  6. Simulation and experiment of a fuzzy logic based MPPT controller for a small wind turbine system

    NASA Astrophysics Data System (ADS)

    Petrila, Diana; Muntean, Nicolae

    2012-09-01

    This paper describes the development of a fuzzy logic based maximum power point tracking (MPPT) strategy for a variable speed wind turbine system (VSWT). For this scope, a fuzzy logic controller (FLC) was described, simulated and tested on a real time "hardware in the loop" wind turbine emulator. Simulation and experimental results show that the controller is able to track the maximum power point for various wind conditions and validate the proposed control strategy.

  7. Critical periods and amblyopia.

    PubMed

    Daw, N W

    1998-04-01

    During the past 20 years, basic science has shown that there are different critical periods for different visual functions during the development of the visual system. Visual functions processed at higher anatomical levels within the system have a later critical period than functions processed at lower levels. This general principle suggests that treatments for amblyopia should be followed in a logical sequence, with treatment for each visual function to be started before its critical period is over. However, critical periods for some visual functions, such as stereopsis, are not yet fully determined, and the optimal treatment is, therefore, unknown. This article summarizes the current extent of our knowledge and points to the gaps that need to be filled.

  8. Realworld maximum power point tracking simulation of PV system based on Fuzzy Logic control

    NASA Astrophysics Data System (ADS)

    Othman, Ahmed M.; El-arini, Mahdi M. M.; Ghitas, Ahmed; Fathy, Ahmed

    2012-12-01

    In recent years, solar energy has become one of the most important alternative sources of electric energy, so it is important to improve the efficiency and reliability of photovoltaic (PV) systems. Maximum power point tracking (MPPT) plays an important role in photovoltaic power systems because it maximizes the power output from a PV system for a given set of conditions, and therefore maximizes array efficiency. This paper presents a maximum power point tracker (MPPT) using Fuzzy Logic theory for a PV system. The work focuses on the well-known Perturb and Observe (P&O) algorithm, which is compared to a designed fuzzy logic controller (FLC). Simulation work on the MPPT controller, with a DC/DC Ćuk converter feeding a load, is carried out. The results showed that the proposed Fuzzy Logic MPPT in the PV system is valid.
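    The P&O baseline the fuzzy controller is compared against is a simple hill-climbing rule; the fixed voltage step below is an illustrative assumption (real implementations tune or adapt it).

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.1):
    """One P&O iteration: keep perturbing the operating voltage in the
    direction that increased power; reverse when power dropped."""
    dP, dV = p - p_prev, v - v_prev
    if dP == 0:
        return v                         # at (or oscillating around) the MPP
    if (dP > 0) == (dV > 0):
        return v + step                  # moving uphill: keep going
    return v - step                      # overshot the peak: turn back
```

    The fixed step is exactly the weakness a fuzzy controller targets: fuzzy rules can take large steps far from the maximum power point and small ones near it, reducing steady-state oscillation.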

  9. Master Logic Diagram: method for hazard and initiating event identification in process plants.

    PubMed

    Papazoglou, I A; Aneziris, O N

    2003-02-28

    Master Logic Diagram (MLD), a method for identifying events initiating accidents in chemical installations, is presented. MLD is a logic diagram that resembles a fault tree but without the formal mathematical properties of the latter. MLD starts with a Top Event "Loss of Containment" and decomposes it into simpler contributing events. A generic MLD has been developed which may be applied to all chemical installations storing toxic and/or flammable substances. The method is exemplified through its application to an ammonia storage facility.
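    The decomposition idea can be sketched as a tree walk from the Top Event down to its leaves; the events in the toy diagram below are invented illustrations, not the paper's generic MLD.

```python
# Toy MLD: each event maps to the simpler events it decomposes into
# (a decomposition hierarchy, not a formal fault tree).
MLD = {
    "Loss of Containment": ["Tank rupture", "Pipe leak"],
    "Tank rupture": ["Overpressure", "Corrosion"],
    "Pipe leak": ["Flange failure"],
}

def initiating_events(event, tree=MLD):
    """Walk the diagram down from the Top Event and collect the leaves,
    which are the candidate initiating events."""
    children = tree.get(event)
    if not children:
        return [event]
    leaves = []
    for child in children:
        leaves.extend(initiating_events(child, tree))
    return leaves
```

    The leaves enumerate what a hazard analyst would then screen against the specific installation, such as the ammonia storage facility of the example.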

  10. Dewey's Logic as a Methodological Grounding Point for Practitioner-Based Inquiry

    ERIC Educational Resources Information Center

    Demetrion, George

    2012-01-01

    The purpose of this essay is to draw out key insights from Dewey's important text "Logic: The Theory of Inquiry" to provide theoretical and practical support for the emergent field of teacher research. The specific focal point is the argument in Cochran-Smith and Lytle's "Inside/Outside: Teacher Research and Knowledge" on the significance of…

  11. The autonomy of biological individuals and artificial models.

    PubMed

    Moreno, Alvaro; Etxeberria, Arantza; Umerez, Jon

    2008-02-01

    This paper aims to offer an overview of the meaning of autonomy for biological individuals and artificial models rooted in a specific perspective that pays attention to the historical and structural aspects of its origins and evolution. Taking autopoiesis and the recursivity characteristic of its circular logic as a starting point, we depart from some of its consequences to claim that the theory of autonomy should also take into account historical and structural features. Autonomy should not be considered only in internal or constitutive terms, the largely neglected interactive aspects stemming from it should be equally addressed. Artificial models contribute to get a better understanding of the role of autonomy for life and the varieties of its organization and phenomenological diversity.

  12. Composite Dry Structure Cost Improvement Approach

    NASA Technical Reports Server (NTRS)

    Nettles, Alan; Nettles, Mindy

    2015-01-01

    This effort demonstrates that by focusing only on properties of relevance, composite interstage and shroud structures can be placed on the Space Launch System vehicle in a way that simultaneously reduces cost, improves reliability, and maximizes performance, thus providing the Advanced Development Group with a new methodology for how to utilize composites to reduce weight for composite structures on launch vehicles. Interstage and shroud structures were chosen since both of these structures are simple in configuration, do not experience extreme environments (such as cryogenic or hot gas temperatures), and should represent a good starting point for flying composites on a 'man-rated' vehicle. They are used as an example only. The project involves using polymer matrix composites for launch vehicle structures and presents the logic and rationale behind the proposed new methodology.

  13. Teaching Discrete and Programmable Logic Design Techniques Using a Single Laboratory Board

    ERIC Educational Resources Information Center

    Debiec, P.; Byczuk, M.

    2011-01-01

    Programmable logic devices (PLDs) are used at many universities in introductory digital logic laboratories, where kits containing a single high-capacity PLD replace "standard" sets containing breadboards, wires, and small- or medium-scale integration (SSI/MSI) chips. From the pedagogical point of view, two problems arise in these…

  14. Power control of SAFE reactor using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Irvine, Claude

    2002-01-01

    Controlling the 100 kW SAFE (Safe Affordable Fission Engine) reactor consists of design and implementation of a fuzzy logic process control system to regulate dynamic variables related to nuclear system power. The first phase of development concentrates primarily on system power startup and regulation, maintaining core temperature equilibrium, and power profile matching. This paper discusses the experimental work performed in those areas. Nuclear core power from the fuel elements is simulated using resistive heating elements while heat rejection is processed by a series of heat pipes. Both axial and radial nuclear power distributions are determined from neutronic modeling codes. The axial temperature profile of the simulated core is matched to the nuclear power profile by varying the resistance of the heating elements. The SAFE model establishes radial temperature profile equivalence by establishing 32 control zones as the nodal coordinates. Control features also allow for slow warm up, since complete shutoff can occur in the heat pipes if heat-source temperatures drop below a certain minimum value, depending on the specific fluid and gas combination in the heat pipe. The entire system is expected to be self-adaptive, i.e., capable of responding to long-range changes in the space environment. Particular attention in the development of the fuzzy logic algorithm shall ensure that the system process remains at set point, virtually eliminating overshoot on start-up and during in-process disturbances. The controller design will withstand harsh environments and applications where it might come in contact with water, corrosive chemicals, radiation fields, etc.

  15. Automatic detection of zebra crossings from mobile LiDAR data

    NASA Astrophysics Data System (ADS)

    Riveiro, B.; González-Jorge, H.; Martínez-Sánchez, J.; Díaz-Vilariño, L.; Arias, P.

    2015-07-01

    An algorithm for the automatic detection of zebra crossings from mobile LiDAR data is developed and tested to be applied for road management purposes. The algorithm consists of several subsequent processes starting with road segmentation by performing a curvature analysis for each laser cycle. Then, intensity images are created from the point cloud using rasterization techniques, in order to detect zebra crossing using the Standard Hough Transform and logical constrains. To optimize the results, image processing algorithms are applied to the intensity images from the point cloud. These algorithms include binarization to separate the painting area from the rest of the pavement, median filtering to avoid noisy points, and mathematical morphology to fill the gaps between the pixels in the border of white marks. Once the road marking is detected, its position is calculated. This information is valuable for inventorying purposes of road managers that use Geographic Information Systems. The performance of the algorithm has been evaluated over several mobile LiDAR strips accounting for a total of 30 zebra crossings. That test showed a completeness of 83%. Non-detected marks mainly come from painting deterioration of the zebra crossing or by occlusions in the point cloud produced by other vehicles on the road.
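    The image-processing steps named above (binarization, then morphology to close gaps in the marks) can be sketched on a tiny intensity raster; the threshold and 3x3 structuring element are illustrative assumptions, not the paper's parameters.

```python
def binarize(img, thr):
    """Separate bright paint pixels from darker pavement in an
    intensity raster (list of equal-length rows)."""
    return [[1 if v >= thr else 0 for v in row] for row in img]

def dilate(img):
    """3x3 binary dilation, a stand-in for the mathematical morphology
    step that fills gaps at the borders of the white marks."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            out[i][j] = max(img[a][b]
                            for a in range(max(0, i - 1), min(h, i + 2))
                            for b in range(max(0, j - 1), min(w, j + 2)))
    return out
```

    In the full pipeline these cleaned binary images feed the Standard Hough Transform, whose detected line bundles are then filtered by the logical constraints of a zebra crossing.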

  16. Student voice - Getting motivated.

    PubMed

    Cowen, Emma

    2016-05-09

    On the whole, I consider myself a calm, logical person who rarely gets stressed. This stands me in good stead on placement, but it causes problems when it comes to the academic side of being a student. I rarely start work when I should. I have not yet started an essay the night before it is due, but it is unheard of for me to start an essay sooner than two weeks before it is due.

  17. Logical Fallacies in Indonesian EFL Learners' Argumentative Writing: Students' Perspectives

    ERIC Educational Resources Information Center

    El Khoiri, Niamika; Widiati, Utami

    2017-01-01

    In argumentative writing, the presence of logical fallacy, which can be simply defined as error in reasoning, shows either illegitimate arguments or irrelevant points that will undermine the strength of a claim. Despite its significant role in determining the quality of an argument, the topic of logical fallacy has not been widely explored in the…

  18. Enhancing Learning Effectiveness in Digital Design Courses through the Use of Programmable Logic Boards

    ERIC Educational Resources Information Center

    Zhu, Yi; Weng, T.; Cheng, Chung-Kuan

    2009-01-01

    Incorporating programmable logic devices (PLD) in digital design courses has become increasingly popular. The advantages of using PLDs, such as complex programmable logic devices (CPLDs) and field programmable gate arrays (FPGA), have been discussed before. However, previous studies have focused on the experiences from the point of view of the…

  19. Interpretation of IEEE-854 floating-point standard and definition in the HOL system

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.

    1995-01-01

    The ANSI/IEEE Standard 854-1987 for floating-point arithmetic is interpreted by converting the lexical descriptions in the standard into mathematical conditional descriptions organized in tables. The standard is represented in higher-order logic within the framework of the HOL (Higher Order Logic) system. The paper is divided into two parts: the first covers the interpretation, and the second the description in HOL.

  20. Quantum Structure in Cognition and the Foundations of Human Reasoning

    NASA Astrophysics Data System (ADS)

    Aerts, Diederik; Sozzo, Sandro; Veloz, Tomas

    2015-12-01

    Traditional cognitive science rests on a foundation of classical logic and probability theory. This foundation has been seriously challenged by several findings in experimental psychology on human decision making. Meanwhile, the formalism of quantum theory has provided an efficient resource for modeling these classically problematical situations. In this paper, we start from our successful quantum-theoretic approach to the modeling of concept combinations to formulate a unifying explanatory hypothesis. In it, human reasoning is the superposition of two processes - a conceptual reasoning, whose nature is emergence of new conceptuality, and a logical reasoning, founded on an algebraic calculus of the logical type. In most cognitive processes, however, the former reasoning prevails over the latter. In this perspective, the observed deviations from classical logical reasoning should not be interpreted as biases but, rather, as natural expressions of emergence in its deepest form.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corvellec, Herve, E-mail: herve.corvellec@ism.lu.se; Bramryd, Torleif

    Highlights: Swedish municipally owned waste management companies are active on political, material, technical, and commercial markets. These markets differ in kind and their demands follow different logics. These markets affect the public service, processing, and marketing of Swedish waste management. Articulating these markets is a strategic challenge for Swedish municipally owned waste management. - Abstract: This paper describes how the business model of two leading Swedish municipally owned solid waste management companies exposes them to four different but related markets: a political market in which their legitimacy as an organization is determined; a waste-as-material market that determines their access to waste as a process input; a technical market in which these companies choose what waste processing technique to use; and a commercial market in which they market their products. Each of these markets has a logic of its own. Managing these logics and articulating the interrelationships between these markets is a key strategic challenge for these companies.

  2. Users Guide to Direct Digital Control of Heating, Ventilating, and Air Conditioning Equipment,

    DTIC Science & Technology

    1985-01-01

    cycles, reset, load shedding, chiller optimization, VAV fan synchronization, and optimum start/stop. The prospective buyer of a DDC system should...in Figure 4. Data on setpoints, reset schedules, and event timing, such as that presented in Figure 6, are often even more difficult to find. In con...trol logic, setpoint and other data are readily available. Program logic, setpoint and schedule data, and other information stored in a DDC unit

  3. Multi-valued logic gates based on ballistic transport in quantum point contacts.

    PubMed

    Seo, M; Hong, C; Lee, S-Y; Choi, H K; Kim, N; Chung, Y; Umansky, V; Mahalu, D

    2014-01-22

    Multi-valued logic gates, which can handle quaternary numbers as inputs, are developed by exploiting the ballistic transport properties of quantum point contacts in series. The principle of a logic gate that finds the minimum of two quaternary number inputs is demonstrated. The device is scalable to allow multiple inputs, which makes it possible to find the minimum of multiple inputs in a single gate operation. Also, the principle of a half-adder for quaternary number inputs is demonstrated. First, an adder that adds up two quaternary numbers and outputs the sum of inputs is demonstrated. Second, a device to express the sum of the adder into two quaternary digits [Carry (first digit) and Sum (second digit)] is demonstrated. All the logic gates presented in this paper can in principle be extended to allow decimal number inputs with high quality QPCs.
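    Functionally, the two gates demonstrated reduce to simple quaternary arithmetic; the sketch below models their truth behaviour only, not the ballistic-transport physics that realises it.

```python
def qmin(a, b):
    """Quaternary MIN gate (digits 0-3): in the series-QPC realisation,
    total conductance is limited by the more constricted contact,
    i.e. the smaller input."""
    assert a in range(4) and b in range(4)
    return min(a, b)

def qhalf_adder(a, b):
    """Quaternary half-adder: add two base-4 digits and split the result
    into (Carry, Sum) digits."""
    assert a in range(4) and b in range(4)
    return divmod(a + b, 4)
```

    Because MIN over a chain of series contacts generalises directly, a single gate with n inputs yields the minimum of all n digits in one operation, as the abstract notes.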

  4. Design of automatic startup and shutdown logic for a Brayton-cycle 2- to 15-kilowatt engine

    NASA Technical Reports Server (NTRS)

    Vrancik, J. E.; Bainbridge, R. C.

    1975-01-01

    The NASA Lewis Research Center is conducting a closed-Brayton-cycle power conversion system technology program in which a complete power system (engine) has been designed and demonstrated. This report discusses the design of automatic startup and shutdown logic circuits as a modification to the control system presently used in this demonstration engine. This modification was primarily intended to make starting the engine as simple and safe as possible and to allow the engine to be run unattended. In the modified configuration the engine is started by turning the control console power on and pushing the start button after preheating the gas loop. No other operator action is required to effect a complete startup. Shutdown, if one is required, is also effected by a simple stop button. The automatic startup and shutdown of the engine have been successfully and purposefully demonstrated more than 50 times at the Lewis Research Center during 10,000 hours of unattended operation. The net effect of this modification is an engine that can be safely started and stopped by relatively untrained personnel. The approach lends itself directly to remote unattended operation.

  5. Practical applications of digital integrated circuits. Part 2: Minimization techniques, code conversion, flip-flops, and asynchronous circuits

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Here, the 7400 line of transistor to transistor logic (TTL) devices is emphasized almost exclusively where hardware is concerned. However, it should be pointed out that the logic theory contained herein applies to all hardware. Binary numbers, simplification of logic circuits, code conversion circuits, basic flip-flop theory, details about series 54/7400, and asynchronous circuits are discussed.

  6. Unimolecular Logic Gate with Classical Input by Single Gold Atoms.

    PubMed

    Skidin, Dmitry; Faizy, Omid; Krüger, Justus; Eisenhut, Frank; Jancarik, Andrej; Nguyen, Khanh-Hung; Cuniberti, Gianaurelio; Gourdon, Andre; Moresco, Francesca; Joachim, Christian

    2018-02-27

    By a combination of solution and on-surface chemistry, we synthesized an asymmetric starphene molecule with two long anthracenyl input branches and a short naphthyl output branch on the Au(111) surface. Starting from this molecule, we could demonstrate the working principle of a single molecule NAND logic gate by selectively contacting single gold atoms by atomic manipulation to the longer branches of the molecule. The logical input "1" ("0") is defined by the interaction (noninteraction) of a gold atom with one of the input branches. The output is measured by scanning tunneling spectroscopy following the shift in energy of the electronic tunneling resonances at the end of the short branch of the molecule.

  7. Logic-Based Models for the Analysis of Cell Signaling Networks†

    PubMed Central

    2010-01-01

    Computational models are increasingly used to analyze the operation of complex biochemical networks, including those involved in cell signaling networks. Here we review recent advances in applying logic-based modeling to mammalian cell biology. Logic-based models represent biomolecular networks in a simple and intuitive manner without describing the detailed biochemistry of each interaction. A brief description of several logic-based modeling methods is followed by six case studies that demonstrate biological questions recently addressed using logic-based models and point to potential advances in model formalisms and training procedures that promise to enhance the utility of logic-based methods for studying the relationship between environmental inputs and phenotypic or signaling state outputs of complex signaling networks. PMID:20225868
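A minimal Boolean-network sketch of logic-based modeling in this spirit (the species and update rules below are hypothetical, not taken from any model reviewed in the paper):

```python
# Hypothetical 3-node signaling logic: ligand activates receptor; receptor
# activates kinase unless an inhibitor is present. Inputs are held fixed.
rules = {
    "receptor":  lambda s: s["ligand"],
    "kinase":    lambda s: s["receptor"] and not s["inhibitor"],
    "ligand":    lambda s: s["ligand"],      # environmental input
    "inhibitor": lambda s: s["inhibitor"],   # environmental input
}

def step(state: dict) -> dict:
    """Synchronous update: every node applies its logic rule at once."""
    return {node: rule(state) for node, rule in rules.items()}

state = {"ligand": True, "inhibitor": False, "receptor": False, "kinase": False}
for _ in range(3):
    state = step(state)
print(state["kinase"])  # -> True: the signal has propagated to the kinase
```

Running the same network with the inhibitor input set to True keeps the kinase off, illustrating how such models relate environmental inputs to signaling-state outputs without any biochemical rate constants.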

  8. Majority-voted logic fail-sense circuit

    NASA Technical Reports Server (NTRS)

    Mclyman, W. T.

    1977-01-01

The fail-sense circuit has a majority-voted logic component which receives three error voltage signals sensed at a single point by three error amplifiers. If a transistor shorts, only one signal is required for operation; if a transistor opens, two signals are required.
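The two-out-of-three vote at the heart of such circuits has a standard Boolean form; a minimal software sketch of the voting logic (a model, not the flight circuit):

```python
def majority(a: bool, b: bool, c: bool) -> bool:
    """Two-out-of-three majority vote: true when at least two of the
    three sensed error signals agree."""
    return (a and b) or (b and c) or (a and c)

# A single faulty signal cannot flip the voted output.
assert majority(True, True, False) is True
assert majority(False, False, True) is False
```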

  9. Relay Protection and Automation Systems Based on Programmable Logic Integrated Circuits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lashin, A. V., E-mail: LashinAV@lhp.ru; Kozyrev, A. V.

One of the most promising approaches to developing the hardware part of relay protection and automation devices is considered. The advantages of choosing programmable logic integrated circuits to obtain adaptive technological algorithms in power system protection and control systems are pointed out. The technical difficulties that currently stand in the way of using such relay protection and automation systems are indicated, and a new technology for solving these problems is presented. Particular attention is devoted to the possibility of reconfiguring the logic of these devices using programmable logic integrated circuits.

  10. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy

    PubMed Central

    Knijnenburg, Theo A.; Klau, Gunnar W.; Iorio, Francesco; Garnett, Mathew J.; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F. A.

    2016-01-01

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present ‘Logic Optimization for Binary Input to Continuous Output’ (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models. PMID:27876821
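An illustrative sketch of the kind of small, interpretable logic model LOBICO infers (the features, cell lines, and threshold below are invented for illustration; LOBICO itself infers the formula by optimization over real data):

```python
# Hypothetical 2-input logic model: "mutA AND NOT mutB" predicts sensitivity.
def logic_model(mutA: int, mutB: int) -> int:
    """Predict binarized drug sensitivity from binary mutation features."""
    return int(mutA == 1 and mutB == 0)

# Hypothetical cell lines: (mutA, mutB, continuous drug-response value),
# where responses below the threshold count as "sensitive".
cell_lines = [(1, 0, -1.2), (1, 1, 0.8), (0, 0, 0.5), (1, 0, -0.9)]
threshold = 0.0
correct = sum(logic_model(a, b) == int(y < threshold)
              for a, b, y in cell_lines)
print(f"accuracy: {correct}/{len(cell_lines)}")  # prints "accuracy: 4/4"
```

The continuous response values are only binarized at evaluation time here; LOBICO's point is that using the continuous information during inference yields more robust logic formulas than binarizing up front.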

  11. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy.

    PubMed

    Knijnenburg, Theo A; Klau, Gunnar W; Iorio, Francesco; Garnett, Mathew J; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F A

    2016-11-23

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present 'Logic Optimization for Binary Input to Continuous Output' (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models.

  12. Linear and non-linear systems identification for adaptive control in mechanical applications vibration suppression

    NASA Astrophysics Data System (ADS)

    Cazzulani, Gabriele; Resta, Ferruccio; Ripamonti, Francesco

    2012-04-01

In recent years, more and more mechanical applications have seen the introduction of active control strategies. In particular, the need to improve performance and/or system health is very often associated with vibration suppression. This goal can be achieved considering both passive and active solutions. In this sense, many active control strategies have been developed, such as the Independent Modal Space Control (IMSC) or the resonant controllers (PPF, IRC, . . .). In all these cases, in order to tune and optimize the control strategy, knowledge of the system dynamic behaviour is very important, and it can be achieved either by considering a numerical model of the system or through an experimental identification process. However, when dealing with non-linear or time-varying systems, a tool able to identify the system parameters online becomes a key point for the control logic synthesis. The aim of the present work is the definition of a real-time technique, based on ARMAX models, that estimates the system parameters starting from the measurements of piezoelectric sensors. These parameters are returned to the control logic, which automatically adapts itself to the system dynamics. The problem is numerically investigated considering a carbon-fiber plate model forced through a piezoelectric patch.

  13. Logic Circuits as a Vehicle for Technological Literacy.

    ERIC Educational Resources Information Center

    Hazeltine, Barrett

    1985-01-01

    Provides basic information on logic circuits, points out that the topic is a good vehicle for developing technological literacy. The subject could be included in such courses as philosophy, computer science, communications, as well as in courses dealing with electronic circuits. (JN)

  14. The Type-2 Fuzzy Logic Controller-Based Maximum Power Point Tracking Algorithm and the Quadratic Boost Converter for Pv System

    NASA Astrophysics Data System (ADS)

    Altin, Necmi

    2018-05-01

    An interval type-2 fuzzy logic controller-based maximum power point tracking algorithm and direct current-direct current (DC-DC) converter topology are proposed for photovoltaic (PV) systems. The proposed maximum power point tracking algorithm is designed based on an interval type-2 fuzzy logic controller that has an ability to handle uncertainties. The change in PV power and the change in PV voltage are determined as inputs of the proposed controller, while the change in duty cycle is determined as the output of the controller. Seven interval type-2 fuzzy sets are determined and used as membership functions for input and output variables. The quadratic boost converter provides high voltage step-up ability without any reduction in performance and stability of the system. The performance of the proposed system is validated through MATLAB/Simulink simulations. It is seen that the proposed system provides high maximum power point tracking speed and accuracy even for fast changing atmospheric conditions and high voltage step-up requirements.

  15. Systematic design of membership functions for fuzzy-logic control: A case study on one-stage partial nitritation/anammox treatment systems.

    PubMed

    Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan

    2016-10-01

    A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long term reachability of the control objectives by the fuzzy logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting the long-term influent disturbances, and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested, and showed robustness, against measurement noise levels typical for wastewater sensors. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
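The membership-function critical points referred to above are, in the common triangular case, the three breakpoints of the function; a generic sketch (the parameter values are placeholders, not derived from the paper's constrained optimization):

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with critical points a <= b <= c:
    membership rises linearly from a to the peak b, then falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# e.g. a hypothetical "high concentration" fuzzy set with placeholder
# critical points at 10, 20, and 30 mg/L
assert triangular(20.0, 10.0, 20.0, 30.0) == 1.0
assert triangular(15.0, 10.0, 20.0, 30.0) == 0.5
```

The paper's contribution is precisely that the breakpoints (a, b, c) for each input variable are derived systematically from control objectives rather than tuned by intuition.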

  16. Distinguishing between evidence and its explanations in the steering of atomic clocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, John M., E-mail: myers@seas.harvard.edu; Hadi Madjid, F., E-mail: gmadjid@aol.com

    2014-11-15

Quantum theory reflects within itself a separation of evidence from explanations. This separation leads to a known proof that: (1) no wave function can be determined uniquely by evidence, and (2) any chosen wave function requires a guess reaching beyond logic to things unforeseeable. Chosen wave functions are encoded into computer-mediated feedback essential to atomic clocks, including clocks that step computers through their phases of computation and clocks in space vehicles that supply evidence of signal propagation explained by hypotheses of spacetimes with metric tensor fields. The propagation of logical symbols from one computer to another requires a shared rhythm—like a bucket brigade. Here we show how hypothesized metric tensors, dependent on guesswork, take part in the logical synchronization by which clocks are steered in rate and position toward aiming points that satisfy phase constraints, thereby linking the physics of signal propagation with the sharing of logical symbols among computers. Recognizing the dependence of the phasing of symbol arrivals on guesses about signal propagation transports logical synchronization from the engineering of digital communications to a discipline essential to physics. Within this discipline we begin to explore questions invisible under any concept of time that fails to acknowledge unforeseeable events. In particular, variation of spacetime curvature is shown to limit the bit rate of logical communication. - Highlights: • Atomic clocks are steered in frequency toward an aiming point. • The aiming point depends on a chosen wave function. • No evidence alone can determine the wave function. • The unknowability of the wave function has implications for spacetime curvature. • Variability in spacetime curvature limits the bit rate of communications.

  17. LOGIC OF CONTROLLED THRESHOLD DEVICES.

    DTIC Science & Technology

    The synthesis of threshold logic circuits from several points of view is presented. The first approach is applicable to resistor-transistor networks...in which the outputs are tied to a common collector resistor. In general, fewer threshold logic gates than NOR gates connected to a common collector...network to realize a specified function such that the failure of any but the output gate can be compensated for by a change in the threshold level (and

  18. Temporal logics meet telerobotics

    NASA Technical Reports Server (NTRS)

    Rutten, Eric; Marce, Lionel

    1989-01-01

    The specificity of telerobotics being the presence of a human operator, decision assistance tools are necessary for the operator, especially in hostile environments. In order to reduce execution hazards due to a degraded ability for quick and efficient recovery of unexpected dangerous situations, it is of importance to have the opportunity, amongst others, to simulate the possible consequences of a plan before its actual execution, in order to detect these problematic situations. Hence the idea of providing the operator with a simulator enabling him to verify the temporal and logical coherence of his plans. Therefore, the power of logical formalisms is used for representation and deduction purposes. Starting from the class of situations that are represented, a STRIPS (the STanford Research Institute Problem Solver)-like formalism and its underlying logic are adapted to the simulation of plans of actions in time. The choice of a temporal logic enables to build a world representation, on which the effects of plans, grouping actions into control structures, will be transcribed by the simulation, resulting in a verdict and information about the plan's coherence.

  19. A Logical Design of a Session Services Control Layer of a Distributed Network Architecture for SPLICE (Stock Point Logistics Integrated Communication Environment).

    DTIC Science & Technology

    1984-06-01

Each stock point is autonomous with respect to how it implements data processing support, as long as it accommodates the Navy Supply Systems Command...has its own data elements, files, programs, transactions, users, reports, and some have additional hardware. To augment them all and not force redesign... programs are written to request session establishments among them using only logical addressing names (mailboxes) which are independent from physical

  20. Analytical design of a parasitic-loading digital speed controller for a 400-hertz turbine driven alternator

    NASA Technical Reports Server (NTRS)

    Ingle, B. D.; Ryan, J. P.

    1972-01-01

    A design for a solid-state parasitic speed controller using digital logic was analyzed. Parasitic speed controllers are used in space power electrical generating systems to control the speed of turbine-driven alternators within specified limits. The analysis included the performance characteristics of the speed controller and the generation of timing functions. The speed controller using digital logic applies step loads to the alternator. The step loads conduct for a full half wave starting at either zero or 180 electrical degrees.

  1. Organizational Politics in Schools: Micro, Macro, and Logics of Action.

    ERIC Educational Resources Information Center

    Bacharach, Samuel B.; Mundell, Bryan L.

    1993-01-01

    Develops a framework for analyzing the politics of school organizations, affirming a Weberian perspective as most appropriate. Develops "logic of action" (the implicit relationship between means and goals) as the focal point of organizational politics. Underlines the importance of analyzing interest groups and their strategies. Political…

  2. Computing single step operators of logic programming in radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (Tp: I→I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
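The single step (immediate consequence) operator Tp can be illustrated directly for a small propositional program (the example clauses below are hypothetical, and the fixed-point iteration stands in for what the paper's recurrent network computes):

```python
# A normal clause "head :- body" is encoded as (head, positive_body, negative_body).
program = [
    ("p", {"q"}, set()),   # p :- q.
    ("q", set(), set()),   # q.  (a fact)
    ("r", {"p"}, {"s"}),   # r :- p, not s.
]

def tp(interpretation: set) -> set:
    """One application of the single step operator Tp: derive every head
    whose positive body is satisfied and whose negated body is not."""
    return {head for head, pos, neg in program
            if pos <= interpretation and not (neg & interpretation)}

# Iterate Tp to its fixed point, starting from the empty interpretation.
i = set()
while tp(i) != i:
    i = tp(i)
print(sorted(i))  # -> ['p', 'q', 'r']
```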

  3. UML activity diagram swimlanes in logic controller design

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona

    2015-12-01

    Logic controller behavior can be specified using various techniques, including UML activity diagrams and control Petri nets. Each technique has its advantages and disadvantages. Application of both specification types in one project allows to take benefits from both of them. Additional elements of UML models make it possible to divide a specification into some parts, considered from other point of view (logic controller, user or system). The paper introduces an idea to use UML activity diagrams with swimlanes to increase the understandability of design models.

  4. Simulation comparison of proportional integral derivative and fuzzy logic in controlling AC-DC buck boost converter

    NASA Astrophysics Data System (ADS)

    Faisal, A.; Hasan, S.; Suherman

    2018-03-01

AC-DC converters are widely used in commercial industry and even for daily purposes. The AC-DC converter is used to convert AC voltage into DC. In order to obtain the desired output voltage, the converter usually has a controllable regulator. This paper discusses a buck boost regulator with a power MOSFET as the switching component, adjusted based on the duty cycle of pulse width modulation (PWM). The main problems of the buck boost converter at start-up are high overshoot and long peak and rise times. This paper compares the effectiveness of two control techniques, proportional integral derivative (PID) and fuzzy logic control, in controlling the buck boost converter through simulations. The results show that the PID is more sensitive to voltage change than fuzzy logic. However, PID generates higher overshoot and longer peak and rise times. On the other hand, fuzzy logic generates no overshoot and a shorter rise time.
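The discrete PID control law compared in such studies can be sketched as follows; the first-order plant model, gains, and voltages are placeholders for illustration, not the paper's converter model:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e)dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a placeholder first-order model of the output voltage toward 12 V
# from a 24 V input; all gains and time constants are illustrative only.
pid = PID(kp=0.05, ki=0.5, kd=0.0, dt=1e-3)
v = 0.0
for _ in range(2000):                               # simulate 2 s
    duty = min(max(pid.update(12.0, v), 0.0), 1.0)  # clamp duty cycle to 0..1
    v += (duty * 24.0 - v) * 1e-3 / 0.01            # tau = 10 ms placeholder plant
```

A fuzzy controller would replace the `update` method with rule evaluation over membership functions; the start-up overshoot the paper measures comes from the integral term winding up while the output is still far from the set-point.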

  5. Control of electrothermal heating during regeneration of activated carbon fiber cloth.

    PubMed

    Johnsen, David L; Mallouk, Kaitlin E; Rood, Mark J

    2011-01-15

Electrothermal swing adsorption (ESA) of organic gases generated by industrial processes can reduce atmospheric emissions and allow for reuse of recovered product. Desorption energy efficiency can be improved through control of adsorbent heating, allowing for cost-effective separation and concentration of these gases for reuse. ESA experiments with an air stream containing 2000 ppm(v) isobutane and activated carbon fiber cloth (ACFC) were performed to evaluate regeneration energy consumption. Control logic based on temperature feedback achieved select temperature and power profiles during regeneration cycles while maintaining the ACFC's mean regeneration temperature (200 °C). Energy requirements for regeneration were independent of differences in temperature/power oscillations (1186-1237 kJ/mol of isobutane). The ACFC was also heated to a ramped set-point, and the average absolute error between the actual and set-point temperatures was small (0.73%), demonstrating stable control as set-point temperatures vary, which is necessary for practical applications (e.g., higher temperatures for higher boiling point gases). Additional logic that increased the maximum power applied at lower ACFC temperatures resulted in a 36% decrease in energy consumption. Implementing such control logic improves energy efficiency for separating and concentrating organic gases for post-desorption liquefaction and reuse.

  6. Broadcasting a message in a parallel computer

    DOEpatents

    Berg, Jeremy E [Rochester, MN; Faraj, Ahmad A [Rochester, MN

    2011-08-02

Methods, systems, and products are disclosed for broadcasting a message in a parallel computer. The parallel computer includes a plurality of compute nodes connected together using a data communications network. The data communications network is optimized for point-to-point data communications and is characterized by at least two dimensions. The compute nodes are organized into at least one operational group of compute nodes for collective parallel operations of the parallel computer. One compute node of the operational group is assigned to be a logical root. Broadcasting a message in a parallel computer includes: establishing a Hamiltonian path along all of the compute nodes in at least one plane of the data communications network and in the operational group; and broadcasting, by the logical root to the remaining compute nodes, the logical root's message along the established Hamiltonian path.
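The broadcast pattern described here reduces to forwarding a message hop by hop along a precomputed node ordering; a toy sketch for a small 2-D mesh, using a snake (boustrophedon) ordering as the Hamiltonian path (an illustration, not the patent's actual path construction):

```python
# Toy rows x cols mesh: a snake ordering visits every node exactly once,
# i.e. a Hamiltonian path starting at the logical root (0, 0).
def hamiltonian_path(rows: int, cols: int):
    path = []
    for r in range(rows):
        row = [(r, c) for c in range(cols)]
        path.extend(row if r % 2 == 0 else reversed(row))
    return path

def broadcast(message: str, rows: int = 3, cols: int = 3):
    """Forward the root's message along the path, one point-to-point
    hop per edge, until every node has received it."""
    path = hamiltonian_path(rows, cols)
    received = {path[0]: message}           # the logical root originates it
    for prev, node in zip(path, path[1:]):  # bucket-brigade forwarding
        received[node] = received[prev]
    return received

out = broadcast("hello")
assert len(out) == 9 and all(v == "hello" for v in out.values())
```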

  7. The Mayan Activity: A Way of Teaching Multiple Quantifications in Logical Contexts

    ERIC Educational Resources Information Center

    Roh, Kyeong Hah; Lee, Yong Hah

    2011-01-01

    In this article, we suggest an instructional intervention to help students understand statements involving multiple quantifiers in logical contexts. We analyze students' misinterpretations of multiple quantifiers related to the epsilon-N definition of convergence and point out that they result from a lack of understanding of the significance of…

  8. Logical Aspects of Question-Answering by Computer.

    ERIC Educational Resources Information Center

    Kuhns, J. L.

    The problem of computerized question-answering is discussed in this paper from the point of view of certain technical, although elementary, notions of logic. Although the work reported herein has general application to the design of information systems, it is specifically motivated by the RAND Relational Data File. This system, for which a…

  9. Data Recording in Performance Management: Trouble With the Logics

    ERIC Educational Resources Information Center

    Groth Andersson, Signe; Denvall, Verner

    2017-01-01

    In recent years, performance management (PM) has become a buzzword in public sector organizations. Well-functioning PM systems rely on valid performance data, but critics point out that conflicting rationale or logic among professional staff in recording information can undermine the quality of the data. Based on a case study of social service…

  10. The Extra Flask

    NASA Astrophysics Data System (ADS)

    Thielemann, R.

    1981-05-01

It seemed quite natural and logical for me to want to address such an informed group about some of the many superalloys developed over the past forty years, but then I realized that most, if not all, of them are listed with their compositions and properties in the ASM Metals Handbook, so this subject did not seem promising. At this point I started to think about the many new and important developments in processing techniques that made the melting, casting, and forming of the alloys possible. It seemed to me that the new procedures were, in most every instance, just as important as the new compositions themselves. Without the new techniques, many of the higher strength compositions were difficult, if not impossible, to produce by existing procedures. Certainly, vacuum induction melting allowed us to melt and cast the titanium- and aluminum-bearing compositions without incurring the usual oxide and nitride inclusions.

  11. Bacterial Artificial Chromosome Clones of Viruses Comprising the Towne Cytomegalovirus Vaccine

    PubMed Central

    Cui, Xiaohong; Adler, Stuart P.; Davison, Andrew J.; Smith, Larry; Habib, EL-Sayed E.; McVoy, Michael A.

    2012-01-01

    Bacterial artificial chromosome (BAC) clones have proven invaluable for genetic manipulation of herpesvirus genomes. BAC cloning can also be useful for capturing representative genomes that comprise a viral stock or mixture. The Towne live attenuated cytomegalovirus vaccine was developed in the 1970s by serial passage in cultured fibroblasts. Although its safety, immunogenicity, and efficacy have been evaluated in nearly a thousand human subjects, the vaccine itself has been little studied. Instead, genetic composition and in vitro growth properties have been inferred from studies of laboratory stocks that may not always accurately represent the viruses that comprise the vaccine. Here we describe the use of BAC cloning to define the genotypic and phenotypic properties of viruses from the Towne vaccine. Given the extensive safety history of the Towne vaccine, these BACs provide a logical starting point for the development of next-generation rationally engineered cytomegalovirus vaccines. PMID:22187535

  12. Early Foundations for Mathematics Learning and Their Relations to Learning Disabilities.

    PubMed

    Geary, David C

    2013-02-01

    Children's quantitative competencies upon entry into school can have lifelong consequences. Children who start behind generally stay behind, and mathematical skills at school completion influence employment prospects and wages in adulthood. I review the current debate over whether early quantitative learning is supported by (a) an inherent system for representing approximate magnitudes, (b) an attentional-control system that enables explicit processing of quantitative symbols, such as Arabic numerals, or (c) the logical problem-solving abilities that facilitate learning of the relations among numerals. Studies of children with mathematical learning disabilities and difficulties have suggested that each of these competencies may be involved, but to different degrees and at different points in the learning process. Clarifying how and when these competencies facilitate early quantitative learning and developing interventions to address their impact on children have the potential to yield substantial benefits for individuals and for society.

  13. Bacterial artificial chromosome clones of viruses comprising the towne cytomegalovirus vaccine.

    PubMed

    Cui, Xiaohong; Adler, Stuart P; Davison, Andrew J; Smith, Larry; Habib, El-Sayed E; McVoy, Michael A

    2012-01-01

    Bacterial artificial chromosome (BAC) clones have proven invaluable for genetic manipulation of herpesvirus genomes. BAC cloning can also be useful for capturing representative genomes that comprise a viral stock or mixture. The Towne live attenuated cytomegalovirus vaccine was developed in the 1970s by serial passage in cultured fibroblasts. Although its safety, immunogenicity, and efficacy have been evaluated in nearly a thousand human subjects, the vaccine itself has been little studied. Instead, genetic composition and in vitro growth properties have been inferred from studies of laboratory stocks that may not always accurately represent the viruses that comprise the vaccine. Here we describe the use of BAC cloning to define the genotypic and phenotypic properties of viruses from the Towne vaccine. Given the extensive safety history of the Towne vaccine, these BACs provide a logical starting point for the development of next-generation rationally engineered cytomegalovirus vaccines.

  14. VSHC -- VAXstation VWS hardcopy

    NASA Astrophysics Data System (ADS)

    Huckle, H. E.; Clayton, C. A.

VSHC works by running a detached process at boot time which executes a .EXE file that creates a permanent mailbox and redefines UISPRINT_DESTINATION to that mailbox. The program then goes into an infinite loop which includes a read on that mailbox. When a hardcopy is initiated, sixel graphics commands are sent to UISPRINT_DESTINATION and thus go to the mailbox. The program then reads those graphics commands from the mailbox and interprets them into equivalent Canon commands, using a 'state machine' technique to determine how far it has got, i.e. whether this is the start of a plot, the end of a plot, the middle of a plot, the next plot, etc. It spools the file of Canon graphics commands thus created (in VSHC_SCRATCH:) to a queue pointed at by the logical name VSHC_QUEUE. UISPRINT_DESTINATION can be mysteriously reset to its default value of CSA0:, so every few minutes an AST timeout occurs to reset UISPRINT_DESTINATION.

  15. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems in which control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open-source toolbox that can be integrated into the existing MathWorks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
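    The continuation-passing-style idea can be illustrated with a toy example. This is a minimal sketch (not the authors' toolbox) of a flat two-state machine in Python, where each state is a function that hands the next state to a continuation instead of returning it:

```python
# CPS encoding of a two-state chart: each state takes an event and a
# continuation k, and passes the successor state to k.
def off(event, k):
    return k(on if event == "switch" else off)

def on(event, k):
    return k(off if event == "switch" else on)

def run(state, events):
    # Drive the machine, using the identity continuation at each step.
    for e in events:
        state = state(e, lambda s: s)
    return state
```

    Real Stateflow charts add hierarchy, junctions, and actions; the point here is only the CPS shape, in which control flow is made explicit through continuations.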

  16. Multiple-Valued Programmable Logic Array Minimization by Simulated Annealing

    DTIC Science & Technology

    1992-02-10

    time is controllable, allowing one to trade off time for minimality. It has been incorporated in the HAMLET PLA minimization tool. ...specified along the horizontal axis. Each slice represents one temperature. The slice in the very front represents the highest and starting ...rectangle with a pair of adjacent 2's in between. This function can yield five product terms by a sequence of reshape moves starting from four

  17. Design and construction of a double inversion recombination switch for heritable sequential genetic memory.

    PubMed

    Ham, Timothy S; Lee, Sung K; Keasling, Jay D; Arkin, Adam P

    2008-07-30

    Inversion recombination elements present unique opportunities for computing and information encoding in biological systems. They provide distinct binary states that are encoded into the DNA sequence itself, allowing us to overcome limitations posed by other biological memory or logic gate systems. Further, it is in theory possible to create complex sequential logics by careful positioning of recombinase recognition sites in the sequence. In this work, we describe the design and synthesis of an inversion switch using the fim and hin inversion recombination systems to create a heritable sequential memory switch. We have integrated the two inversion systems in an overlapping manner, creating a switch that can have multiple states. The switch is capable of transitioning from state to state in a manner analogous to a finite state machine, while encoding the state information into DNA. This switch does not require protein expression to maintain its state, and "remembers" its state even upon cell death. We were able to demonstrate transition into three out of the five possible states, showing the feasibility of such a switch. We demonstrate that a heritable memory system that encodes its state into DNA is possible, and that the inversion recombination system could be a starting point for more complex memory circuits. Although the circuit did not fully behave as expected, we showed that a multi-state, temporal memory is achievable.
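    The finite-state-machine analogy can be made concrete. Below is a hypothetical Python sketch; the state names and transition diagram are invented for illustration, not the paper's actual fim/hin switch topology:

```python
# Sequential DNA memory as a finite state machine: states are DNA
# configurations, inputs are recombinase inductions (diagram assumed).
TRANSITIONS = {
    ("S0", "fim"): "S1",   # first inversion
    ("S1", "hin"): "S2",   # second inversion at an overlapping site
    ("S2", "fim"): "S3",
}

def step(state, recombinase):
    # An unrecognized input leaves the DNA configuration unchanged.
    return TRANSITIONS.get((state, recombinase), state)

def run(inputs, state="S0"):
    for r in inputs:
        state = step(state, r)
    return state
```

    Because the state lives in the transition table's keys rather than in any running process, the analogy to a switch that "remembers" its state without protein expression is direct.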

  18. Design and Construction of a Double Inversion Recombination Switch for Heritable Sequential Genetic Memory

    PubMed Central

    Ham, Timothy S.; Lee, Sung K.; Keasling, Jay D.; Arkin, Adam P.

    2008-01-01

    Background Inversion recombination elements present unique opportunities for computing and information encoding in biological systems. They provide distinct binary states that are encoded into the DNA sequence itself, allowing us to overcome limitations posed by other biological memory or logic gate systems. Further, it is in theory possible to create complex sequential logics by careful positioning of recombinase recognition sites in the sequence. Methodology/Principal Findings In this work, we describe the design and synthesis of an inversion switch using the fim and hin inversion recombination systems to create a heritable sequential memory switch. We have integrated the two inversion systems in an overlapping manner, creating a switch that can have multiple states. The switch is capable of transitioning from state to state in a manner analogous to a finite state machine, while encoding the state information into DNA. This switch does not require protein expression to maintain its state, and “remembers” its state even upon cell death. We were able to demonstrate transition into three out of the five possible states, showing the feasibility of such a switch. Conclusions/Significance We demonstrate that a heritable memory system that encodes its state into DNA is possible, and that the inversion recombination system could be a starting point for more complex memory circuits. Although the circuit did not fully behave as expected, we showed that a multi-state, temporal memory is achievable. PMID:18665232

  19. Tissue Feature-Based and Segmented Deformable Image Registration for Improved Modeling of the Shear Movement of the Lungs

    PubMed Central

    Xie, Yaoqin; Chao, Ming; Xing, Lei

    2009-01-01

    Purpose To report a tissue feature-based image registration strategy with explicit inclusion of the differential motions of thoracic structures. Methods and Materials The proposed technique started with auto-identification of a number of corresponding points with distinct tissue features. The tissue feature points were found by using the scale-invariant feature transform (SIFT) method. The control point pairs were then sorted into different “colors” according to the organs in which they reside and used to model the involved organs individually. A thin-plate spline (TPS) method was used to register a structure characterized by the control points with a given “color”. The proposed technique was applied to study a digital phantom case and three lung and three liver cancer patients. Results For the phantom case, a comparison with the conventional TPS method showed that the registration accuracy was markedly improved when the differential motions of the lung and chest wall were taken into account. On average, the registration error and the standard deviation (SD) of the 15 points against the known ground truth were reduced from 3.0 mm to 0.5 mm and from 1.5 mm to 0.2 mm, respectively, when the new method was used. A similar level of improvement was achieved for the clinical cases. Conclusions The segmented deformable approach provides a natural and logical solution to model the discontinuous organ motions and greatly improves the accuracy and robustness of deformable registration. PMID:19545792
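    The "segmented" idea of fitting a separate deformation model to each organ's control points, rather than one global model, can be sketched as follows. This toy example substitutes a per-structure affine fit for the paper's thin-plate splines, with invented 2-D point data:

```python
import numpy as np

def fit_affine(src, dst):
    # Least-squares affine map dst ~= [src, 1] @ M, fitted per "color".
    X = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return M  # shape (3, 2) for 2-D points

def apply_affine(M, pts):
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M

# Two structures moving differently: the lung slides, the chest wall stays.
lung_src = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
lung_dst = lung_src + [0.0, 3.0]     # cranio-caudal shift
wall_src = lung_src + [5.0, 0.0]
wall_dst = wall_src                  # fixed

M_lung = fit_affine(lung_src, lung_dst)
M_wall = fit_affine(wall_src, wall_dst)
```

    A single global model fitted to all eight points would smear the two motions together; fitting each "color" separately recovers the discontinuity at the lung/chest-wall interface, which is the point of the segmented approach.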

  20. Checkmate: Capturing Gifted Students' Logical Thinking Using Chess.

    ERIC Educational Resources Information Center

    Rifner, Philip J.; Feldhusen, John F.

    1997-01-01

    Describes the use of chess instruction to develop abstract thinking skills and problem solving among gifted students. Offers suggestions for starting school chess programs, teaching and evaluating chess skills, and measuring the success of both student-players and the program in general. (PB)

  1. Trimming the UCERF2 hazard logic tree

    USGS Publications Warehouse

    Porter, Keith A.; Field, Edward H.; Milner, Kevin

    2012-01-01

    The Uniform California Earthquake Rupture Forecast 2 (UCERF2) is a fully time‐dependent earthquake rupture forecast developed with sponsorship of the California Earthquake Authority (Working Group on California Earthquake Probabilities [WGCEP], 2007; Field et al., 2009). UCERF2 contains 480 logic‐tree branches reflecting choices among nine modeling uncertainties in the earthquake rate model shown in Figure 1. For seismic hazard analysis, it is also necessary to choose a ground‐motion‐prediction equation (GMPE) and set its parameters. Choosing among four next‐generation attenuation (NGA) relationships results in a total of 1920 hazard calculations per site. The present work is motivated by a desire to reduce the computational effort involved in a hazard analysis without understating uncertainty. We set out to assess which branching points of the UCERF2 logic tree contribute most to overall uncertainty, and which might be safely ignored (set to only one branch) without significantly biasing results or affecting some useful measure of uncertainty. The trimmed logic tree will have all of the original choices from the branching points that contribute significantly to uncertainty, but only one arbitrarily selected choice from the branching points that do not.
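    The trimming question can be made concrete with a toy logic tree. The sketch below echoes UCERF2-style branch names, but the weights and hazard multipliers are invented; it enumerates all weighted branch combinations, then compares the variance before and after collapsing one branching point:

```python
import itertools

# Toy logic tree: each branching point lists (choice, weight) options.
branch_points = {
    "fault_model": [("FM2.1", 0.5), ("FM2.2", 0.5)],
    "mag_area":    [("EllsworthB", 0.5), ("HanksBakun", 0.5)],
    "gmpe":        [("NGA1", 0.5), ("NGA2", 0.5)],
}
# Invented multiplicative effect of each choice on the hazard value.
effect = {"FM2.1": 1.0, "FM2.2": 1.1, "EllsworthB": 1.0,
          "HanksBakun": 1.4, "NGA1": 1.0, "NGA2": 1.02}

def enumerate_branches(points):
    out = []
    for combo in itertools.product(*points.values()):
        w = h = 1.0
        for choice, cw in combo:
            w *= cw
            h *= effect[choice]
        out.append((w, h))
    return out

def mean_and_var(branches):
    mean = sum(w * h for w, h in branches)
    var = sum(w * (h - mean) ** 2 for w, h in branches)
    return mean, var

full = enumerate_branches(branch_points)
_, var_full = mean_and_var(full)

# "Trim" the gmpe branching point to a single arbitrary choice.
trimmed = dict(branch_points, gmpe=[("NGA1", 1.0)])
_, var_trim = mean_and_var(enumerate_branches(trimmed))
```

    In this toy tree the gmpe choices barely move the hazard value, so trimming that branching point loses little variance while cutting the branch count in half; that is the kind of comparison the study performs on the real 1920-branch tree.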

  2. The trend of digital control system design for nuclear power plants in Korea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, S. H.; Jung, H. Y.; Yang, C. Y.

    2006-07-01

    Currently there are 20 nuclear power plants (NPPs) in operation, and 6 more units are under construction in Korea. The control systems of those NPPs have also been developed along with the technology advancement. Control systems started with on-off control using relay logic, evolved into solid-state logic using TTL ICs, and have applied microprocessors since Yonggwang NPP Units 3 and 4, whose construction started in 1989. Multiplexers are also installed at the local plant areas to collect field input and to send output signals while communicating with the controllers located in the system cabinets near the main control room, in order to reduce the field wiring cables. The design of the digital control system technology for the NPPs in Korea has been optimized to maximize operability as well as safety through design, construction, start-up and operation experiences. The Shin-Kori Units 1 and 2 and Shin-Wolsong Units 1 and 2 NPP projects under construction are progressing at the same time. The Digital Plant Control Systems of these projects have adopted multi-loop controllers, redundant loop configuration, and a soft control system for the radwaste system. Programmable Logic Controllers (PLC) and Distributed Control Systems (DCS) are applied with the soft control system in Shin-Kori Units 3 and 4. This paper describes the evolution of control systems at the NPPs in Korea and the experience and design improvements gained through observation of the latest failures of digital control systems. In addition, the design concept and trend of the digital control systems being applied to NPPs in Korea are introduced. (authors)

  3. Generating and executing programs for a floating point single instruction multiple data instruction set architecture

    DOEpatents

    Gschwind, Michael K

    2013-04-16

    Mechanisms for generating and executing programs for a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA) are provided. A computer program product comprising a computer recordable medium having a computer readable program recorded thereon is provided. The computer readable program, when executed on a computing device, causes the computing device to receive one or more instructions and execute the one or more instructions using logic in an execution unit of the computing device. The logic implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA), based on data stored in a vector register file of the computing device. The vector register file is configured to store both scalar and floating point values as vectors having a plurality of vector elements.

  4. Programme Costing - A Logical Step Toward Improved Management.

    ERIC Educational Resources Information Center

    McDougall, Ronald N.

    The analysis of costs of university activities from a functional or program point of view, rather than an organizational unit basis, is not only an imperative for the planning and management of universities, but also a logical method of examining the costs of university operations. A task force of the Committee of Finance Officers-Universities of…

  5. Maximum power point tracking algorithm based on sliding mode and fuzzy logic for photovoltaic sources under variable environmental conditions

    NASA Astrophysics Data System (ADS)

    Atik, L.; Petit, P.; Sawicki, J. P.; Ternifi, Z. T.; Bachir, G.; Della, M.; Aillerie, M.

    2017-02-01

    Solar panels have a nonlinear voltage-current characteristic, with a distinct maximum power point (MPP), which depends on environmental factors such as temperature and irradiation. In order to continuously harvest maximum power from the solar panels, they have to operate at their MPP despite the inevitable changes in the environment. Various methods for maximum power point tracking (MPPT) have been developed and implemented in solar power electronic controllers to increase the efficiency of electricity production from renewables. In this paper we compare, using the Matlab tool Simulink, two different MPP tracking methods, fuzzy logic control (FL) and sliding mode control (SMC), considering their efficiency in solar energy production.
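    To see what any MPPT method must accomplish, consider the classic perturb-and-observe baseline (neither of the paper's two controllers; the PV curve and its MPP location are invented for illustration):

```python
def pv_power(v):
    # Toy PV power curve with a single MPP at v = 17.0 (assumed numbers).
    return max(0.0, 60.0 - (v - 17.0) ** 2)

def perturb_and_observe(v=10.0, dv=0.1, steps=200):
    # Nudge the operating voltage; if power drops, reverse direction.
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(steps):
        v += direction * dv
        p = pv_power(v)
        if p < p_prev:
            direction = -direction
        p_prev = p
    return v
```

    The controller climbs the power curve and then oscillates around the MPP with amplitude set by the step size; fuzzy-logic and sliding-mode controllers aim to reduce exactly this oscillation and to track faster under changing irradiation.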

  6. d-Neighborhood system and generalized F-contraction in dislocated metric space.

    PubMed

    Kumari, P Sumati; Zoto, Kastriot; Panthi, Dinesh

    2015-01-01

    This paper gives an answer to Question 1.1 posed by Hitzler (Generalized metrics and topology in logic programming semantics, 2001) by means of "topological aspects of d-metric space with d-neighborhood system". We have investigated the topological aspects of a d-neighborhood system obtained from a dislocated metric space (simply, d-metric space), which has useful applications in the semantic analysis of logic programming. Furthermore, we have generalized the notion of F-contraction in the view of d-metric spaces and investigated the uniqueness of fixed points and coincidence points of such mappings.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single-step operator of a logic program is defined as a function T_P : I → I. Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single-step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single-step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
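    The single-step operator is easy to state concretely. Here is a minimal sketch for a propositional (ground) definite program, independent of the paper's neural-network encoding:

```python
# T_P(I) = { head | (head :- body) is a clause of P and body is a
# subset of I }, for a program given as (head, body) pairs.
def t_p(program, interp):
    return {head for head, body in program if body <= interp}

def least_fixed_point(program):
    # For a definite program, iterating T_P from the empty
    # interpretation climbs monotonically to the least fixed point.
    interp = set()
    while (nxt := t_p(program, interp)) != interp:
        interp = nxt
    return interp

# Clauses:  a.   b :- a.   c :- a, b.   d :- e.
prog = [("a", set()), ("b", {"a"}), ("c", {"a", "b"}), ("d", {"e"})]
```

    The fixed point the recurrent network is trained to reach is exactly this set: the atoms derivable from the program.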

  8. A programmable CCD driver circuit for multiphase CCD operation

    NASA Technical Reports Server (NTRS)

    Ewin, Audrey J.; Reed, Kenneth V.

    1989-01-01

    A programmable CCD (charge-coupled device) driver circuit was designed to drive CCDs in multiphased modes. The purpose of the drive electronics is to operate developmental CCD imaging arrays for NASA's tiltable moderate resolution imaging spectrometer (MODIS-T). Several objectives for the driver were considered during its design: (1) the circuit drives CCD electrode voltages between 0 V and +30 V to produce reasonable potential wells, (2) the driving sequence is started with one input signal, (3) the circuit allows programming of frame sequences required by arrays of any size, and (4) it produces interfacing signals for the CCD and the DTF (detector test facility). Simulation of the driver verified its function with the master clock running at up to 10 MHz. This suggests a maximum rate of 400,000 pixels/s. Timing and packaging parameters were verified. The design uses 54 TTL (transistor-transistor logic) chips. Two versions of hardware were fabricated: wirewrap and printed circuit board. Both were verified functionally with a logic analyzer.

  9. How to prepare and deliver a scientific presentation. Teaching course presentation at the 21st European Stroke Conference, Lisboa, May 2012.

    PubMed

    Alexandrov, Andrei V; Hennerici, Michael G

    2013-01-01

    A scientific presentation is a professional way to share your observation, introduce a hypothesis, demonstrate and interpret the results of a study, or summarize what is learned or to be studied on the subject. PRESENTATION METHODS: Commonly, presentations at major conferences include podium (oral, platform), poster or lecture; if selected, one should be prepared to Plan from the start (place integral parts of the presentation in logical sequence); Reduce the amount of text and visual aids to the bare minimum; Elucidate (clarify) methods; Summarize results and key messages; Effectively deliver; Note all shortcomings, and Transform your own and the current thinking of others. We provide tips on how to achieve this. PRESENTATION RESULTS: After disclosing conflicts, if applicable, start with a brief summary of what is known and why it is required to investigate the subject. State the research question or the purpose of the lecture. For original presentations follow a structure: Introduction, Methods, Results, Conclusions. Invest a sufficient amount of time or poster space in describing the study methods. Clearly organize and deliver the results or synopsis of relevant studies. Include absolute numbers and simple statistics before showing advanced analyses. Remember to present one point at a time. Stay focused. Discuss study limitations. In a lecture or a podium talk, or when standing by your poster, always think clearly, have a logical plan, gain audience attention, make them interested in your subject, excite their own thinking about the problem, listen to questions and carefully weigh the evidence that would justify the punch-line. Rank scientific evidence in your presentation appropriately. What may seem obvious may turn out erroneous or more complex. Rehearse your presentation before you deliver it at a conference. Challenge yourself to dry runs with your most critically thinking colleagues. When the time comes, ace it with a clear mind, precise execution and fund of knowledge. Copyright © 2013 S. Karger AG, Basel.

  10. Verification and Planning Based on Coinductive Logic Programming

    NASA Technical Reports Server (NTRS)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations, and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, and (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution as well as preliminary implementations of the planning and verification applications have been developed [4]. Co-LP and Model Checking: The vast majority of properties to be verified can be classified into safety properties and liveness properties. It is well known within model checking that safety properties can be verified by reachability analysis, i.e., if a counter-example to the property exists, it can be finitely determined by enumerating all the reachable states of the Kripke structure.
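    The "succeeds if it unifies with an ancestor call" rule is simple to demonstrate in the propositional case, where unification degenerates to equality. A minimal sketch (not a real co-LP system, which also handles rational infinite terms):

```python
# Propositional co-SLD: a goal succeeds if it matches an ancestor call
# (the coinductive hypothesis rule) or if some clause for it succeeds.
def co_solve(program, goal, ancestors=frozenset()):
    if goal in ancestors:
        return True
    for head, body in program:
        if head == goal and all(co_solve(program, g, ancestors | {goal})
                                for g in body):
            return True
    return False

# p :- p.   loops forever under SLD but succeeds under co-SLD.
# q :- r.   fails either way, since r has no clause and no ancestor.
prog = [("p", ("p",)), ("q", ("r",))]
```

    The ancestor check is what turns the infinite SLD derivation for p into a finite greatest-fixed-point argument: the cyclic proof is accepted rather than rejected.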

  11. An Energy Saving Green Plug Device for Nonlinear Loads

    NASA Astrophysics Data System (ADS)

    Bloul, Albe; Sharaf, Adel; El-Hawary, Mohamed

    2018-03-01

    The paper presents a low-cost, FACTS-based, flexible fuzzy-logic-controlled modulated/switched tuned-arm filter and Green Plug compensation (SFC-GP) scheme for single-phase nonlinear loads, ensuring both voltage stabilization and efficient energy utilization. The new Green Plug switched filter compensator (SFC), a modulated LC-filter PWM switched capacitive compensation device, is controlled using a fuzzy logic regulator to enhance power quality, improve the power factor at the source, and reduce switching transients, inrush current conditions, and harmonic content in the source current. The FACTS-based SFC-GP device is a member of a family of Green Plug filter/compensation schemes used for efficient energy utilization, power quality enhancement, and voltage/inrush current/soft-starting control using a dynamic error-driven fuzzy logic controller (FLC). The device with the fuzzy logic controller is validated in the Matlab/Simulink software environment for enhanced power quality (PQ), improved power factor, and reduced inrush currents. This is achieved using modulated PWM switching of the filter-capacitive compensation scheme to cope with dynamic nonlinear and cyclical inrush loads.

  12. Fuzzy logic control system to provide autonomous collision avoidance for Mars rover vehicle

    NASA Technical Reports Server (NTRS)

    Murphy, Michael G.

    1990-01-01

    NASA is currently involved with planning unmanned missions to Mars to investigate the terrain and process soil samples in advance of a manned mission. A key issue involved in unmanned surface exploration on Mars is that of supporting autonomous maneuvering since radio communication involves lengthy delays. It is anticipated that specific target locations will be designated for sample gathering. In maneuvering autonomously from a starting position to a target position, the rover will need to avoid a variety of obstacles such as boulders or troughs that may block the shortest path to the target. The physical integrity of the rover needs to be maintained while minimizing the time and distance required to attain the target position. Fuzzy logic lends itself well to building reliable control systems that function in the presence of uncertainty or ambiguity. The following major issues are discussed: (1) the nature of fuzzy logic control systems and software tools to implement them; (2) collision avoidance in the presence of fuzzy parameters; and (3) techniques for adaptation in fuzzy logic control systems.
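    The core of a fuzzy controller of the kind discussed, fuzzification, rule evaluation, and defuzzification, fits in a few lines. In this toy sketch the membership functions and the two-rule base are invented for illustration, not taken from the rover design:

```python
def mu_near(d):
    # Obstacle is fully "near" at 0 m and not "near" at all beyond 2 m.
    return max(0.0, min(1.0, (2.0 - d) / 2.0))

def mu_far(d):
    return 1.0 - mu_near(d)

def steering(distance):
    # Rule base: IF near THEN turn hard (1.0); IF far THEN go straight (0.0).
    # Weighted-average defuzzification blends the two rule outputs.
    w_near, w_far = mu_near(distance), mu_far(distance)
    return (w_near * 1.0 + w_far * 0.0) / (w_near + w_far)
```

    The graded memberships are what let the controller respond smoothly to ambiguous sensor readings, rather than switching abruptly at a crisp distance threshold; that is the property the abstract credits fuzzy logic with.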

  13. Spatial analysis of groundwater levels using Fuzzy Logic and geostatistical tools

    NASA Astrophysics Data System (ADS)

    Theodoridou, P. G.; Varouchakis, E. A.; Karatzas, G. P.

    2017-12-01

    The spatial variability evaluation of the water table of an aquifer provides useful information in water resources management plans. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques, the selection of the optimal variogram is very important for optimal method performance. This work compares three different criteria to assess the theoretical variogram that fits the experimental one: the Least Squares Sum method, the Akaike Information Criterion and Cressie's Indicator. Moreover, variable distance metrics such as the Euclidean, Minkowski, Manhattan, Canberra and Bray-Curtis are applied to calculate the distance between the observation and the prediction points, which affects both the variogram calculation and the Kriging estimator. A Fuzzy Logic System is then applied to define the appropriate neighbors for each estimation point used in the Kriging algorithm. The two criteria used during the Fuzzy Logic process are the distance between observation and estimation points and the groundwater level value at each observation point. The proposed techniques are applied to a data set of 250 hydraulic head measurements distributed over an alluvial aquifer. The analysis showed that the Power-law variogram model and the Manhattan distance metric within ordinary Kriging provide the best results when the comprehensive geostatistical analysis process is applied. On the other hand, the Fuzzy Logic approach leads to a Gaussian variogram model and significantly improves the estimation performance. The two different variogram models can be explained in terms of a fractional Brownian motion approach and of aquifer behavior at the local scale. Finally, maps of hydraulic head spatial variability and of prediction uncertainty are constructed for the area with the two different approaches, comparing their advantages and drawbacks.
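    The experimental variogram and a power-law fit, the first steps of the workflow described, can be sketched with numpy. This is a minimal illustration on synthetic 1-D data with ordinary least squares in log space, not the paper's three fitting criteria:

```python
import numpy as np

def experimental_variogram(coords, values, bin_edges):
    # gamma(h) = mean of 0.5 * (z_i - z_j)^2 over point pairs whose
    # separation distance falls in each lag bin.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)
    lags, gammas = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d[i, j] >= lo) & (d[i, j] < hi)
        if mask.any():
            lags.append(d[i, j][mask].mean())
            gammas.append(sq[i, j][mask].mean())
    return np.array(lags), np.array(gammas)

def fit_power_law(lags, gammas):
    # gamma(h) = c * h**a  =>  log gamma = log c + a * log h
    A = np.vstack([np.ones_like(lags), np.log(lags)]).T
    (logc, a), *_ = np.linalg.lstsq(A, np.log(gammas), rcond=None)
    return np.exp(logc), a

# Synthetic transect with a linear trend: gamma(h) = h**2 / 2 exactly.
coords = np.arange(20.0)[:, None]
values = coords[:, 0]
lags, gammas = experimental_variogram(
    coords, values, np.array([0.5, 1.5, 2.5, 3.5, 4.5]))
c, a = fit_power_law(lags, gammas)
```

    A fitted exponent near 2 is the fractional-Brownian-motion-like, trend-dominated behavior that the power-law model captures; a Gaussian model would instead level off at a sill.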

  14. Mechanics as the Logical Point of Entry for the Enculturation into Scientific Thinking

    ERIC Educational Resources Information Center

    Carson, Robert; Rowlands, Stuart

    2005-01-01

    Force in modern classical mechanics is unique, both in terms of its logical character and the conceptual difficulties it causes. Force is well defined by a set of axioms that not only structures mechanics but science in general. Force is also the dominant theme in the "misconceptions" literature and many philosophers and physicists alike have…

  15. Beyond the Poverty of National Security: Toward a Critical Human Security Perspective in Educational Policy

    ERIC Educational Resources Information Center

    Means, Alexander J.

    2014-01-01

    This article examines the intersecting logics of human capital and national security underpinning the corporate school reform movement in the United States. Taking a 2012 policy report by the Council on Foreign Relations as an entry point, it suggests that these logics are incoherent not only on their own narrow instrumental terms, but also more…

  16. Lattice Theory, Measures and Probability

    NASA Astrophysics Data System (ADS)

    Knuth, Kevin H.

    2007-11-01

    In this tutorial, I will discuss the concepts behind generalizing ordering to measuring and apply these ideas to the derivation of probability theory. The fundamental concept is that anything that can be ordered can be measured. Since we are in the business of making statements about the world around us, we focus on ordering logical statements according to implication. This results in a Boolean lattice, which is related to the fact that the corresponding logical operations form a Boolean algebra. The concept of logical implication can be generalized to degrees of implication by generalizing the zeta function of the lattice. The rules of probability theory arise naturally as a set of constraint equations. Through this construction we are able to neatly connect the concepts of order, structure, algebra, and calculus. The meaning of probability is inherited from the meaning of the ordering relation, implication, rather than being imposed in an ad hoc manner at the start.
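    The tutorial's central move can be written compactly. A sketch following the abstract's wording (the symbol z for the generalized bivaluation is an assumption):

```latex
% Zeta function of the Boolean lattice: encodes whether x implies y.
\zeta(x, y) =
  \begin{cases}
    1 & \text{if } x \le y \quad (\text{$x$ implies $y$}) \\
    0 & \text{otherwise}
  \end{cases}
% Generalizing to degrees of implication relaxes this to
% z(x \mid y) \in [0, 1]; requiring consistency with the lattice
% operations then yields the sum rule as one of the constraint equations:
z(x \vee y \mid t) = z(x \mid t) + z(y \mid t) - z(x \wedge y \mid t)
```

    Read this way, the probability calculus is not postulated but derived: the sum rule is the constraint that degrees of implication respect the lattice join, and its meaning is inherited from the ordering relation.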

  17. Fractal dimension and fuzzy logic systems for broken rotor bar detection in induction motors at start-up and steady-state regimes

    NASA Astrophysics Data System (ADS)

    Amezquita-Sanchez, Juan P.; Valtierra-Rodriguez, Martin; Perez-Ramirez, Carlos A.; Camarena-Martinez, David; Garcia-Perez, Arturo; Romero-Troncoso, Rene J.

    2017-07-01

    Squirrel-cage induction motors (SCIMs) are key machines in many industrial applications. In this regard, monitoring their operating condition to avoid damage and reduce economic losses has become a demanding task for industry. In the literature, several techniques and methodologies to detect faults that affect the integrity and performance of SCIMs have been proposed. However, they have only been focused on analyzing either the start-up transient or the steady-state operation regime, two common operating scenarios in real practice. In this work, a novel methodology for broken rotor bar (BRB) detection in SCIMs during both start-up and steady-state operation regimes is proposed. It consists of two main steps. In the first one, the analysis of three-axis vibration signals using fractal dimension (FD) theory is carried out. Since different FD-based algorithms can give different results, three algorithms, named Katz’s FD, Higuchi’s FD, and box dimension, are tested. In the second step, a fuzzy logic system for each regime is presented for automatic diagnosis. To validate the proposal, motors with different damage levels have been tested: one with a partially broken rotor bar, a second with one fully broken rotor bar, and a third with two BRBs. The obtained results demonstrate the effectiveness of the proposal.
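    Of the three algorithms named, Katz's is the shortest to state. Here is a sketch of the standard formulation (not the authors' exact implementation):

```python
import math

def katz_fd(signal):
    # Katz fractal dimension of a uniformly sampled 1-D waveform:
    # L = total curve length, d = max distance from the first sample,
    # n = number of steps; FD = log10(n) / (log10(n) + log10(d / L)).
    n = len(signal) - 1
    L = sum(math.hypot(1.0, signal[k + 1] - signal[k]) for k in range(n))
    d = max(math.hypot(k, signal[k] - signal[0]) for k in range(1, n + 1))
    return math.log10(n) / (math.log10(n) + math.log10(d / L))
```

    A straight line has FD exactly 1; waveform irregularity, such as the sidebands a broken bar introduces into the vibration signal, pushes the value above 1, which is what the downstream fuzzy system thresholds on.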

  18. Defining, illustrating and reflecting on logic analysis with an example from a professional development program.

    PubMed

    Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole

    2013-10-01

    Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. [New horizons in medicine. The application of "fuzzy logic" in clinical and experimental medicine].

    PubMed

    Guarini, G

    1994-06-01

    In medicine, the study of physiological and physiopathological problems is generally programmed by elaborating models which respond to the principles of formal logic. This has the advantage of favouring the transformation of the formal model into a mathematical model of reference which responds to the principles of set theory. All this reflects the utopian wish to obtain from each investigation a clear-cut answer, whether positive or negative, according to the Aristotelian principle of tertium non datur. Taking this into consideration, the author briefly traces the principles of modal logic and, in particular, those of fuzzy logic, proposing that the latter's current definition of "logic with more truth values" be replaced with the perhaps more pertinent "logic of conditioned possibilities". After a brief synthesis of the state of the art of the application of fuzzy logic, the author reports an example of the graphic expression of fuzzy logic by demonstrating how the basic glycemic data (expressed by the vectors' magnitude) revealed in a sample of healthy individuals constituted, on the whole, an unbroken continuous stream of partial sets. The author calls attention to fuzzy logic as a useful instrument for elaborating in a new way scenario analyses suited to acquiring the information necessary to single out the critical points which characterize the potential development of any biological phenomenon.

  20. Decreased body mass index in the preclinical stage of autosomal dominant Alzheimer's disease.

    PubMed

    Müller, Stephan; Preische, Oliver; Sohrabi, Hamid R; Gräber, Susanne; Jucker, Mathias; Dietzsch, Janko; Ringman, John M; Martins, Ralph N; McDade, Eric; Schofield, Peter R; Ghetti, Bernardino; Rossor, Martin; Graff-Radford, Neill R; Levin, Johannes; Galasko, Douglas; Quaid, Kimberly A; Salloway, Stephen; Xiong, Chengjie; Benzinger, Tammie; Buckles, Virginia; Masters, Colin L; Sperling, Reisa; Bateman, Randall J; Morris, John C; Laske, Christoph

    2017-04-27

    The relationship between body-mass index (BMI) and Alzheimer's disease (AD) has been extensively investigated. However, BMI alterations in preclinical individuals with autosomal dominant AD (ADAD) have not yet been investigated. We analyzed cross-sectional data from 230 asymptomatic members of families with ADAD participating in the Dominantly Inherited Alzheimer Network (DIAN) study, including 120 preclinical mutation carriers (MCs) and 110 asymptomatic non-carriers (NCs). Differences in BMI and their relation to cerebral amyloid load and episodic memory as a function of estimated years to symptom onset (EYO) were analyzed. Preclinical MCs showed significantly lower BMIs compared to NCs, starting 11.2 years before expected symptom onset. However, the BMI curves began to diverge as early as 17.8 years before expected symptom onset. Lower BMI in preclinical MCs was significantly associated with fewer years before estimated symptom onset, higher global Aβ brain burden, and lower delayed total recall scores in the logical memory test. The study provides cross-sectional evidence that weight loss starts one to two decades before expected symptom onset of ADAD. Our findings point toward a link between the pathophysiology of ADAD and disturbance of weight control mechanisms. Longitudinal follow-up studies are warranted to investigate BMI changes over time.

  1. Maximum power point tracking techniques for wind energy systems using three levels boost converter

    NASA Astrophysics Data System (ADS)

    Tran, Cuong Hung; Nollet, Frédéric; Essounbouli, Najib; Hamzaoui, Abdelaziz

    2018-05-01

    This paper presents the modeling and simulation of a three-level Boost DC-DC converter in a Wind Energy Conversion System (WECS). The three-level Boost converter has significant advantages compared to the conventional Boost converter. A maximum power point tracking (MPPT) method for a variable-speed wind turbine using a permanent magnet synchronous generator (PMSG) is also presented. Simulation of the three-level Boost converter topology with the Perturb and Observe algorithm and Fuzzy Logic Control is implemented in MATLAB/SIMULINK. Results of this simulation show that the system with MPPT using the fuzzy logic controller has better performance than the Perturb and Observe algorithm: fast response under changing conditions and small oscillation.
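    The Perturb and Observe baseline against which the fuzzy controller is compared follows a simple hill-climbing rule, sketched below (an illustrative implementation, not the paper's code; the function name and fixed step size are ours):

```python
def perturb_and_observe(p_now, p_prev, v_now, v_prev, step=0.5):
    """One iteration of the Perturb & Observe MPPT rule: keep perturbing the
    operating voltage in the direction that increased output power, and
    reverse the perturbation direction otherwise."""
    dp = p_now - p_prev
    dv = v_now - v_prev
    if dp == 0:
        return v_now                      # at (or oscillating around) the MPP
    if (dp > 0) == (dv > 0):              # power rose in the direction we moved
        return v_now + step
    return v_now - step
```

    On a single-peak power curve the rule climbs to the maximum and then oscillates around it with amplitude set by `step`; that residual oscillation is exactly what the adaptive fuzzy controller is reported to reduce.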

  2. Moral Particularism and Deontic Logic

    NASA Astrophysics Data System (ADS)

    Parent, Xavier

    The aim of this paper is to strengthen the point made by Horty about the relationship between reason holism and moral particularism. In the literature, prima facie obligations have been considered the only source of reason holism. I strengthen Horty's point in two ways. First, I show that contrary-to-duty obligations provide another, independent, source of reason holism. Next, I outline a formal theory that is able to capture these two sources of holism. While in simple settings the proposed account coincides with Horty's, this is not true in more complicated or "realistic" settings in which more than two norms collide. My chosen formalism is so-called input/output logic.

  3. Fuzzy logic control of stand-alone photovoltaic system with battery storage

    NASA Astrophysics Data System (ADS)

    Lalouni, S.; Rekioua, D.; Rekioua, T.; Matagne, E.

    Photovoltaic energy nowadays has an increased importance in electrical power applications, since it is considered an essentially inexhaustible and broadly available energy resource. However, the output power provided by the photovoltaic conversion process depends on solar irradiation and temperature. Therefore, to maximize the efficiency of the photovoltaic energy system, it is necessary to track the maximum power point of the PV array. The present paper proposes a maximum power point tracker (MPPT) method, based on a fuzzy logic controller (FLC), applied to a stand-alone photovoltaic system. It uses sampled measurements of the PV array power and voltage and then determines the optimal increment required to reach the optimal operating voltage which permits maximum power tracking. This method achieves higher accuracy around the optimum point than the conventional one. The stand-alone photovoltaic system used in this paper includes two bi-directional DC/DC converters and a lead-acid battery bank to bridge periods of scarce solar irradiation. One converter works as an MPP tracker, while the other regulates the batteries' state of charge and compensates the power deficit to provide a continuous delivery of energy to the load. The obtained simulation results show the effectiveness of the proposed fuzzy logic controller.
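    The idea of an FLC-determined optimal increment can be sketched in a much-simplified Sugeno-style form with three singleton rules; this illustrates the principle only and is not the authors' controller (names, the slope universe, and the rule base are our assumptions):

```python
def fuzzy_mppt_increment(slope, max_step=1.0):
    """Sugeno-style single-input MPPT rule base (illustrative sketch).

    Input: slope = dP/dV at the present operating point.
    Rules: slope positive -> raise V; slope negative -> lower V;
           slope near zero -> hold.
    The crisp increment is the membership-weighted average of the three
    singleton consequents (+max_step, -max_step, 0)."""
    s = max(-10.0, min(10.0, slope))   # clamp dP/dV to the universe [-10, 10]
    pos = max(0.0, s / 10.0)           # membership of "slope is positive"
    neg = max(0.0, -s / 10.0)          # membership of "slope is negative"
    zero = 1.0 - pos - neg             # membership of "slope is near zero"
    return pos * max_step + neg * (-max_step) + zero * 0.0
```

    Because the increment shrinks as dP/dV approaches zero, the operating point settles near the peak with far less oscillation than a fixed-step tracker, which is the behaviour the abstract attributes to the FLC.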

  4. Fuzzy Logic for Incidence Geometry

    PubMed Central

    2016-01-01

    The paper presents a mathematical framework for approximate geometric reasoning with extended objects in the context of Geography, in which all entities and their relationships are described in human language. These entities could be labelled by commonly used names of landmarks, water areas, and so forth. Unlike single points that are given in Cartesian coordinates, these geographic entities are extended in space and often loosely defined, but people easily perform spatial reasoning with extended geographic objects "as if they were points." Unfortunately, to date, geographic information systems (GIS) lack the capability of geometric reasoning with extended objects. The aim of the paper is to present a mathematical apparatus for approximate geometric reasoning with extended objects that is usable in GIS. In the paper we discuss the fuzzy logic (Aliev and Tserkovny, 2011) as a reasoning system for the geometry of extended objects, as well as a basis for fuzzification of the axioms of incidence geometry. The same fuzzy logic was used for fuzzification of Euclid's first postulate. A fuzzy equivalence relation, "extended lines sameness", is introduced. For its approximation we also utilize a fuzzy conditional inference, which is based on the proposed fuzzy "degree of indiscernibility" and "discernibility measure" of extended points. PMID:27689133

  5. Gallia Est Omnis Divisa in Partes Tres (All Gaul Is Divided into Three Parts).

    ERIC Educational Resources Information Center

    Seligson, Gerda

    1979-01-01

    Stresses the need for Latin instruction in the school curriculum today. The history of Latin instruction in the U.S. is traced starting from the time that writing Latin and analyzing texts in terms of grammatical, logical, and compositional categories were emphasized. (NCR)

  6. Integrated payload and mission planning, phase 3. Volume 2: Logic/Methodology for preliminary grouping of spacelab and mixed cargo payloads

    NASA Technical Reports Server (NTRS)

    Rodgers, T. E.; Johnson, J. F.

    1977-01-01

    The logic and methodology for a preliminary grouping of Spacelab and mixed-cargo payloads is proposed in a form that can be readily coded into a computer program by NASA. The logic developed for this preliminary cargo grouping analysis is summarized. Principal input data include the NASA Payload Model, payload descriptive data, Orbiter and Spacelab capabilities, and NASA guidelines and constraints. The first step in the process is a launch interval selection in which the time interval for payload grouping is identified. Logic flow steps are then taken to group payloads and define flight configurations based on criteria that include dedication, volume, area, orbital parameters, pointing, g-level, mass, center of gravity, energy, power, and crew time.

  7. A DNA Logic Gate Automaton for Detection of Rabies and Other Lyssaviruses.

    PubMed

    Vijayakumar, Pavithra; Macdonald, Joanne

    2017-07-05

    Immediate activation of biosensors is not always desirable, particularly if activation is due to non-specific interactions. Here we demonstrate the use of deoxyribozyme-based logic gate networks arranged into visual displays to precisely control activation of biosensors, and demonstrate a prototype molecular automaton able to discriminate between seven different genotypes of Lyssaviruses, including Rabies virus. The device uses novel mixed-base logic gates to enable detection of the large diversity of Lyssavirus sequence populations, while an ANDNOT logic gate prevents non-specific activation across genotypes. The resultant device provides a user-friendly digital-like, but molecule-powered, dot-matrix text output for unequivocal results read-out that is highly relevant for point of care applications. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
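    The ANDNOT gate that prevents cross-genotype activation has the following Boolean behaviour; this is an abstract software model only (the paper realises its gates with deoxyribozymes), and the dot-display framing below is our illustrative assumption:

```python
def and_not(a: bool, b: bool) -> bool:
    """ANDNOT gate: fires on input a unless the inhibitory input b is present."""
    return a and not b

def display_dot(genotype_probe: bool, cross_reactive_probe: bool) -> bool:
    """Toy model of one dot in a dot-matrix read-out: the dot lights up only
    when the genotype-specific probe binds AND no cross-reactive genotype
    signal is detected (hypothetical names, not the paper's chemistry)."""
    return and_not(genotype_probe, cross_reactive_probe)
```

    In this framing, networks of such gates gate each display element so that non-specific interactions alone can never switch a dot on.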

  8. Beyond the Schoolyard: The Role of Parenting Logics, Financial Resources, and Social Institutions in the Social Class Gap in Structured Activity Participation

    ERIC Educational Resources Information Center

    Bennett, Pamela R.; Lutz, Amy C.; Jayaram, Lakshmi

    2012-01-01

    We investigate class differences in youth activity participation with interview, survey, and archival data from a diverse sample of parents (n = 51) in two schools. Findings point toward structural rather than cultural explanations. Working- and middle-class parents overlap in parenting logics about participation, though differ in one respect:…

  9. A Study on the Learning Processes in Discrimination Shift Learning of Children with Mental Retardation: From the Point of Developmental View of "Logical Manipulation by Classification."

    ERIC Educational Resources Information Center

    Kanno, Atsushi

    1989-01-01

    The study was designed to investigate the learning processes in discrimination shift learning, in terms of developmental views of "logical manipulation by classification." Tasks comparing sizes of intradimensional value-classes and comparing sizes of interdimensional value-classes were devised in order to measure subjects' levels of…

  10. Boolean network identification from perturbation time series data combining dynamics abstraction and logic programming.

    PubMed

    Ostrowski, M; Paulevé, L; Schaub, T; Siegel, A; Guziolowski, C

    2016-11-01

    Boolean networks (and more general logic models) are useful frameworks to study signal transduction across multiple pathways. Logic models can be learned from a prior knowledge network structure and multiplex phosphoproteomics data. However, most efficient and scalable training methods focus on the comparison of two time-points and assume that the system has reached an early steady state. In this paper, we generalize such a learning procedure to take into account the time series traces of phosphoproteomics data in order to discriminate Boolean networks according to their transient dynamics. To that end, we identify a necessary condition that must be satisfied by the dynamics of a Boolean network to be consistent with a discretized time series trace. Based on this condition, we use Answer Set Programming to compute an over-approximation of the set of Boolean networks which fit best with experimental data, and we provide the corresponding encodings. Combined with model-checking approaches, we end up with a global learning algorithm. Our approach is able to learn logic models with a true positive rate higher than 78% in two case studies of mammalian signaling networks; for a larger case study, our method provides optimal answers after 7 min of computation. We quantified the gain in prediction precision of our method compared to learning approaches based on static data. Finally, as an application, our method proposes erroneous time-points in the time series data with respect to the optimal learned logic models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
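    The consistency requirement between a Boolean network's dynamics and a discretized trace can be sketched as follows, here under simple synchronous-update semantics for illustration only (the paper encodes such conditions in Answer Set Programming and reasons over more general dynamics; all names are ours):

```python
def trajectory(update_fns, state, steps):
    """Synchronous trace of a Boolean network: at each step, every node
    applies its update function to the current global state."""
    trace = [tuple(state)]
    for _ in range(steps):
        state = tuple(f(state) for f in update_fns)
        trace.append(state)
    return trace

def consistent(update_fns, observed):
    """Synchronous-case version of the necessary condition: the trace
    generated from the first observed state must reproduce the whole
    discretized time series."""
    observed = [tuple(s) for s in observed]
    return trajectory(update_fns, observed[0], len(observed) - 1) == observed
```

    A learner can use such a check to discard candidate networks whose transient dynamics cannot have produced the measured phosphoproteomics trace.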

  11. High-NA optical CD metrology on small in-cell targets enabling improved higher order dose control and process control for logic

    NASA Astrophysics Data System (ADS)

    Cramer, Hugo; Mc Namara, Elliott; van Laarhoven, Rik; Jaganatharaja, Ram; de la Fuente, Isabel; Hsu, Sharon; Belletti, Filippo; Popadic, Milos; Tu, Ward; Huang, Wade

    2017-03-01

    The logic manufacturing process requires small in-device metrology targets to exploit the full dose correction potential of modern scanners and process tools. A high-NA angular resolved scatterometer (YieldStar S-1250D) was modified to demonstrate the possibility of OCD measurements on 5×5 µm² targets. The results obtained on test wafers in a logic manufacturing environment, measured after litho and after core etch, showed a good correlation to larger reference targets and AEI to ADI intra-field CDU correlation, thereby demonstrating the feasibility of OCD on such small targets. The data was used to determine a reduction potential of 55% for the intra-field CD variation, using 145 points per field on a few inner fields, and 33% of the process-induced across-wafer CD variation using 16 points per field full wafer. In addition, the OCD measurements reveal valuable information on wafer-to-wafer layer height variations within a lot.

  12. Fuzzy logic control of telerobot manipulators

    NASA Technical Reports Server (NTRS)

    Franke, Ernest A.; Nedungadi, Ashok

    1992-01-01

    Telerobot systems for advanced applications will require manipulators with redundant 'degrees of freedom' (DOF) that are capable of adapting manipulator configurations to avoid obstacles while achieving the user specified goal. Conventional methods for control of manipulators (based on solution of the inverse kinematics) cannot be easily extended to these situations. Fuzzy logic control offers a possible solution to these needs. A current research program at SRI developed a fuzzy logic controller for a redundant, 4 DOF, planar manipulator. The manipulator end point trajectory can be specified by either a computer program (robot mode) or by manual input (teleoperator). The approach used expresses end-point error and the location of manipulator joints as fuzzy variables. Joint motions are determined by a fuzzy rule set without requiring solution of the inverse kinematics. Additional rules for sensor data, obstacle avoidance and preferred manipulator configuration, e.g., 'righty' or 'lefty', are easily accommodated. The procedure used to generate the fuzzy rules can be extended to higher DOF systems.

  13. Single axis control of ball position in magnetic levitation system using fuzzy logic control

    NASA Astrophysics Data System (ADS)

    Sahoo, Narayan; Tripathy, Ashis; Sharma, Priyaranjan

    2018-03-01

    This paper presents the design and real-time implementation of fuzzy logic control (FLC) for the control of the position of a ferromagnetic ball by manipulating the current flowing in an electromagnet that changes the magnetic field acting on the ball. This system is highly nonlinear and open-loop unstable. Many unmeasurable disturbances also act on the system, making its control highly complex but interesting for any researcher in the control system domain. First, the system is modelled using the fundamental laws, which yields a nonlinear equation. The nonlinear model is then linearized at an operating point. The fuzzy logic controller is designed after studying the system in closed loop under PID control action. The controller is then implemented in real time using the Simulink real-time environment. The controller is tuned manually to get a stable and robust performance. The set point tracking performance of the FLC and PID controllers was compared and analyzed.

  14. Electronics for CMS Endcap Muon Level-1 Trigger System Phase-1 and HL LHC upgrades

    NASA Astrophysics Data System (ADS)

    Madorsky, A.

    2017-07-01

    To accommodate high-luminosity LHC operation at a 13 TeV collision energy, the CMS Endcap Muon Level-1 Trigger system had to be significantly modified. To provide robust track reconstruction, the trigger system must now import all available trigger primitives generated by the Cathode Strip Chambers and by certain other subsystems, such as Resistive Plate Chambers (RPC). In addition to massive input bandwidth, this also required a significant increase in logic and memory resources. To satisfy these requirements, a new Sector Processor unit has been designed. It consists of three modules. The Core Logic module houses the large FPGA that contains the track-finding logic and multi-gigabit serial links for data exchange. The Optical module contains optical receivers and transmitters; it communicates with the Core Logic module via a custom backplane section. The Pt Lookup Table (PTLUT) module contains 1 GB of low-latency memory that is used to assign the final Pt to reconstructed muon tracks. The μTCA architecture (adopted by CMS) was used for this design. The talk presents the details of the hardware and firmware design of the production system based on the Xilinx Virtex-7 FPGA family. The next round of LHC and CMS upgrades starts in 2019, followed by a major High-Luminosity (HL) LHC upgrade starting in 2024. In the course of these upgrades, new Gas Electron Multiplier (GEM) detectors and more RPC chambers will be added to the Endcap Muon system. In order to keep up with all these changes, a new Advanced Processor unit is being designed. This device will be based on Xilinx UltraScale+ FPGAs. It will be able to accommodate up to 100 serial links with bit rates of up to 25 Gb/s, and provide up to 2.5 times more logic resources than the device used currently. The amount of PTLUT memory will be significantly increased to provide more flexibility for the Pt assignment algorithm. The talk presents preliminary details of the hardware design program.

  15. Retro-causation, Minimum Contradictions and Non-locality

    NASA Astrophysics Data System (ADS)

    Kafatos, Menas; Nassikas, Athanassios A.

    2011-11-01

    Retro-causation has been experimentally verified by Bem and proposed by Kafatos in the form of space-time non-locality in the quantum framework. Every theory includes, beyond its specific axioms, the principles of logical communication (logical language) through which it is defined. This communication obeys Aristotelian logic (Classical Logic), the Leibniz Sufficient Reason Principle, and a hidden axiom, which basically states that there is an anterior-posterior relationship everywhere in communication. By means of a theorem discussed here, it can be proved that the communication mentioned implies contradictory statements, which can only be transcended through silence, i.e. the absence of any statements. Moreover, the breaking of silence is meaningful through the claim for minimum contradictions, which implies the existence of both a logical and an illogical dimension; contradictions refer to causality, implying its opposite, namely retro-causation, and to the anterior-posterior axiom, implying space-time non-locality. The purpose of this paper is to outline a framework accounting for retro-causation, from both purely theoretical and reality-based points of view.

  16. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews.

    PubMed

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

    Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to 'think' conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured in only a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, and these were used almost exclusively to depict pictorially the way in which the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Logic models have the potential to be an integral aid throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles for the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions.

  17. From Schawlow to Newton: An educational return

    NASA Astrophysics Data System (ADS)

    Sathe, D.

    Newton's laws of motion and his theory of gravitation have been known for over 300 years. However, investigations of educators from various countries, carried out in the last quarter of the 20th century, show that Aristotelian ideas keep persisting among students, in spite of their learning these topics in schools and colleges. In traditional examinations students do give answers in accordance with Newton's laws, but in the questionnaires of educators they ignore Newtonian laws unknowingly and, quite naturally, give answers along the Aristotelian line of thought. Why do they give such contrasting answers? Should we take for granted that their understanding of Newtonian laws is satisfactory because of their correct answers in traditional exams, though not in questionnaires? Can these contrasting views affect their interest in physics? These are some questions that warrant our attention earnestly, as we gear up for research and teaching in the 21st century. The author felt the need of focusing attention on the logical aspects of the subject, due to the global character of the said problem. His decision was strengthened greatly, in the late 1970s, by the philosophy of Dennis Sciama, hence the author's dedication of a letter to the editor to his memory in the COSPAR Info. Bulletin /1/. Being a trained biochemist, the author started looking for points missed by earlier educators; that means the author started following the advice of Arthur Schawlow /2/ in the late 1970s, though unknowingly. Sadly, the author came to know of it only after dedicating a lecture to the memory of Abdus Salam at a symposium in Samarkand, Uzbekistan. Therefore he is dedicating this presentation to the memory of Arthur Schawlow. According to the present author, the persistence of Aristotelian ideas and the consequent contrasting performances of students are due to logical conflicts between the basic concepts of physics itself. For example, the conflict between the treatment of uniform circular motion and the concept of work motivates students to ignore the centripetal force and choose the tangential force as the resultant force. This is how the said contrast becomes a logical barrier to the comprehension of uniform circular motion and related topics. More information on the work of others will also be provided, for the sake of comparison with the author's work, leading to some new directions to be explored in the 21st century. References: 1. Sathe, Dileep V. [Dec. 2001] COSPAR Information Bulletin #152, p. 53. 2. Chu, Steven, [August 1999] Physics World, V: 12, N: 8, p. 49.

  18. Contradictory Reasoning Network: An EEG and fMRI Study

    PubMed Central

    Thai, Ngoc Jade; Seri, Stefano; Rotshtein, Pia; Tecchio, Franca

    2014-01-01

    Contradiction is a cornerstone of human rationality, essential for everyday life and communication. We investigated electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) signals in separate recording sessions during contradictory judgments, using a logical structure based on categorical propositions of the Aristotelian Square of Opposition (ASoO). The use of ASoO propositions, while controlling for potential linguistic or semantic confounds, enabled us to observe the spatio-temporal unfolding of this contradictory reasoning. The processing started with the inversion of the logical operators, corresponding to right middle frontal gyrus (rMFG-BA11) activation, followed by identification of the contradictory statement, associated with activation in the right inferior frontal gyrus (rIFG-BA47). Right medial frontal gyrus (rMeFG, BA10) and anterior cingulate cortex (ACC, BA32) contributed to the later stages of the process. We observed a correlation between the delayed latency of the rBA11 response and the reaction time delay during inductive vs. deductive reasoning. This supports the notion that rBA11 is crucial for manipulating the logical operators. Slower processing times and stronger brain responses for inductive logic suggested that examples are easier to process than general principles and are more likely to simplify communication. PMID:24667491

  19. Contradictory reasoning network: an EEG and FMRI study.

    PubMed

    Porcaro, Camillo; Medaglia, Maria Teresa; Thai, Ngoc Jade; Seri, Stefano; Rotshtein, Pia; Tecchio, Franca

    2014-01-01

    Contradiction is a cornerstone of human rationality, essential for everyday life and communication. We investigated electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) signals in separate recording sessions during contradictory judgments, using a logical structure based on categorical propositions of the Aristotelian Square of Opposition (ASoO). The use of ASoO propositions, while controlling for potential linguistic or semantic confounds, enabled us to observe the spatio-temporal unfolding of this contradictory reasoning. The processing started with the inversion of the logical operators, corresponding to right middle frontal gyrus (rMFG-BA11) activation, followed by identification of the contradictory statement, associated with activation in the right inferior frontal gyrus (rIFG-BA47). Right medial frontal gyrus (rMeFG, BA10) and anterior cingulate cortex (ACC, BA32) contributed to the later stages of the process. We observed a correlation between the delayed latency of the rBA11 response and the reaction time delay during inductive vs. deductive reasoning. This supports the notion that rBA11 is crucial for manipulating the logical operators. Slower processing times and stronger brain responses for inductive logic suggested that examples are easier to process than general principles and are more likely to simplify communication.

  20. Generic Bluetooth Data Module

    DTIC Science & Technology

    2002-09-01

    The implementation includes Java classes RS232.java (serial communication port class, to the Bluetooth module), HCI.java (Host Controller Interface class), and L2CAP.java (Logical Link Control and Adaptation class). PPP, a standard protocol for transporting IP datagrams over a point-to-point link, is designed to run over RFCOMM to accomplish point-to-point connections. [Figure 2: Bluetooth layers: Logical Link Control and Adaptation, Host Controller Interface, Link Manager, Baseband / Link Controller, Radio. From Ref. [3].]

  1. Exploring the Use of a Facebook Page in Anatomy Education

    ERIC Educational Resources Information Center

    Jaffar, Akram Abood

    2014-01-01

    Facebook is the most popular social media site visited by university students on a daily basis. Consequently, Facebook is the logical place to start with for integrating social media technologies into education. This study explores how a faculty-administered Facebook Page can be used to supplement anatomy education beyond the traditional…

  2. Explore the Many Paths to Leadership

    ERIC Educational Resources Information Center

    Crow, Tracy

    2015-01-01

    The road to leadership is not necessarily one that educators plan carefully with a series of logical steps. Certainly some educators start as teachers and then systematically work through a traditional hierarchy on their way to the superintendency. No matter their role or their path, education leaders demand more from themselves and others and…

  3. Derivation of sorting programs

    NASA Technical Reports Server (NTRS)

    Varghese, Joseph; Loganantharaj, Rasiah

    1990-01-01

    Program synthesis for critical applications has become a viable alternative to program verification. Nested resolution and its extension are used to synthesize a set of sorting programs from their first-order logic specifications. A set of sorting programs, such as naive sort, merge sort, and insertion sort, was successfully synthesized starting from the same set of specifications.

  4. Euler's Theorem under the Microscope

    ERIC Educational Resources Information Center

    Tennant, Geoff

    2010-01-01

    "Proofs and refutations: the logic of mathematical discovery" by Imre Lakatos was published posthumously in 1976. This is a fascinating, if somewhat hard to access, book which calls into question many of the assumptions that people make about proof--one may start reading with a clear sense of what mathematical proof is, but almost certainly will…

  5. Application of multi response optimization with grey relational analysis and fuzzy logic method

    NASA Astrophysics Data System (ADS)

    Winarni, Sri; Wahyu Indratno, Sapto

    2018-01-01

    Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is the conversion of the Signal to Noise Ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum MRR and SR was a gap voltage of 70 V, peak current of 9 A, and duty factor of 0.8.
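    The grey relational grade underlying this kind of multi-response optimization can be sketched in its textbook form below; the paper's Fuzzy-GRG additionally passes S/N ratios through a fuzzy system, and the function and argument names here are our own:

```python
import numpy as np

def grey_relational_grade(responses, larger_is_better, zeta=0.5):
    """Textbook grey relational analysis: normalise each response column to
    [0, 1], take deviations from the ideal sequence (all ones), form grey
    relational coefficients with distinguishing coefficient zeta, and average
    them into one grade per experimental run."""
    X = np.asarray(responses, dtype=float)
    norm = np.empty_like(X)
    for j, bigger in enumerate(larger_is_better):
        col = X[:, j]
        if bigger:   # larger-the-better response (e.g. material removal rate)
            norm[:, j] = (col - col.min()) / (col.max() - col.min())
        else:        # smaller-the-better response (e.g. surface roughness)
            norm[:, j] = (col.max() - col) / (col.max() - col.min())
    delta = 1.0 - norm                    # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)             # grey relational grade per run
```

    The run with the highest grade is taken as the multi-response optimum, turning several conflicting responses into a single ranking criterion.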

  6. Challenging Points of Contact among Supervisor, Mentor Teacher and Teacher Candidates: Conflicting Institutional Expectations

    ERIC Educational Resources Information Center

    Katz, Laurie; Isik-Ercan, Zeynep

    2015-01-01

    Grounded in an ethnographic logic of inquiry utilizing the concept of languaculture, this study explores how cultural differences between a field-based team and the university supervisor led to unanticipated challenges and points of conflict in an early childhood teacher education program in the Midwestern United States. By examining points of contact…

  7. 'Best practice' development and transfer in the NHS: the importance of process as well as product knowledge.

    PubMed

    Newell, Sue; Edelman, Linda; Scarbrough, Harry; Swan, Jacky; Bresnen, Mike

    2003-02-01

    A core prescription from the knowledge management movement is that the successful management of organizational knowledge will prevent firms from 'reinventing the wheel', in particular through the transfer of 'best practices'. Our findings challenge this logic. They suggest instead that knowledge is emergent and enacted in practice, and that normally those involved in a given practice have only a partial understanding of the overall practice. Generating knowledge about current practice is therefore a precursor to changing that practice. In this sense, knowledge transfer does not occur independently of or in sequence to knowledge generation, but instead the process of knowledge generation and its transfer are inexorably intertwined. Thus, rather than transferring 'product' knowledge about the new 'best practice' per se, our analysis suggests that it is more useful to transfer 'process' knowledge about effective ways to generate the knowledge of existing practice, which is the essential starting point for attempts to change that practice.

  8. Shuttle user analysis (study 2.2). Volume 4: Standardized subsystem modules analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The capability to analyze payloads constructed of standardized modules was provided for the planning of future mission models. An inventory of standardized module designs previously obtained was used as a starting point. Some of the conclusions and recommendations are: (1) the two growth factor synthesis methods provide logical configurations for satellite type selection; (2) the recommended method is the one that determines the growth factor as a function of the baseline subsystem weight, since it provides a larger growth factor for small subsystem weights and results in a greater overkill due to standardization; (3) the method that is not recommended is the one that depends upon a subsystem similarity selection, since care must be used in the subsystem similarity selection; (4) it is recommended that the application of standardized subsystem factors be limited to satellites with baseline dry weights between about 700 and 6,500 lbs; and (5) the standardized satellite design approach applies to satellites maintainable in orbit or retrieved for ground maintenance.

  9. What is (and Isn't) Wrong with Both the Tension and Shear Failure Models for the Formation of Lineae on Europa

    NASA Technical Reports Server (NTRS)

    Kattenhorn, S. A.

    2004-01-01

    An unresolved problem in the interpretation of lineae on Europa is whether they formed as tension or shear fractures. Voyager image analyses led to hypotheses that Europan lineaments are tension cracks induced by tidal deformation of the ice crust. This interpretation continued with Galileo image analyses, with lineae being classified as crust-penetrating tension cracks. Tension fracturing has also been an implicit assumption of nonsynchronous rotation (NSR) studies. However, recent hypotheses invoke shear failure to explain lineae development. If a shear failure mechanism is correct, it will be necessary to re-evaluate any models for the evolution of Europa's crust that are based on tensile failure models, such as NSR estimates. For this reason, it is imperative that the mechanism by which fractures are initiated on Europa be unambiguously unraveled. A logical starting point is an evaluation of the pros and cons of each failure model, highlighting the lines of evidence that are needed to fully justify either model.

  10. Guiding principles of safety as a basis for developing a pharmaceutical safety culture.

    PubMed

    Edwards, Brian; Olsen, Axel K; Whalen, Matthew D; Gold, Marla J

    2007-05-01

    Despite the best efforts of industry and regulatory authorities, the trust of society in the process of medicine development and communication of pharmaceutical risk has ebbed away. In response, the US government has called for a culture of compliance, while the EU regulators talk of a 'culture of scientific excellence'. However, one of the fundamental problems hindering progress towards rebuilding trust based on a pharmaceutical safety culture is the lack of agreement and transparency among all stakeholders as to what is meant by 'Safety of Medicines'. For that reason, we propose that 'Guiding Principles of Safety for Pharmaceuticals' be developed, analogous to the way that chemical safety has been tackled. A logical starting point would be to examine the Principles outlined by the US Institute of Medicine, although we acknowledge that these Principles require further extensive debate and definition. Nevertheless, the Principles should take centre stage in the reform of pharmaceutical development required to restore society's trust.

  11. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  12. The music of morality and logic.

    PubMed

    Mesz, Bruno; Rodriguez Zivic, Pablo H; Cecchi, Guillermo A; Sigman, Mariano; Trevisan, Marcos A

    2015-01-01

    Musical theory has built on the premise that musical structures can refer to something different from themselves (Nattiez and Abbate, 1990). The aim of this work is to statistically corroborate the intuitions of musical thinkers and practitioners starting at least with Plato, that music can express complex human concepts beyond merely "happy" and "sad" (Mattheson and Lenneberg, 1958). To do so, we ask whether musical improvisations can be used to classify the semantic category of the word that triggers them. We investigated two specific domains of semantics: morality and logic. While morality has been historically associated with music, logic concepts, which involve more abstract forms of thought, are more rarely associated with music. We examined musical improvisations inspired by positive and negative morality (e.g., good and evil) and logic concepts (true and false), analyzing the associations between these words and their musical representations in terms of acoustic and perceptual features. We found that music conveys information about valence (good and true vs. evil and false) with remarkable consistency across individuals. This information is carried by several musical dimensions which act in synergy to achieve very high classification accuracy. Positive concepts are represented by music with more ordered pitch structure and lower harmonic and sensorial dissonance than negative concepts. Music also conveys information indicating whether the word which triggered it belongs to the domains of logic or morality (true vs. good), principally through musical articulation. In summary, improvisations consistently map logic and morality information to specific musical dimensions, testifying the capacity of music to accurately convey semantic information in domains related to abstract forms of thought.

  14. Plain packaging: a logical progression for tobacco control in one of the world's ‘darkest markets’

    PubMed Central

    Scollo, Michelle; Bayly, Megan; Wakefield, Melanie

    2015-01-01

    The Australian approach to tobacco control has been a comprehensive one, encompassing mass media campaigns, consumer information, taxation policy, access for smokers to smoking cessation advice and pharmaceutical treatments, protection from exposure to tobacco smoke and regulation of promotion. World-first legislation to standardise the packaging of tobacco was a logical next step to further reduce misleadingly reassuring promotion of a product known for the past 50 years to kill a high proportion of its long-term users. Similarly, refreshed, larger pack warnings which started appearing on packs at the end of 2012 were a logical progression of efforts to ensure that consumers are better informed about the health risks associated with smoking. Regardless of the immediate effects of legislation, further progress will continue to require a comprehensive approach to maintain momentum and ensure that government efforts on one front are not undermined by more vigorous efforts and greater investment by tobacco companies elsewhere. PMID:28407604

  15. Leadless Chip Carrier Packaging and CAD/CAM-Supported Wire Wrap Interconnect Technology for Subnanosecond ECL.

    DTIC Science & Technology

    1981-11-01

    (Front-matter residue from the report's lists of figures and tables: impregnated silicone rubber contacts, chip carrier, and lid; Technit connector for a 68-pad JEDEC Type A leadless chip carrier; melting points of various solders; composition of alloys employed in dual-in-line package pins and plating by mass spectrographic analysis.) …swings, and subnanosecond gate delays and risetimes. Presently, emitter coupled logic (ECL) and current mode logic (CML), both fabricated with silicon tech

  16. Debugging and Logging Services for Defence Service Oriented Architectures

    DTIC Science & Technology

    2012-02-01

    Service A software component and callable end point that provides a logically related set of operations, each of which perform a logical step in a...important to note that in some cases when the fault is identified to lie in uneditable code such as program libraries, or outsourced software services ...debugging is limited to characterisation of the fault, reporting it to the software or service provider and development of work-arounds and management

  17. Précis of bayesian rationality: The probabilistic approach to human reasoning.

    PubMed

    Oaksford, Mike; Chater, Nick

    2009-02-01

    According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life, including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development, and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.
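
    The probabilistic recasting of "deductive" inference that the book argues for can be made concrete with a small Bayes calculation; the numbers below are illustrative, not taken from the book.

```python
def modus_tollens_endorsement(p_p, p_q_given_p, p_q_given_not_p):
    """P(not-p | not-q): the probabilistic analogue of modus tollens.

    Under the logical reading, inferring not-p from "if p then q" and not-q
    is certain; under the probabilistic reading its strength depends on the
    prior P(p) and on how likely q is for reasons other than p.
    """
    p_not_p = 1.0 - p_p
    # Total probability of not-q across both hypotheses.
    p_not_q = p_p * (1 - p_q_given_p) + p_not_p * (1 - p_q_given_not_p)
    # Bayes' rule: P(not-p | not-q).
    return p_not_p * (1 - p_q_given_not_p) / p_not_q

# Even with a highly reliable conditional P(q|p) = 0.9, the "deductive"
# conclusion not-p is far from certain when q is common for other reasons.
strength = modus_tollens_endorsement(p_p=0.5, p_q_given_p=0.9, p_q_given_not_p=0.5)
```

    The graded endorsement this produces, rather than all-or-nothing validity, is the kind of pattern the probabilistic approach uses to explain apparently "irrational" performance on conditional reasoning tasks.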

  18. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews

    PubMed Central

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

    Background Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to ‘think’ conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. Methods and Findings In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured only in a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, and these were used almost unanimously to solely depict pictorially the way in which the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Conclusions Logic models have the potential to be an aid integral throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles in the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions. PMID:26575182

  19. The relationship between students critical thinking measured by science virtual test and students logical thinking on eighth grade secondary school

    NASA Astrophysics Data System (ADS)

    Nurismawati, R.; Sanjaya, Y.; Rusyati, L.

    2018-05-01

    The aim of this study is to examine the relationship between the critical thinking skills and the logical thinking skills of junior high school students in Tasikmalaya city. The respondents consist of 168 eighth-grade students from three public schools in Tasikmalaya city. The Science Virtual Test and the Test of Logical Thinking were used in this research study. The science virtual test instrument consists of 26 questions on 5 different topics. The IBM SPSS 23.00 program was used for analysis of the data. The findings show that students’ critical thinking skills differ significantly in the elements of generating purpose, embodying point of view, utilizing concepts, and making implications and consequences. By the post hoc LSD test, for those four elements there are significant differences between the concrete and transitional groups. There is a weak positive correlation between students’ critical thinking and logical thinking attainment.

  20. Contributions from sociology of science to mathematics education in Brazil: logic as a system of beliefs

    NASA Astrophysics Data System (ADS)

    de Andrade, Thales Haddad Novaes; Vilela, Denise Silva

    2013-09-01

    In Brazil, mathematics education was associated with Jean Piaget's theory. Scholars in the field of education appropriated Piaget's work in different ways, but usually emphasized logical aspects of thought, which probably led to an expansion of mathematics education influenced by psychology. This study attempts to extend the range of interlocutions and pose a dialogue between the field of mathematics education in Brazil and the sociology of science proposed by David Bloor. The main point of Bloor's theory is that logical-mathematical knowledge is far from being true and universal and is socially conditioned. In particular, we discuss the first principle of the strong program, which deals with conditions that generate beliefs promoted by education policies in Brazil, such as the MEC/USAID treaties. In this case the "naturalization of logic" was stimulated by a widespread diffusion of both Piaget's studies and the Modern Mathematics Movement.

  1. Regulatory Anatomy: How "Safety Logics" Structure European Transplant Medicine.

    PubMed

    Hoeyer, Klaus

    2015-07-01

    This article proposes the term "safety logics" to understand attempts within the European Union (EU) to harmonize member state legislation to ensure a safe and stable supply of human biological material for transplants and transfusions. With safety logics, I refer to assemblages of discourses, legal documents, technological devices, organizational structures, and work practices aimed at minimizing risk. I use this term to reorient the analytical attention with respect to safety regulation. Instead of evaluating whether safety is achieved, the point is to explore the types of "safety" produced through these logics as well as to consider the sometimes unintended consequences of such safety work. In fact, the EU rules have been giving rise to complaints from practitioners finding the directives problematic and inadequate. In this article, I explore the problems practitioners face and why they arise. In short, I expose the regulatory anatomy of the policy landscape.

  2. Automating generation of textual class definitions from OWL to English.

    PubMed

    Stevens, Robert; Malone, James; Williams, Sandra; Power, Richard; Third, Allan

    2011-05-17

    Text definitions for entities within bio-ontologies are a cornerstone of the effort to gain a consensus in understanding and usage of those ontologies. Writing these definitions is, however, a considerable effort, and there is often a lag between specification of the main part of an ontology (logical descriptions and definitions of entities) and the development of the text-based definitions. The goal of natural language generation (NLG) from ontologies is to take the logical description of entities and generate fluent natural language. The application described here uses NLG to automatically provide text-based definitions from an ontology that has logical descriptions of its entities, so avoiding the bottleneck of authoring these definitions by hand. To produce the descriptions, the program collects all the axioms relating to a given entity, groups them according to common structure, realises each group through an English sentence, and assembles the resulting sentences into a paragraph, to form as 'coherent' a text as possible without human intervention. Sentence generation is accomplished using a generic grammar based on logical patterns in OWL, together with a lexicon for realising atomic entities. We have tested our output for the Experimental Factor Ontology (EFO) using a simple survey strategy to explore the fluency of the generated text and how well it conveys the underlying axiomatisation. Two rounds of survey and improvement show that overall the generated English definitions are found to convey the intended meaning of the axiomatisation in a satisfactory manner. The surveys also suggested that one form of generated English will not be universally liked; that intrusion of too much 'formal ontology' was not liked; and that too much explicit exposure of OWL semantics was also not liked. Our prototype tools can generate reasonable paragraphs of English text that can act as definitions. The definitions were found acceptable by our survey and, as a result, the developers of EFO are sufficiently satisfied with the output that the generated definitions have been incorporated into EFO. Whilst not a substitute for hand-written textual definitions, our generated definitions are a useful starting point. An on-line version of the NLG text definition tool can be found at http://swat.open.ac.uk/tools/. The questionnaire and sample generated text definitions may be found at http://mcs.open.ac.uk/nlg/SWAT/bio-ontologies.html.
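
    The collect, group, realise, and assemble pipeline described above can be sketched on toy triples; the axioms, templates, and realisation rules here are illustrative assumptions, far simpler than the OWL-pattern grammar the record describes.

```python
from collections import defaultdict

# Toy axioms as (subject, relation, object) triples; a real system would
# extract these from OWL class expressions rather than hard-code them.
AXIOMS = [("cell line", "derives_from", "organism"),
          ("cell line", "derives_from", "tissue"),
          ("cell line", "is_a", "material entity")]

# One English template per logical pattern (a stand-in for the generic grammar).
TEMPLATES = {"is_a": "{s} is a kind of {olist}.",
             "derives_from": "{s} derives from {olist}."}

def realise(axioms):
    """Group axioms sharing subject and relation, realise each group as one
    aggregated English sentence, and join the sentences into a paragraph."""
    groups = defaultdict(list)
    for s, r, o in axioms:
        groups[(s, r)].append(o)
    sentences = []
    for (s, r), objs in groups.items():
        olist = objs[0] if len(objs) == 1 else ", ".join(objs[:-1]) + " and " + objs[-1]
        sentences.append(TEMPLATES[r].format(s=s.capitalize(), olist=olist))
    return " ".join(sentences)

paragraph = realise(AXIOMS)
```

    Aggregating the two `derives_from` axioms into a single sentence is the grouping step that keeps the generated paragraph from reading as one sentence per axiom.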

  4. A Logical Process Calculus

    NASA Technical Reports Server (NTRS)

    Cleaveland, Rance; Luettgen, Gerald; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents the Logical Process Calculus (LPC), a formalism that supports heterogeneous system specifications containing both operational and declarative subspecifications. Syntactically, LPC extends Milner's Calculus of Communicating Systems with operators from the alternation-free linear-time mu-calculus (LT(mu)). Semantically, LPC is equipped with a behavioral preorder that generalizes Hennessy's and DeNicola's must-testing preorder as well as LT(mu)'s satisfaction relation, while being compositional for all LPC operators. From a technical point of view, the new calculus is distinguished by the inclusion of: (1) both minimal and maximal fixed-point operators and (2) an unimplementability predicate on process terms, which tags inconsistent specifications. The utility of LPC is demonstrated by means of an example highlighting the benefits of heterogeneous system specification.

  5. Intelligent Process Abnormal Patterns Recognition and Diagnosis Based on Fuzzy Logic.

    PubMed

    Hou, Shi-Wang; Feng, Shunxiao; Wang, Hui

    2016-01-01

    Locating assignable causes by means of the abnormal patterns of a control chart is a widely used technique for manufacturing quality control. If there is uncertainty about the occurrence degree of abnormal patterns, the diagnosis process cannot be carried out. Considering four common abnormal control chart patterns, this paper proposed a characteristic-numbers-based, point-by-point recognition method to quantify the occurrence degree of abnormal patterns under uncertain conditions, and a fuzzy inference system based on fuzzy logic to calculate the contribution degree of assignable causes under fuzzy abnormal patterns. Application case results show that the proposed approach can give a ranked list of causes under fuzzy control chart abnormal patterns and support eliminating the abnormality.
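
    The inference step can be sketched with a toy rule base linking patterns to causes; the pattern names, causes, rule weights, and min/max (Mamdani-style) inference below are illustrative assumptions, not the paper's actual system.

```python
def tri(x, a, b, c):
    """Triangular membership function over [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical rule base: each assignable cause is linked to the abnormal
# patterns it tends to produce, with an assumed rule weight in [0, 1].
RULES = {
    "tool wear":         {"trend": 1.0, "shift": 0.3},
    "fixture loosening": {"cycle": 0.8, "shift": 0.5},
    "material change":   {"shift": 1.0},
}

def cause_contributions(pattern_degrees):
    """Min/max inference: a rule fires with the min of the pattern's
    occurrence degree and the rule weight; a cause's contribution degree
    is the max over its rules."""
    return {cause: max(min(pattern_degrees.get(p, 0.0), w)
                       for p, w in links.items())
            for cause, links in RULES.items()}

# Occurrence degrees of the abnormal patterns (the recognition stage's output);
# here the trend degree comes from a triangular membership evaluation.
degrees = {"trend": tri(0.7, 0.2, 0.6, 1.0), "shift": 0.2, "cycle": 0.05}
ranked = sorted(cause_contributions(degrees).items(),
                key=lambda kv: kv[1], reverse=True)
```

    Sorting the contribution degrees yields the ranked cause list that the abstract says supports eliminating the abnormality.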

  6. Reasoning About Digital Circuits.

    DTIC Science & Technology

    1983-07-01

    The dissertation will later examine the logic’s formal syntax and semantics in great depth. Below are a few English-language statements and...function have a fixed point. Temporal logic as a programming language: temporal logic can be used directly as a programming language. For example, the ...for a separate "assertion language." For example, the formula S[(I+- );(I + i -- I) (I+2- I) states that if the variable I twice increases by 1 in an

  7. Fuzzy Logic Controlled Solar Module for Driving Three- Phase Induction Motor

    NASA Astrophysics Data System (ADS)

    Afiqah Zainal, Nurul; Sooi Tat, Chan; Ajisman

    2016-02-01

    Renewable energy produced by a solar module offers advantages for driving a three-phase induction motor in a remote area. However, a solar module's output is uncertain and complex. A fuzzy logic controller is one of the controllers that can handle a non-linear system and the maximum power of a solar module. The fuzzy logic controller is used in a Maximum Power Point Tracking (MPPT) technique to control the Pulse-Width Modulation (PWM) that switches the power electronics circuit. A DC-DC boost converter is used to boost the photovoltaic voltage to the desired output and to supply a voltage source inverter, which is controlled by three-phase PWM generated by a microcontroller. The IGBT-switched voltage source inverter (VSI) produces alternating current (AC) voltage from the direct current (DC) boost converter output to control the speed of the three-phase induction motor. Results showed that the output power of the solar module is optimized and controlled by using the fuzzy logic controller. Besides that, the three-phase induction motor can be driven and controlled using VSI switching by the PWM signal generated by the fuzzy logic controller. This shows that the non-linear system can be controlled and used to drive a three-phase induction motor.
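
    The fuzzy MPPT loop can be sketched with a minimal Sugeno-style controller; the membership shapes, rule table, and duty-cycle step sizes below are illustrative assumptions, not the paper's design.

```python
def membership(x):
    """Map a crisp value onto three fuzzy sets (negative, zero, positive)
    with triangular memberships over a normalized [-1, 1] universe."""
    x = max(-1.0, min(1.0, x))
    return {"N": max(0.0, -x), "Z": max(0.0, 1.0 - abs(x)), "P": max(0.0, x)}

# Rule table: (error set, delta-error set) -> duty-cycle step (singleton).
# Error E = dP/dV; on a P-V curve, E > 0 means operating left of the MPP,
# so the controller should keep moving in the same direction.
RULES = {("N", "N"): -0.02, ("N", "Z"): -0.01, ("N", "P"): 0.0,
         ("Z", "N"): -0.005, ("Z", "Z"): 0.0, ("Z", "P"): 0.005,
         ("P", "N"): 0.0, ("P", "Z"): 0.01, ("P", "P"): 0.02}

def fuzzy_mppt_step(e, de):
    """Weighted average of singleton rule outputs (Sugeno defuzzification);
    the result is the PWM duty-cycle adjustment for the boost converter."""
    me, mde = membership(e), membership(de)
    num = sum(me[a] * mde[b] * out for (a, b), out in RULES.items())
    den = sum(me[a] * mde[b] for (a, b) in RULES)
    return num / den if den else 0.0

# At the maximum power point dP/dV ~ 0, so the controller holds the duty cycle.
step = fuzzy_mppt_step(0.0, 0.0)
```

    Compared with a fixed-step perturb-and-observe scheme, the graded rule outputs let the controller take large steps far from the MPP and small steps near it, which is the usual motivation for fuzzy MPPT.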

  8. DNA-Based Dynamic Reaction Networks.

    PubMed

    Fu, Ting; Lyu, Yifan; Liu, Hui; Peng, Ruizi; Zhang, Xiaobing; Ye, Mao; Tan, Weihong

    2018-05-21

    Deriving from logical and mechanical interactions between DNA strands and complexes, DNA-based artificial reaction networks (RNs) are attractive for their high programmability, as well as their cascading and fan-out abilities, which resemble the basic principles of electronic logic gates. Arising from the dream of creating novel computing mechanisms, researchers have placed high hopes on the development of DNA-based dynamic RNs and have strived to establish the basic theories and operative strategies of these networks. This review starts by looking back on the evolution of DNA dynamic RNs, in particular the most significant applications in biochemistry in recent years. Finally, we discuss the perspectives of DNA dynamic RNs and give a possible direction for the development of DNA circuits. Copyright © 2018. Published by Elsevier Ltd.

  9. Mapping the petroleum system - An investigative technique to explore the hydrocarbon fluid system

    USGS Publications Warehouse

    Magoon, L.B.; Dow, W.G.

    2000-01-01

    Creating a petroleum system map involves a series of logical steps that require specific information to explain the origin in time and space of discovered hydrocarbon occurrences. If used creatively, this map provides a basis on which to develop complementary plays and prospects. The logical steps include the characterization of a petroleum system (that is, identifying, mapping, and naming the hydrocarbon fluid system) and the summary of these results on a folio sheet. A petroleum system map is based on the understanding that there are several levels of certainty, from "guessing" to "knowing", that specific oil and gas accumulations emanated from a particular pod of active source rock. Levels of certainty start with the close geographic proximity of two or more accumulations, continue with close stratigraphic proximity, followed by similarities in bulk properties and then detailed geochemical properties. The highest level of certainty includes the positive geochemical correlation of the hydrocarbon fluid in the accumulations to the extract of the active source rock. A petroleum system map is created when the following logic is implemented. Implementation starts when the oil and gas accumulations of a petroleum province are grouped stratigraphically and geographically. Bulk and geochemical properties are used to further refine the groups through the determination of genetically related oil and gas types. To this basic map, surface seeps and well shows are added. Similarly, the active source rock responsible for these hydrocarbon occurrences is mapped to further define the extent of the system. A folio sheet constructed for a hypothetical case study of the Deer-Boar(.) petroleum system illustrates this methodology.

  10. Distinguishing between evidence and its explanations in the steering of atomic clocks

    NASA Astrophysics Data System (ADS)

    Myers, John M.; Hadi Madjid, F.

    2014-11-01

    Quantum theory reflects within itself a separation of evidence from explanations. This separation leads to a known proof that: (1) no wave function can be determined uniquely by evidence, and (2) any chosen wave function requires a guess reaching beyond logic to things unforeseeable. Chosen wave functions are encoded into computer-mediated feedback essential to atomic clocks, including clocks that step computers through their phases of computation and clocks in space vehicles that supply evidence of signal propagation explained by hypotheses of spacetimes with metric tensor fields. The propagation of logical symbols from one computer to another requires a shared rhythm, like a bucket brigade. Here we show how hypothesized metric tensors, dependent on guesswork, take part in the logical synchronization by which clocks are steered in rate and position toward aiming points that satisfy phase constraints, thereby linking the physics of signal propagation with the sharing of logical symbols among computers. Recognizing the dependence of the phasing of symbol arrivals on guesses about signal propagation transports logical synchronization from the engineering of digital communications to a discipline essential to physics. Within this discipline we begin to explore questions invisible under any concept of time that fails to acknowledge unforeseeable events. In particular, variation of spacetime curvature is shown to limit the bit rate of logical communication.

  11. HYTESS 2: A Hypothetical Turbofan Engine Simplified Simulation with multivariable control and sensor analytical redundancy

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.

    1986-01-01

    A hypothetical turbofan engine simplified simulation with a multivariable control and sensor failure detection, isolation, and accommodation logic (HYTESS II) is presented. The digital program, written in FORTRAN, is self-contained, efficient, realistic, and easily used. Simulated engine dynamics were developed from linearized operating point models; however, essential nonlinear effects are retained. The simulation is representative of a hypothetical, low-bypass-ratio turbofan engine with an advanced control and failure detection logic. Included is a description of the engine dynamics, the control algorithm, and the sensor failure detection logic. Details of the simulation, including block diagrams, variable descriptions, common block definitions, subroutine descriptions, and input requirements, are given. Example simulation results are also presented.

  12. Knowledge represented using RDF semantic network in the concept of semantic web

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukasova, A., E-mail: alena.lukasova@osu.cz; Vajgl, M., E-mail: marek.vajgl@osu.cz; Zacek, M., E-mail: martin.zacek@osu.cz

    The RDF(S) model has been declared the basic model for capturing knowledge on the semantic web. It provides a common and flexible way to decompose composed knowledge into elementary statements, which can be represented by RDF triples or by RDF graph vectors. From the logical point of view, elements of knowledge can be expressed using at most binary predicates, which can be converted to RDF triples or graph vectors. However, the model is not able to capture implicit knowledge representable by logical formulas. This contribution shows how existing approaches (semantic networks and clausal form logic) can be combined with RDF to obtain an RDF-compatible system with the ability to represent implicit knowledge and inference over a knowledge base.
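    The decomposition of knowledge into binary-predicate triples, plus a rule layer for the implicit knowledge plain RDF cannot state, can be sketched as follows. The vocabulary and the two forward-chaining rules (transitivity of rdfs:subClassOf, and type propagation) are a minimal illustration, not the paper's actual inference system.

```python
# Toy triple store: each fact is a (subject, predicate, object) triple.
triples = {
    ("Dog", "rdfs:subClassOf", "Mammal"),
    ("Mammal", "rdfs:subClassOf", "Animal"),
    ("rex", "rdf:type", "Dog"),
}

def infer(kb):
    """Forward-chain two clausal rules to a fixed point:
    (1) subClassOf is transitive; (2) rdf:type propagates up subClassOf."""
    kb = set(kb)
    changed = True
    while changed:
        changed = False
        new = set()
        for (a, p1, b) in kb:
            for (c, p2, d) in kb:
                if p1 == p2 == "rdfs:subClassOf" and b == c:
                    new.add((a, "rdfs:subClassOf", d))
                if p1 == "rdf:type" and p2 == "rdfs:subClassOf" and b == c:
                    new.add((a, "rdf:type", d))
        if not new <= kb:
            kb |= new
            changed = True
    return kb

closure = infer(triples)
```

    The derived triples (e.g., that rex is an Animal) are exactly the implicit knowledge that the explicit triple set alone does not contain.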

  13. Starting Point: Pedagogic Resources for Teaching and Learning Economics

    ERIC Educational Resources Information Center

    Maier, Mark H.; McGoldrick, KimMarie; Simkins, Scott P.

    2012-01-01

    This article describes Starting Point: Teaching and Learning Economics, a Web-based portal that makes innovative pedagogic resources and effective teaching practices easily accessible to economists. Starting Point introduces economists to teaching innovations through 16 online modules, each containing a general description of a specific pedagogic…

  14. Program Development from Start-to-Finish: A Case Study of the Healthy Relationship and Marriage Education Training Project

    ERIC Educational Resources Information Center

    Futris, Ted G.; Schramm, David G.

    2015-01-01

    What goes into designing and implementing a successful program? How do both research and practice inform program development? In this article, the process through which a federally funded training curriculum was developed and pilot tested is described. Using a logic model framework, important lessons learned are shared in defining the situation,…

  15. An Investigation Into: I) Active Flow Control for Cold-Start Performance Enhancement of a Pump-Assisted, Capillary-Driven, Two-Phase Cooling Loop II) Surface Tension of n-Pentanol + Water, a Self-Rewetting Working Fluid, From 25 °C to 85 °C

    NASA Astrophysics Data System (ADS)

    Bejarano, Roberto Villa

    Cold-start performance enhancement of a pump-assisted, capillary-driven, two-phase cooling loop was attained using proportional-integral and fuzzy logic controls to manage the boiling condition inside the evaporator. The surface tension of aqueous solutions of n-Pentanol, a self-rewetting fluid, was also investigated for enhancing the heat transfer performance of capillary-driven (passive) thermal devices. A proportional-integral control algorithm was used to regulate the boiling condition (from pool boiling to thin-film boiling) and backpressure in the evaporator during cold-start and low heat input conditions. Active flow control improved the thermal resistance at low heat inputs by 50% compared to the baseline (constant flow rate) case, while realizing a total pumping power savings of 56%. Temperature overshoot at start-up was mitigated by combining fuzzy logic with a proportional-integral controller. A constant evaporator surface temperature of 60 °C with a variation of +/-8 °C during start-up was attained with evaporator thermal resistances as low as 0.10 cm2-K/W. The surface tension of aqueous solutions of n-Pentanol as a function of concentration and temperature was also investigated. Self-rewetting working fluids are promising in two-phase heat transfer applications because they have the ability to passively drive additional working fluid towards the heated surface, thereby increasing the dryout limitations of the thermal device. Very little data is available in the literature regarding the surface tension of these fluids due to the complexity involved in fluid handling, heating, and experimentation. Careful experiments were performed to investigate the surface tension of n-Pentanol + water. The concentration and temperature ranges investigated were 0.25 wt% to 1.8 wt% and 25 °C to 85 °C, respectively.

  16. Bytes and Bias: Eliminating Cultural Stereotypes from Educational Software.

    ERIC Educational Resources Information Center

    Miller-Lachmann, Lyn

    1994-01-01

    Presents a 10-point checklist for choosing children's educational software that is free of cultural bias. Each point is illustrated with examples drawn from currently available software. A sidebar lists 25 educational software programs in the areas of social studies, ecology, math and logic, and language arts from which examples were drawn. (three…

  17. ASIC For Complex Fixed-Point Arithmetic

    NASA Technical Reports Server (NTRS)

    Petilli, Stephen G.; Grimm, Michael J.; Olson, Erlend M.

    1995-01-01

    Application-specific integrated circuit (ASIC) performs 24-bit, fixed-point arithmetic operations on arrays of complex-valued input data. High-performance, wide-band arithmetic logic unit (ALU) designed for use in computing fast Fourier transforms (FFTs) and for performing digital filtering functions. Other applications include general computations involved in analysis of spectra and digital signal processing.

  18. Service Provision to Students: Where the Gown Best Fits

    ERIC Educational Resources Information Center

    Schulz, Lucy; Szekeres, Judy

    2008-01-01

    One of the challenges facing those responsible for service provision in universities is ensuring that service is provided at the right point in the organisation. Service delivery points can exist at the school/department level, faculty/division level or central unit/university wide level. This does not always follow organisational logic, common…

  19. Using Spare Logic Resources To Create Dynamic Test Points

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2011-01-01

    A technique has been devised to enable creation of a dynamic set of test points in an embedded digital electronic system. As a result, electronics contained in an application specific circuit [e.g., gate array, field programmable gate array (FPGA)] can be internally probed, even when contained in a closed housing during all phases of test. In the present technique, the test points are not fixed and limited to a small number; the number of test points can vastly exceed the number of buffers or pins, resulting in a compact footprint. Test points are selected by means of spare logic resources within the ASIC(s) and/or FPGA(s). A register is programmed with a command, which is used to select the signals that are sent off-chip and out of the housing for monitoring by test engineers and external test equipment. The register can be commanded by any suitable means: for example, it could be commanded through a command port that would normally be used in the operation of the system. In the original application of the technique, commanding of the register is performed via a MIL-STD-1553B communication subsystem.

  20. Logical errors on proving theorem

    NASA Astrophysics Data System (ADS)

    Sari, C. K.; Waluyo, M.; Ainur, C. M.; Darmaningsih, E. N.

    2018-01-01

    At the tertiary level, students in mathematics education departments attend some abstract courses, such as Introduction to Real Analysis, which demands the ability to prove mathematical statements almost all the time. In fact, many students have not mastered this ability appropriately. In their Introduction to Real Analysis tests, even though they completed their proofs of theorems, they achieved an unsatisfactory score. They thought that they had succeeded, but their proofs were not valid. In this study, qualitative research was conducted to describe the logical errors that students made in proving the theorem of cluster points. The theorem was given to 54 students. Misconceptions in understanding the definitions of cluster point, limit of a function, and limit of a sequence seem to occur. The habit of using routine symbols might cause these misconceptions. Suggestions to deal with this condition are described as well.

  1. Queueing analysis of a canonical model of real-time multiprocessors

    NASA Technical Reports Server (NTRS)

    Krishna, C. M.; Shin, K. G.

    1983-01-01

    A logical classification of multiprocessor structures from the point of view of control applications is presented, and the response-time distribution is computed for a canonical model of a real-time multiprocessor. The multiprocessor is approximated by a blocking model. Two separate models are derived: one from the system's point of view, and the other from the point of view of an incoming task.

  2. Mental models and human reasoning

    PubMed Central

    Johnson-Laird, Philip N.

    2010-01-01

    To be rational is to be able to reason. Thirty years ago psychologists believed that human reasoning depended on formal rules of inference akin to those of a logical calculus. This hypothesis ran into difficulties, which led to an alternative view: reasoning depends on envisaging the possibilities consistent with the starting point—a perception of the world, a set of assertions, a memory, or some mixture of them. We construct mental models of each distinct possibility and derive a conclusion from them. The theory predicts systematic errors in our reasoning, and the evidence corroborates this prediction. Yet, our ability to use counterexamples to refute invalid inferences provides a foundation for rationality. On this account, reasoning is a simulation of the world fleshed out with our knowledge, not a formal rearrangement of the logical skeletons of sentences. PMID:20956326
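    The model-based view of reasoning sketched above, envisage the possibilities consistent with the premises, then look for counterexamples, lends itself to a small computational illustration. This is an invented sketch of the idea, not Johnson-Laird's theory: "mental models" are reduced to truth assignments, and premises to Boolean predicates.

```python
from itertools import product

def models(variables, premises):
    """All 'possibilities' (truth assignments over the variables)
    consistent with every premise."""
    return [dict(zip(variables, vals))
            for vals in product([True, False], repeat=len(variables))
            if all(p(dict(zip(variables, vals))) for p in premises)]

def follows(variables, premises, conclusion):
    """A conclusion is valid iff no consistent possibility refutes it;
    a single counterexample model suffices to reject it."""
    return all(conclusion(m) for m in models(variables, premises))

# Premises: "if it rains, the street is wet" and "it rains".
prem = [lambda m: (not m["rain"]) or m["wet"], lambda m: m["rain"]]
```

    The counterexample mechanism is visible if the second premise is replaced by "the street is wet": the fallacious conclusion "it rains" is then refuted by the possibility in which the street is wet for some other reason.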

  3. Microbial Dark Matter Investigations: How Microbial Studies Transform Biological Knowledge and Empirically Sketch a Logic of Scientific Discovery

    PubMed Central

    Bernard, Guillaume; Pathmanathan, Jananan S; Lannes, Romain; Lopez, Philippe; Bapteste, Eric

    2018-01-01

    Microbes are the oldest and most widespread, phylogenetically and metabolically diverse life forms on Earth. However, they were discovered only 334 years ago, and their diversity started to be seriously investigated even later. For these reasons, microbial studies that unveil novel microbial lineages and processes affecting or involving microbes deeply (and repeatedly) transform knowledge in biology. Considering the quantitative prevalence of taxonomically and functionally unassigned sequences in environmental genomics data sets, and that of uncultured microbes on the planet, we propose that unraveling the microbial dark matter should be identified as a central priority for biologists. Based on former empirical findings of microbial studies, we sketch a logic of discovery with the potential to further highlight the microbial unknowns. PMID:29420719

  4. A Collection of Technical Studies Completed for the Computer-Aided Acquisition and Logistic Support (CALS) Program Fiscal Year 1988. Volume 3. CGM Registration

    DTIC Science & Technology

    1991-03-01

    the array are used cyclically, that is, when the end of the array is reached, the pattern starts over at the beginning. Dashed lines wrap around curves...the dash pattern relative to the start of the path. It is interpreted as a distance into the dash pattern at which the pattern should be started ...cubic seldom is drawn using the four points specified. The curve starts at the first point and ends at the fourth point; the second and third point are

  5. Reconfigurable-logic-based fiber channel network card

    NASA Astrophysics Data System (ADS)

    Casselman, Steve

    1996-10-01

    Currently, all networking hardware must make predefined tradeoffs between latency and bandwidth. In some applications one feature is more important than the other. We present a system where the tradeoff can be made on a case-by-case basis. To show this, we implement an extremely low latency semaphore-passing network within a point-to-point system.

  6. Metalevel programming in robotics: Some issues

    NASA Technical Reports Server (NTRS)

    Kumarn, A.; Parameswaran, N.

    1987-01-01

    Computing in robotics has two important requirements: efficiency and flexibility. Algorithms for robot actions are usually implemented in procedural languages such as VAL and AL. But, since their excessive bindings create inflexible structures of computation, it is proposed that Logic Programming is a more suitable language for robot programming due to its non-determinism, declarative nature, and provision for metalevel programming. Logic Programming, however, can result in inefficient computations. As a solution to this problem, the researchers discuss a framework in which controls can be described to improve efficiency. They divide controls into (1) in-code and (2) metalevel, and discuss them with reference to the selection of rules and dataflow. The merit of Logic Programming is illustrated by modelling the motion of a robot from one point to another while avoiding obstacles.
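    The obstacle-avoiding motion example can be sketched with the declarative/control split the abstract emphasizes: the problem (grid, moves, obstacles) is stated as data, and the control strategy is a separate search procedure. This is a stand-in illustration, not the paper's logic-programming formulation; plain breadth-first search replaces metalevel control, and the grid size is invented.

```python
from collections import deque

def plan(start, goal, obstacles, size=5):
    """Shortest obstacle-avoiding path on a size x size grid.
    The 'logic' is the legality test inside the loop; the 'control'
    is the BFS frontier discipline, kept separate from it."""
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        x, y = path[-1]
        if (x, y) == goal:
            return path
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < size and 0 <= ny < size
                    and (nx, ny) not in obstacles and (nx, ny) not in seen):
                seen.add((nx, ny))
                frontier.append(path + [(nx, ny)])
    return None  # no obstacle-free route exists

route = plan((0, 0), (2, 0), obstacles={(1, 0)})
```

    Swapping BFS for depth-first or best-first search changes efficiency without touching the problem description, which is the flexibility argument the abstract makes for separating logic from control.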

  7. Activity Theory as a Tool to Address the Problem of Chemistry's Lack of Relevance in Secondary School Chemical Education

    ERIC Educational Resources Information Center

    Van Aalsvoort, Joke

    2004-01-01

    In a previous article, the problem of chemistry's lack of relevance in secondary chemical education was analysed using logical positivism as a tool. This article starts with the hypothesis that the problem can be addressed by means of activity theory, one of the important theories within the sociocultural school. The reason for this expectation is…

  8. Science Communication in the Post-Expert Digital Age

    NASA Astrophysics Data System (ADS)

    Luers, Amy; Kroodsma, David

    2014-06-01

    Recently, Popular Science disabled its online comments. In explaining this decision, the magazine cited research that showed that online comments, especially uncivil ones, strongly influence readers, often leading to misleading or incorrect interpretations of the articles. Popular Science wrote, "If you carry out those results to their logical end…you start to see why we feel compelled to hit the `off' switch" [Labarre, 2013].

  9. Answering Questions about Complex Events

    DTIC Science & Technology

    2008-12-19

    in their environment. To reason about events requires a means of describing, simulating, and analyzing their underlying dynamic processes. For our...that are relevant to our goal of connecting inference and reasoning about processes to answering questions about events. We start with a...different event and process descriptions, ontologies, and models. 2.1.1 Logical AI In AI, formal approaches to model the ability to reason about

  10. Parallel Logic Programming and Parallel Systems Software and Hardware

    DTIC Science & Technology

    1989-07-29

    Conference, Dallas TX. January 1985. (55) [Rous75] Roussel, P., "PROLOG: Manuel de Reference et d'Uilisation", Group d' Intelligence Artificielle , Universite d...completed. Tools were provided for software development using artificial intelligence techniques. AI software for massively parallel architectures was started. 1. Introduction We describe research conducted

  11. Self-Aware Computing

    DTIC Science & Technology

    2009-06-01

    to floating point, to multi-level logic. 2 Overview Self-aware computation can be distinguished from existing computational models which are...systems have advanced to the point that the time is ripe to realize such a system. To illustrate, let us examine each of the key aspects of self...servers for each service, there are no single points of failure in the system. If an OS or user core has a failure, one of several introspection cores

  12. Improvements to Earthquake Location with a Fuzzy Logic Approach

    NASA Astrophysics Data System (ADS)

    Gökalp, Hüseyin

    2018-01-01

    In this study, improvements to the earthquake location method were investigated using a fuzzy logic approach proposed by Lin and Sanford (Bull Seismol Soc Am 91:82-93, 2001). The method has certain advantages over inverse methods in terms of eliminating the uncertainties of arrival times and reading errors. Adopting this approach, epicentral locations were determined based on the results of a fuzzy logic space concerning the uncertainties in the velocity models. To map the uncertainties in arrival times into the fuzzy logic space, a trapezoidal membership function was constructed directly from the travel time difference between two stations for the P- and S-arrival times, instead of from P- and S-wave velocity models, eliminating the need for information about the velocity structure of the study area. The results showed that this method worked most effectively when earthquakes occurred away from a network or when the arrival time data contained phase reading errors. To resolve the problems related to determining the epicentral locations of the events, a forward modeling method like the grid search technique was used, applying different logical operations (i.e., intersection, union, and their combination) with the fuzzy logic approach. The locations of the events were determined from the fuzzy logic outputs by searching over a gridded region. Defuzzifying only the grid points with a membership value of 1, obtained by normalizing all the maximum fuzzy output values, resulted in more reliable epicentral locations for the earthquakes than the other approaches. Throughout the process, the center-of-gravity method was used as the defuzzification operation.
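    The three ingredients named in the abstract, a trapezoidal membership function, fuzzy intersection/union across stations, and center-of-gravity defuzzification over a grid, can be sketched generically. The corner parameters, the 1-D grid, and the membership values below are invented for illustration; the paper applies these operations to 2-D epicentral grids built from arrival-time differences.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: rises on [a, b], equals 1 on [b, c],
    falls on [c, d], and is 0 outside [a, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def combine(memberships, mode="intersection"):
    """Fuzzy AND = min over stations, fuzzy OR = max."""
    return min(memberships) if mode == "intersection" else max(memberships)

def centroid(grid, mu):
    """Center-of-gravity defuzzification over grid points."""
    total = sum(mu)
    return sum(x * m for x, m in zip(grid, mu)) / total
```

    Intersection rewards grid points consistent with every station's arrival-time window, while union tolerates a bad reading at one station, which is why the paper compares both and their combination.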

  13. OBJECT REPRESENTATION, IDENTITY, AND THE PARADOX OF EARLY PERMANENCE: Steps Toward a New Framework.

    PubMed

    Meltzoff, Andrew N; Moore, M Keith

    1998-01-01

    The sensorimotor theory of infancy has been overthrown, but there is little consensus on a replacement. We hypothesize that a capacity for representation is the starting point for infant development, not its culmination. Logical distinctions are drawn between object representation, identity, and permanence. Modern experiments on early object permanence and deferred imitation suggest: (a) even for young infants, representations persist over breaks in sensory contact, (b) numerical identity of objects (Os) is initially specified by spatiotemporal criteria (place and trajectory), (c) featural and functional identity criteria develop, (d) events are analyzed by comparing representations to current perception, and (e) representation operates both prospectively, anticipating future contacts with an O, and retrospectively, reidentifying an O as the "same one again." A model of the architecture and functioning of the early representational system is proposed. It accounts for young infants' behavior toward absent people and things in terms of their efforts to determine the identity of objects. Our proposal is developmental without denying innate structure and elevates the power of perception and representation while being cautious about attributing complex concepts to young infants.

  14. OBJECT REPRESENTATION, IDENTITY, AND THE PARADOX OF EARLY PERMANENCE: Steps Toward a New Framework

    PubMed Central

    Meltzoff, Andrew N.; Moore, M. Keith

    2013-01-01

    The sensorimotor theory of infancy has been overthrown, but there is little consensus on a replacement. We hypothesize that a capacity for representation is the starting point for infant development, not its culmination. Logical distinctions are drawn between object representation, identity, and permanence. Modern experiments on early object permanence and deferred imitation suggest: (a) even for young infants, representations persist over breaks in sensory contact, (b) numerical identity of objects (Os) is initially specified by spatiotemporal criteria (place and trajectory), (c) featural and functional identity criteria develop, (d) events are analyzed by comparing representations to current perception, and (e) representation operates both prospectively, anticipating future contacts with an O, and retrospectively, reidentifying an O as the “same one again.” A model of the architecture and functioning of the early representational system is proposed. It accounts for young infants’ behavior toward absent people and things in terms of their efforts to determine the identity of objects. Our proposal is developmental without denying innate structure and elevates the power of perception and representation while being cautious about attributing complex concepts to young infants. PMID:25147418

  15. Teaching Special Relativity Without Calculus

    NASA Astrophysics Data System (ADS)

    Ruby, Lawrence

    2009-04-01

    In 2007 many AAPT members received a booklet that is the first chapter of a physics textbook available on a CD. This book espouses the new educational philosophy of teaching special relativity as the first item in the topic of mechanics. Traditionally, special relativity is part of one or more modern physics chapters at the end of the text,2 and very often this material is never utilized due to time constraints. From a logical standpoint, special relativity is important in satellite communications and in cosmology, as well as in modern physics applications such as atomic theory and high-energy physics. The purpose of this paper is to show that the new philosophy can be carried out in a noncalculus physics course, by demonstrating that all of the principal results of special relativity theory can be obtained by simple algebra. To accomplish this, we shall propose alternate derivations for two results that are usually obtained with calculus. Textbooks2 typically obtain the equations for time dilation and for length contraction from simple considerations based on Einstein's second postulate.3 We shall start from this point.

  16. Band-edge positions in GW: Effects of starting point and self-consistency

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Pasquarello, Alfredo

    2014-10-01

    We study the effect of starting point and self-consistency within GW on the band-edge positions of semiconductors and insulators. Compared to calculations based on a semilocal starting point, the use of a hybrid-functional starting point shows a larger quasiparticle correction for both band-edge states. When the self-consistent treatment is employed, the band-gap opening is found to result mostly from a shift of the valence-band edge. Within the non-self-consistent methods, we analyse the performance of empirical and nonempirical schemes in which the starting point is optimally tuned. We further assess the accuracy of the band-edge positions through the calculation of ionization potentials of surfaces. The ionization potentials for most systems are reasonably well described by one-shot calculations. However, in the case of TiO2, we find that the use of self-consistency is critical to obtain a good agreement with experiment.

  17. Dynamic partial reconfiguration of logic controllers implemented in FPGAs

    NASA Astrophysics Data System (ADS)

    Bazydło, Grzegorz; Wiśniewski, Remigiusz

    2016-09-01

    Technological progress in recent years has yielded digital circuits containing millions of logic gates with the capability for reprogramming and reconfiguring. On the one hand this provides unprecedented computational power, but on the other hand the modelled systems are becoming increasingly complex, hierarchical and concurrent. Therefore, abstract modelling supported by Computer Aided Design tools becomes a very important task. Even the higher consumption of basic electronic components seems acceptable because chip manufacturing costs tend to fall over time. The paper presents a modelling approach for logic controllers using the Unified Modelling Language (UML). Thanks to the Model Driven Development approach, starting with a UML state machine model, through the construction of an intermediate Hierarchical Concurrent Finite State Machine model, a collection of Verilog files is created. The system description generated in a hardware description language can be synthesized and implemented in reconfigurable devices, such as FPGAs. Modular specification of the prototyped controller permits further dynamic partial reconfiguration of the prototyped system. The idea is based on exchanging the functionality of the already implemented controller without stopping the FPGA device: a part (for example a single module) of the logic controller is replaced by another version (called a context), while the rest of the system keeps running. The method is illustrated with a practical example, a Home Area Network system.

  18. Cosmic logic: a computational model

    NASA Astrophysics Data System (ADS)

    Vanchurin, Vitaly

    2016-02-01

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal machines that halt in finite time and immortal machines that run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.
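    The asymmetry underlying the mortal/immortal result can be illustrated with a step-limited simulator: it can certify that a machine halts, but can never certify that one runs forever, only report "unknown" at the cutoff. This is a toy sketch with invented names, not the paper's Turing-machine construction; "machines" here are Python generators yielding once per step.

```python
def run_with_cutoff(machine, cutoff):
    """Simulate `machine` (a generator function) for at most `cutoff`
    steps. Halting within the budget is observable; non-halting is not,
    so the best a cut-off prescription can report is UNKNOWN."""
    steps = machine()
    for i in range(cutoff):
        try:
            next(steps)
        except StopIteration:
            return ("HALTS", i)  # mortal: confirmed after i steps
    return ("UNKNOWN", cutoff)   # cannot distinguish slow from immortal

def mortal():
    """A 'mortal' machine: halts after 3 steps."""
    for _ in range(3):
        yield

def immortal():
    """An 'immortal' machine: runs forever."""
    while True:
        yield
```

    No finite cutoff turns UNKNOWN into IMMORTAL for all machines, which is the intuition behind restricting cut-off measures to machines with a predetermined halting bound.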

  19. The Design of Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, C. Duane; Humphreys, William M.; Fijany, Amir

    2002-01-01

    As transistor geometries are reduced, quantum effects begin to dominate device performance. At some point, transistors cease to have the properties that make them useful computational components. New computing elements must be developed in order to keep pace with Moore's Law. Quantum dot cellular automata (QCA) represent an alternative paradigm to transistor-based logic. QCA architectures that are robust to manufacturing tolerances and defects must be developed. We are developing software that allows the exploration of fault tolerant QCA gate architectures by automating the specification, simulation, analysis and documentation processes.

  20. Knowledge bases built on web languages from the point of view of predicate logics

    NASA Astrophysics Data System (ADS)

    Vajgl, Marek; Lukasová, Alena; Žáček, Martin

    2017-06-01

    The article evaluates formal systems created on the basis of web (ontology/concept) languages that simplify the usual approach of knowledge representation within FOPL while sharing its expressiveness, semantic correctness, completeness and decidability. Evaluation of two of them - one based on description logic and one built on RDF model principles - identifies some shortcomings of those formal systems and presents, where possible, corrections for them. The possibility of building an inference system capable of deriving further knowledge over given knowledge bases, including those describing domains through giant linked domain databases, has been taken into account. Moreover, the directions towards simplifying the FOPL language discussed here have been evaluated from the point of view of the possibility of becoming a web language fulfilling the idea of the semantic web.

  1. The Mark, the Thing, and the Object: On What Commands Repetition in Freud and Lacan.

    PubMed

    Van de Vijver, Gertrudis; Bazan, Ariane; Detandt, Sandrine

    2017-01-01

    In Logique du Fantasme, Lacan argues that the compulsion to repeat does not obey the same discharge logic as homeostatic processes. Repetition installs a realm that is categorically different from the one related to homeostatic pleasure seeking, a properly subjective one, one in which the mark "stands for," "takes the place of," what we have ventured to call "an event," and what only in the movement of return, in what Lacan calls a "thinking of repetition," confirms and ever reconfirms this point of no return, which is also a qualitative cut and a structural loss. The kind of "standing for" Lacan intends here with the concept of repetition is certainly not something like an image or a faithful description. No, what Lacan wishes to stress is that this mark is situated at another level, at another place, it is "entstellt," and as such, it is punctually impinging upon the bodily dynamics without rendering the event, without having an external meta-point of view, but cutting across registers according to a logic that is not the homeostatic memory logic. This paper elaborates on this distinction on the basis of a confrontation with what Freud says about the pleasure principle and its beyond in Beyond the Pleasure Principle, and also takes inspiration from Freud's Project for a Scientific Psychology. We argue that Lacan's theory of enjoyment takes up and generalizes what Freud was after in Beyond the Pleasure Principle with the Wiederholungszwang, and pushes Freud's thoughts to a more articulated point: to the point where a subject is considered to speak only when it has allowed the other, through discourse, to have impacted and cut into his bodily pleasure dynamics.

  2. Temporal and Spatial Variability of Surface Motor Activation Zones in Hemiplegic Patients During Functional Electrical Stimulation Therapy Sessions.

    PubMed

    Malešević, Jovana; Štrbac, Matija; Isaković, Milica; Kojić, Vladimir; Konstantinović, Ljubica; Vidaković, Aleksandra; Dedijer Dujović, Suzana; Kostić, Miloš; Keller, Thierry

    2017-11-01

    The goal of this study was to investigate surface motor activation zones and their temporal variability using an advanced multi-pad functional electrical stimulation system. With this system motor responses are elicited through concurrent activation of electrode matrix pads collectively termed "virtual electrodes" (VEs) with appropriate stimulation parameters. We observed VEs used to produce selective wrist, finger, and thumb extension movements in 20 therapy sessions of 12 hemiplegic stroke patients. The VEs which produce these three selective movements were created manually on the ergonomic multi-pad electrode by experienced clinicians based on visual inspection of the muscle responses. Individual results indicated that changes in VE configuration were required each session for all patients and that overlap in joint movements was evident between some VEs. However, by analyzing group data, we defined the probability distribution over the electrode surface for the three VEs of interest. Furthermore, through Bayesian logic we obtained preferred stimulation zones that are in accordance with our previously reported heuristically obtained results. We have also analyzed the number of active pads and stimulation amplitudes for these three VEs. Presented results provide a basis for an automated electrode calibration algorithm built on a priori knowledge or the starting point for manual selection of stimulation points. © 2017 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
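
    The per-pad probability maps described above can be illustrated with a simple frequency count over recorded VE configurations. This is a hypothetical sketch, not the authors' algorithm: the pad layout, session data, and the top-k selection rule below are all invented for illustration.

```python
# Illustrative sketch (not the study's actual method): estimate a
# per-pad activation probability map from binary virtual-electrode
# (VE) configurations recorded across sessions, then pick the pads
# most often used for a movement as candidate "preferred zones".

def pad_probabilities(sessions):
    """sessions: list of same-length 0/1 lists, one calibrated VE each."""
    n = len(sessions)
    return [sum(col) / n for col in zip(*sessions)]

def preferred_pads(probs, k=2):
    """Indices of the k pads with the highest activation probability."""
    return sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]

# Hypothetical wrist-extension VEs from three sessions on a 4-pad array.
wrist_ves = [[1, 1, 0, 0],
             [1, 0, 1, 0],
             [1, 1, 0, 0]]
probs = pad_probabilities(wrist_ves)  # [1.0, 2/3, 1/3, 0.0]
print(preferred_pads(probs))          # [0, 1]
```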

  3. Inerton fields: very new ideas on fundamental physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krasnoholovets, Volodymyr

    2010-12-22

    Modern theories of everything, or theories of the grand unification of all physical interactions, try to describe the whole world starting from the first principles of quantum theory. However, the first principles operate with undetermined notions, such as the wave {psi}-function, particle, lepton and quark, de Broglie and Compton wavelengths, mass, electric charge, spin, electromagnetic field, photon, gravitation, physical vacuum, space, etc. From a logical point of view this means that such a modern approach to the theory of everything is condemned to failure... Thus, what should we suggest to improve the situation? It seems quite reasonable to first develop a theory of something, which will be able to clarify the major fundamental notions (listed above) that physics operates with every day. What would be a starting point in such an approach? Of course a theory of space as such, because particles and all physical fields emerge just from space. After that, when a particle and fields (and hence the fields' carriers) are well defined and introduced in the well-defined physical space, different kinds of interactions can be proposed and investigated. Moreover, we must also allow for a possible interaction of a created particle with the space that generated the appearance of the particle. The mathematical studies of Michel Bounias and the author have shown what the real physical space is, how the space is constituted, how it is arranged and what its elements are. Having constructed the real physical space we can then derive whatever we wish, in particular, such basic notions as mass, particle and charge. How is the mechanics of such objects (a massive particle, a charged massive particle) organised? The appropriate theory of motion has been called a submicroscopic mechanics of particles, which is developed in the real physical space, not in an abstract phase space, as conventional quantum mechanics is.
    A series of questions arises: can these two mechanics (submicroscopic and conventional quantum mechanics) be unified? What new can such a unification bring us? Can such a submicroscopic mechanics be a starting point for the derivation of the phenomenon of gravity? Can this new theory be a unified physical theory? Does the theory allow experimental verification? These major points have been clarified in detail. And, perhaps, the most intriguing aspect of the theory is the derivation of a new physical field associated with the notion of mass (or rather the inertia of a particle), which has been called the inerton field and which represents the real sense of the particle's wave {psi}-function. This field emerges by analogy with the electromagnetic field associated with the notion of the electric charge. Yes, the postulated inerton field has been tested in a series of different experiments. Even more, the inerton field might have a number of practical applications...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zvi H. Meiksin

    A temporary installation of Transtek's in-mine communications system in the Lake Lynn mine was used in the mine rescue training programs offered by NIOSH in April and May 2002. We developed and implemented a software program that permits point-to-point data transmission through our in-mine system. We also developed a wireless data transceiver for use in a PLC (programmable logic controller) to remotely control longwall mining equipment.

  5. Optimization of Nanowire-Resistance Load Logic Inverter.

    PubMed

    Hashim, Yasir; Sidek, Othman

    2015-09-01

    This study is the first to demonstrate the characteristics optimization of a nanowire resistance-load inverter. Noise margins and the inflection voltage of the transfer characteristic are used as limiting factors in this optimization. Results indicate that the optimization depends on the resistance value: increasing the load resistance increases the noise margins up to a saturation point, beyond which further increases do not improve the noise margins significantly.
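
    The noise-margin criterion used in such an optimization follows the standard static definitions taken at the critical points of the voltage transfer characteristic (VTC). A minimal sketch, with hypothetical voltage values rather than measurements from the nanowire device:

```python
# Standard static noise-margin definitions applied to an inverter's
# VTC critical points; the voltage values below are hypothetical,
# not data from the nanowire inverter in the paper.

def noise_margins(v_ol, v_oh, v_il, v_ih):
    """NM_L = VIL - VOL, NM_H = VOH - VIH (unity-gain-point definition)."""
    return v_il - v_ol, v_oh - v_ih

nm_l, nm_h = noise_margins(v_ol=0.05, v_oh=0.95, v_il=0.35, v_ih=0.60)
print(nm_l, nm_h)  # roughly 0.30 and 0.35 (volts)
```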

  6. Teaching critical thinking in a developmental biology course at an American liberal arts college.

    PubMed

    Adams, Dany S

    2003-01-01

    We all expect our students to learn facts and concepts, but more importantly, we want them to learn how to evaluate new information from an educated and skeptical perspective; that is, we want them to become critical thinkers. For many of us who are scientists and teachers, critical thought is either intuitive or we learned it so long ago that it is not at all obvious how to pass on the skills to our students. Explicitly discussing the logic that underlies the experimental basis of developmental biology is an easy and very successful way to teach critical thinking skills. Here, I describe some simple changes to a lecture course that turn the practice of critical thinking into the centerpiece of the learning process. My starting point is the "Evidence and Antibodies" sidelight in Gilbert's Developmental Biology (2000), which I use as an introduction to the ideas of correlation, necessity and sufficiency, and to the kinds of experiments required to gather each type of evidence: observation ("show it"), loss of function ("block it") and gain of function ("move it"). Thereafter, every experiment can be understood quickly by the class and discussed intelligently with a common vocabulary. Both verbal and written reinforcement of these ideas dramatically improve the students' ability to evaluate new information. In particular, they are able to evaluate claims about cause and effect; they become experts at distinguishing between correlation and causation. Because the intellectual techniques are so powerful and the logic so satisfying, the students come to view the critical assessment of knowledge as a fun puzzle and the rigorous thinking behind formulating a question as an exciting challenge.

  7. Towards an ontological representation of morbidity and mortality in Description Logics.

    PubMed

    Santana, Filipe; Freitas, Fred; Fernandes, Roberta; Medeiros, Zulma; Schober, Daniel

    2012-09-21

    Despite the high coverage of biomedical ontologies, very few sound definitions of death can be found. Nevertheless, this concept has its relevance in epidemiology, such as for data integration within mortality notification systems. We here introduce an ontological representation of the complex biological qualities and processes that inhere in organisms transitioning from life to death. We further characterize them by causal processes and their temporal borders. Several representational difficulties were faced, mainly regarding kinds of processes with blurred or fiat borders that change their type in a continuous rather than discrete mode. Examples of such hard to grasp concepts are life, death and its relationships with injuries and diseases. We illustrate an iterative optimization of definitions within four versions of the ontology, so as to stress the typical problems encountered in representing complex biological processes. We point out possible solutions for representing concepts related to biological life cycles, preserving identity of participating individuals, i.e. for a patient in transition from life to death. This solution however required the use of extended description logics not yet supported by tools. We also focus on the interdependencies and need to change further parts if one part is changed. The axiomatic definition of mortality we introduce allows the description of biologic processes related to the transition from healthy to diseased or injured, and up to a final death state. Exploiting such definitions embedded into descriptions of pathogen transmissions by arthropod vectors, the complete sequence of infection and disease processes can be described, starting from the inoculation of a pathogen by a vector, until the death of an individual, preserving the identity of the patient.

  8. Theoretical computer science and the natural sciences

    NASA Astrophysics Data System (ADS)

    Marchal, Bruno

    2005-12-01

    I present some fundamental theorems in computer science and illustrate their relevance in Biology and Physics. I do not assume prerequisites in mathematics or computer science beyond the set N of natural numbers, functions from N to N, the use of some notational conveniences to describe functions, and at some point, a minimal amount of linear algebra and logic. I start with Cantor's transcendental proof by diagonalization of the non-enumerability of the collection of functions from natural numbers to natural numbers. I explain why this proof is not entirely convincing and show how, by restricting the notion of function in terms of discrete well-defined processes, we are led to the non-algorithmic enumerability of the computable functions, but also, through Church's thesis, to the algorithmic enumerability of partial computable functions. Such a notion of function constitutes, with respect to our purpose, a crucial generalization of that concept. This will make it easy to justify deep and astonishing (counter-intuitive) incompleteness results about computers and similar machines. The modified Cantor diagonalization will provide a theory of concrete self-reference, and I illustrate it by pointing toward an elementary theory of self-reproduction (in the Amoeba's way) and cellular self-regeneration (in the flatworm Planaria's way). To make it easier, I introduce a very simple and powerful formal system known as the Schoenfinkel-Curry combinators. I will use the combinators to illustrate in a more concrete way the notions introduced above. The combinators, thanks to their low-level fine-grained design, will also make it possible to give a rough but hopefully illuminating description of the main lessons gained by the careful observation of nature, and to describe some new relations which should exist between computer science, the science of life and the science of inert matter, once some philosophical, if not theological, hypotheses are made in the cognitive sciences.
In the last section, I come back to self-reference and I give an exposition of its modal logics. This is used to show that theoretical computer science makes those “philosophical hypotheses” in theoretical cognitive science experimentally and mathematically testable.
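
    The Schoenfinkel-Curry combinators mentioned above are simple enough to sketch directly as curried functions. This minimal illustration (mine, not the paper's) shows their combinatorial completeness in action by building the identity and function composition from S and K alone:

```python
# The two basic Schoenfinkel-Curry combinators as curried functions.
# K discards its second argument; S distributes an argument over an
# application. Together they are combinatorially complete.

K = lambda x: lambda y: x                      # K x y   = x
S = lambda x: lambda y: lambda z: x(z)(y(z))   # S x y z = x z (y z)

I = S(K)(K)              # S K K behaves as the identity combinator
assert I(42) == 42

B = S(K(S))(K)           # B = S (K S) K composes: B f g x = f (g x)
double = lambda n: 2 * n
inc = lambda n: n + 1
assert B(double)(inc)(3) == 8   # double(inc(3))
```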

  9. Colliding with the Speed of Light, Using Low-Energy Photon-Photon Collision Study the Nature of Matter and the universe

    NASA Astrophysics Data System (ADS)

    Zhang, Meggie

    2013-03-01

    Our research discovered logical inconsistencies in physics and mathematics. Through reviewing the entire history of physics and mathematics we gained a new understanding of our earlier assumptions, which led to a new interpretation of the wave function and quantum physics. We found that the existing experimental data supported a 4-dimensional fractal structure of matter and the universe; we found that wave, matter and the universe form through the same process, starting from a single particle, and that the process itself is a fractal that contributes to the diversity of matter. We also found physical evidence supporting a non-continuous fractal space structure. The new understanding also led to a reinterpretation of nuclear collision theories; based on this we achieved a room-temperature low-energy photon-photon collision (RT-LE-PPC). This method allowed us to observe a topologically disconnected fractal structure and to simulate the formation of matter and the universe, providing evidence for the nature of light and matter and leading to a quantum-structure interpretation; we also found that the formation of the universe started from two particles. However, this work cannot be understood within current physics theories due to the logical problems in those theories.

  10. Families of Graph Algorithms: SSSP Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanewala Appuhamilage, Thejaka Amila Jay; Zalewski, Marcin J.; Lumsdaine, Andrew

    2017-08-28

    Single-Source Shortest Paths (SSSP) is a well-studied graph problem. Examples of SSSP algorithms include the original Dijkstra's algorithm and the parallel Δ-stepping and KLA-SSSP algorithms. In this paper, we use a novel Abstract Graph Machine (AGM) model to show that all these algorithms share a common logic and differ from one another by the order in which they perform work. We use the AGM model to thoroughly analyze the family of algorithms that arises from the common logic. We start with the basic algorithm without any ordering (Chaotic), and then we derive the existing and new algorithms by methodically exploring semantic and spatial ordering of work. Our experimental results show that the newly derived algorithms perform better than the existing distributed-memory parallel algorithms, especially at higher scales.
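
    The family view described above can be made concrete with its best-known member. In the sketch below (illustrative, not the paper's AGM implementation), the shared logic is "relax edges from reached vertices"; Dijkstra's algorithm is obtained by imposing a strict global distance ordering via a priority queue, whereas Δ-stepping would relax that ordering into coarse distance buckets processed in parallel.

```python
import heapq

# Dijkstra's algorithm as one member of the SSSP family: the shared
# logic is edge relaxation; the priority queue is what imposes the
# strict distance ordering that distinguishes this member.

def dijkstra(graph, source):
    """graph: {u: [(v, weight), ...]}; returns {vertex: distance}."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already settled closer
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

g = {"a": [("b", 4), ("c", 1)], "b": [("d", 1)],
     "c": [("b", 2), ("d", 5)], "d": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 3, 'c': 1, 'd': 4}
```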

  11. Approximate reasoning-based learning and control for proximity operations and docking in space

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Jani, Yashvant; Lea, Robert N.

    1991-01-01

    A recently proposed hybrid neural-network and fuzzy-logic-control architecture is applied to a fuzzy logic controller developed for attitude control of the Space Shuttle. A model using reinforcement learning and learning from past experience for fine-tuning its knowledge base is proposed. The two main components of this approximate reasoning-based intelligent control (ARIC) model - an action-state evaluation network and an action selection network - are described, as well as the Space Shuttle attitude controller. An ARIC model for the controller is presented, and it is noted that the input layer in each network includes three nodes representing the angle error, the angle error rate, and a bias node. Preliminary results indicate that the controller can hold the pitch rate within its desired deadband and starts to use the jets at about 500 sec into the run.

  12. Globalization, Educational Targeting, and Stable Inequalities: A Comparative Analysis of Argentina, Brazil, and Chile

    NASA Astrophysics Data System (ADS)

    Rambla, Xavier

    2006-05-01

    The present study analyzes educational targeting in Argentina, Brazil and Chile from a sociological point of view. It shows that a `logic of induction' has become the vehicle for anti-poverty education strategies meant to help targeted groups improve on their own. The analysis explores the influence of the global educational agenda, the empirical connection between the logic of induction and the mechanism of emulation, and the territorial aspects of educational inequalities. Emulation plays a main role inasmuch as the logic of induction leads targeted groups to compare their adverse situation with more privileged groups, which actually legitimizes inequalities. A brief statistical summary completes the study, showing that educational inequality has remained unchanged as far as urban-rural ratios (in Brazil and Chile) and regional disparities (in all three countries) are concerned.

  13. Implementation Of Fuzzy Automated Brake Controller Using TSK Algorithm

    NASA Astrophysics Data System (ADS)

    Mittal, Ruchi; Kaur, Magandeep

    2010-11-01

    In this paper an application of fuzzy logic to an automatic braking system is proposed. Anti-blocking system (ABS) brake controllers pose unique challenges to the designer: a) for optimal performance, the controller must operate at an unstable equilibrium point; b) depending on road conditions, the maximum braking torque may vary over a wide range; c) the tire slippage measurement signal, crucial for controller performance, is both highly uncertain and noisy. A digital controller design was chosen which combines a fuzzy logic element and a decision logic network. The controller identifies the current road condition and generates a command braking-pressure signal depending upon the speed and distance of the train. This paper describes the design criteria, and the decision and rule structure of the control system. The simulation results present the system's performance for varying speeds and distances of the train.
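
    The way a TSK-style scheme turns fuzzy firing strengths into a crisp command signal can be sketched in a few lines. The membership breakpoints and brake-pressure consequents below are hypothetical placeholders, not the tuned values of the proposed controller:

```python
# Minimal zero-order Takagi-Sugeno-Kang (TSK) inference sketch: each
# rule's firing strength weights a crisp consequent, and the output
# is their weighted average. All numbers are invented placeholders.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def brake_pressure(speed_kmh, distance_m):
    # Rule 1: IF speed is HIGH and distance is SHORT -> pressure 0.9
    # Rule 2: IF speed is LOW  and distance is LONG  -> pressure 0.1
    w1 = min(tri(speed_kmh, 40, 120, 200), tri(distance_m, 0, 50, 400))
    w2 = min(tri(speed_kmh, 0, 30, 80), tri(distance_m, 100, 500, 900))
    if w1 + w2 == 0:
        return 0.0
    return (w1 * 0.9 + w2 * 0.1) / (w1 + w2)

# Fast train close to the stopping point -> near-maximal pressure.
assert abs(brake_pressure(100, 120) - 0.9) < 1e-9
# Slow train far away -> gentle braking.
assert abs(brake_pressure(15, 600) - 0.1) < 1e-9
```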

  14. Development of a methodology for assessing the safety of embedded software systems

    NASA Technical Reports Server (NTRS)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.
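
    The timed-fault-tree idea, a logical combination of conditions on system parameters at different points in time, can be illustrated with a toy trace check. The variables and thresholds are invented for illustration and are not part of DFM itself:

```python
# Toy illustration of a "timed" top event: a logical combination of
# conditions evaluated at different points in time along a trace.
# The variables and the unsafe condition are invented examples.

def top_event(trace):
    """trace: list of {'valve_open': bool, 'pressure': float} per step.
    Unsafe: valve opens at t while pressure was already high at t-1."""
    return any(trace[t]["valve_open"] and trace[t - 1]["pressure"] > 8.0
               for t in range(1, len(trace)))

safe = [{"valve_open": False, "pressure": 9.0},
        {"valve_open": False, "pressure": 5.0}]
unsafe = [{"valve_open": False, "pressure": 9.0},
          {"valve_open": True, "pressure": 5.0}]
assert top_event(safe) is False
assert top_event(unsafe) is True
```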

  15. The logic of relations and the logic of management.

    PubMed

    Buntinx, W

    2008-07-01

    Increasing emphasis on financial and administrative control processes is affecting the service culture in support organisations for persons with intellectual disability. This phenomenon is currently obvious in Dutch service organisations that find themselves in transition towards more community care and at the same time under pressure from a new administrative and funding bureaucracy. As a result, the logic of management is becoming more dominant in direct support settings and risks overshadowing the logic of relationships between staff and clients. The article presents a reflection on this phenomenon, starting from a description of service team characteristics as found in the literature. Next, findings about direct support staff (DSS) continuity are summarised from four Dutch studies. Following up on these findings, the concept of 'microsystems' is explored as a possible answer to the organisational challenges demonstrated in the studies. Team characteristics, especially team size and membership continuity for DSS, appear to be relevant factors for assuring supportive relationships and service quality in direct support teams. The structure of the primary support team is of special interest. The organisational concept of 'microsystems' is explored with respect to transcending the present conflict between bureaucratic managerial pressure and the need for supportive relationships. Service organisations need to create structural conditions for the efficacy of direct support teams in terms of client relationships and relevant client outcomes. At the same time, the need for administrative and control processes cannot be denied. The concept of 'microsystems', the application of a Quality of Life framework and the use of new instruments, such as the Supports Intensity Scale, can contribute to an organisational solution for the presently conflicting logics of relations and management.

  16. Logic programming to infer complex RNA expression patterns from RNA-seq data.

    PubMed

    Weirick, Tyler; Militello, Giuseppe; Ponomareva, Yuliya; John, David; Döring, Claudia; Dimmeler, Stefanie; Uchida, Shizuka

    2018-03-01

    To meet the increasing demand in the field, numerous long noncoding RNA (lncRNA) databases are available. Given that many lncRNAs are specifically expressed in certain cell types and/or in time-dependent manners, most lncRNA databases fall short of providing such profiles. We developed a strategy using logic programming to handle the complex organization of organs, their tissues and cell types, as well as gender and developmental time points. To showcase this strategy, we introduce 'RenalDB' (http://renaldb.uni-frankfurt.de), a database providing expression profiles of RNAs in major organs with a focus on kidney tissues and cells. RenalDB uses logic programming to describe complex anatomy, sample metadata and logical relationships defining expression, enrichment or specificity. We validated the content of RenalDB with biological experiments and functionally characterized two long intergenic noncoding RNAs: LOC440173 is important for cell growth or cell survival, whereas PAXIP1-AS1 is a regulator of cell death. We anticipate RenalDB will be used as a first step toward functional studies of lncRNAs in the kidney.
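
    The idea of using logic programming to derive expression facts across an anatomical hierarchy can be sketched as a toy forward-chaining loop. The facts below are hypothetical (only the gene name LOC440173 comes from the abstract), and RenalDB's actual rule base is of course far richer:

```python
# Toy forward chaining: anatomy facts plus a propagation rule let
# "expressed in kidney" follow from "expressed in glomerulus".
# Facts and tissue names are invented for illustration.

facts = {("part_of", "glomerulus", "kidney"),
         ("part_of", "proximal_tubule", "kidney"),
         ("expressed_in", "LOC440173", "glomerulus")}

def infer(facts):
    """Apply the propagation rule until no new fact is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = set()
        for rel, gene, tissue in facts:
            if rel != "expressed_in":
                continue
            for rel2, part, whole in facts:
                if rel2 == "part_of" and part == tissue:
                    new.add(("expressed_in", gene, whole))
        if not new <= facts:
            facts |= new
            changed = True
    return facts

derived = infer(facts)
assert ("expressed_in", "LOC440173", "kidney") in derived
```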

  17. Logical synchronization: how evidence and hypotheses steer atomic clocks

    NASA Astrophysics Data System (ADS)

    Myers, John M.; Madjid, F. Hadi

    2014-05-01

    A clock steps a computer through a cycle of phases. For the propagation of logical symbols from one computer to another, each computer must mesh its phases with arrivals of symbols from other computers. Even the best atomic clocks drift unforeseeably in frequency and phase; feedback steers them toward aiming points that depend on a chosen wave function and on hypotheses about signal propagation. A wave function, always under-determined by evidence, requires a guess. Guessed wave functions are coded into computers that steer atomic clocks in frequency and position - clocks that step computers through their phases of computations, as well as clocks, some on space vehicles, that supply evidence of the propagation of signals. Recognizing the dependence of the phasing of symbol arrivals on guesses about signal propagation elevates 'logical synchronization' from its practice in computer engineering to a discipline essential to physics. Within this discipline we begin to explore questions invisible under any concept of time that fails to acknowledge the unforeseeable. In particular, variation of spacetime curvature is shown to limit the bit rate of logical communication.

  18. Privacy Impact Assessment for the eDiscovery Service

    EPA Pesticide Factsheets

    This system collects Logical Evidence Files, which include data from workstations, laptops, SharePoint and document repositories. Learn how the data is collected, used, who has access, the purpose of data collection, and record retention policies.

  19. Reconciling pairs of concurrently used clinical practice guidelines using Constraint Logic Programming.

    PubMed

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken

    2011-01-01

    This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) that occur when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies arises when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism that, coupled with a guideline execution engine, warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing the simultaneous use of CPGs for duodenal ulcer and transient ischemic attack.
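
    The core constraint idea (candidate actions from each guideline, with binary constraints excluding contradictory combinations) can be sketched by brute-force enumeration. The drug names and the contention rule below are invented for illustration; the paper's procedure uses a real CLP solver and clinically validated guideline models:

```python
from itertools import product

# Brute-force sketch of the constraint idea: each guideline offers
# candidate actions, and a constraint rules out contradictory pairs
# (a "point of contention"). Medical content is invented.

ulcer_actions = ["PPI", "NSAID_stop"]
tia_actions = ["aspirin", "clopidogrel"]

def consistent(plan):
    ulcer, tia = plan
    # Hypothetical contention: aspirin conflicts with the ulcer
    # regimen unless gastro-protection (PPI) is part of the plan.
    if tia == "aspirin" and ulcer != "PPI":
        return False
    return True

plans = [p for p in product(ulcer_actions, tia_actions) if consistent(p)]
print(plans)  # the contentious pair is filtered out
```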

  20. Fast logic?: Examining the time course assumption of dual process theory.

    PubMed

    Bago, Bence; De Neys, Wim

    2017-01-01

    Influential dual process models of human thinking posit that reasoners typically produce a fast, intuitive heuristic (i.e., Type 1) response which might subsequently be overridden and corrected by slower, deliberative processing (i.e., Type 2). In this study we directly tested this time course assumption. We used a two-response paradigm in which participants have to give an immediate answer and afterwards are allowed extra time before giving a final response. In four experiments we used a range of procedures (e.g., challenging response deadline, concurrent load) to knock out Type 2 processing and make sure that the initial response was intuitive in nature. Our key finding is that we frequently observe correct, logical responses as the first, immediate response. Response confidence and latency analyses indicate that these initial correct responses are given fast, with high confidence, and in the face of conflicting heuristic responses. Findings suggest that fast and automatic Type 1 processing also cues a correct logical response from the start. We sketch a revised dual process model in which the relative strength of different types of intuitions determines reasoning performance. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Fuzzy – PI controller to control the velocity parameter of Induction Motor

    NASA Astrophysics Data System (ADS)

    Malathy, R.; Balaji, V.

    2018-04-01

    The major applications of the induction motor are in industry, because of its high robustness, reliability, low cost, high efficiency and good self-starting capability. Even though it has the above-mentioned advantages, it also has some limitations: (1) the standard motor is not a true constant-speed machine, its full-load slip varying by less than 1 % (in high-horsepower motors); and (2) it is not inherently capable of providing variable-speed operation. In order to solve the above-mentioned problems, smart motor controls and variable-speed controllers are used. Motor applications involve nonlinear features, which a fuzzy logic controller can handle with high efficiency, acting similarly to a human operator. This paper presents the individuality of the plant modelling. The fuzzy logic controller (FLC) relies on a set of linguistic if-then rules, a rule-based Mamdani scheme for a closed-loop induction motor model. The motor model is designed and membership functions are chosen according to the parameters of the motor model. The simulation results capture the nonlinearity in the induction motor model. A conventional PI controller is compared in practice to the fuzzy logic controller using Simulink.

  2. A Logical Basis In The Layered Computer Vision Systems Model

    NASA Astrophysics Data System (ADS)

    Tejwani, Y. J.

    1986-03-01

    In this paper a four-layer computer vision system model is described. The model uses a finite-memory scratch pad. In this model planar objects are defined as predicates. Predicates are relations on a k-tuple. The k-tuple consists of primitive points and relationships between primitive points. The relationships between points can be of the direct type or the indirect type. Entities are goals which are satisfied by a set of clauses. The grammar used to construct these clauses is examined.

  3. Living in the WOW of Your Ideas

    ERIC Educational Resources Information Center

    Mabry, M. Parker

    2012-01-01

    Recently, the author got to thinking about some of the ideas that have crossed her mind in the last couple of weeks. The list made her smile. And as she went over it point by point in her head she tried to determine what, if any, reasonable or logical patterns were emerging in her myriad of ideas. The four divergent ideas presented in this article…

  4. Simulation of the MELiSSA closed loop system as a tool to define its integration strategy

    NASA Astrophysics Data System (ADS)

    Poughon, Laurent; Farges, Berangere; Dussap, Claude-Gilles; Godia, Francesc; Lasseur, Christophe

    Inspired by a terrestrial ecosystem, MELiSSA (Micro Ecological Life Support System Alternative) is a project for a closed life support system for future long-term manned missions (Moon and Mars bases). Started by ESA in 1989, this 5-compartment concept has evolved following a mechanistic engineering approach for acquiring both theoretical and technical knowledge. In its current state of development the project can now start to demonstrate the MELiSSA loop concept at a pilot scale. Thus an integration strategy for a MELiSSA Pilot Plant (MPP) was defined, describing the different phases for tests and connections between compartments. The integration steps should start in 2008 and be completed with a fully operational loop in 2015, whose final objective is to achieve a 100 % closed liquid and gas loop. Although the integration logic could start with the most advanced processes in terms of knowledge and hardware development, this logic needs to be complemented by extensive simulation. Thanks to this simulation exercise, the effective demonstration of each independent process and its progressive coupling with the others will be performed in operational conditions as close as possible to the final configuration. The theoretical approach described in this paper is based on mass-balance models of each of the MELiSSA biological compartments, which are used to simulate each integration step and the complete MPP loop itself. These simulations will help to identify the criticalities of each integration step and to check the consistency between objectives, flows, recycling efficiencies and sizing of the pilot reactors. An MPP scenario compatible with the current knowledge of the operation of the pilot reactors was investigated and the theoretical performances of the system compared to the objectives of the MPP. From this scenario the most important milestone steps in the integration are highlighted and their behaviour can be simulated.
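
    The role of such mass-balance simulations can be illustrated with a deliberately tiny loop-closure model. Everything below is invented (MELiSSA's models track many more compartments and chemical species); the sketch only shows why recycling efficiency dominates the storage budget of a closed loop:

```python
# Deliberately tiny loop-closure model in the spirit of the
# mass-balance simulations described above. All numbers are invented.

def storage_needed(eff, days):
    """Mass drawn from storage over a mission, per unit of daily crew
    demand, when a fraction `eff` of each day's waste returns as
    usable supply after passing through the recycling compartments."""
    storage_used = 0.0
    for _ in range(days):
        recycled = 1.0 * eff            # yesterday's waste, recycled
        storage_used += 1.0 - recycled  # shortfall covered by storage
    return storage_used

# 90 % loop closure over 100 days draws about 10 units from storage,
# versus 100 units with no recycling at all.
print(storage_needed(0.9, 100), storage_needed(0.0, 100))
```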

  5. AgRISTARS: Foreign commodity production forecasting. Corn/soybean decision logic development and testing

    NASA Technical Reports Server (NTRS)

    Dailey, C. L.; Abotteen, K. M. (Principal Investigator)

    1980-01-01

    The development and testing of an analysis procedure designed to improve the consistency and objectivity of crop identification using Landsat data is described. The procedure was developed to identify corn and soybean crops in the U.S. corn belt region. It consists of a series of decision points arranged in a tree-like structure, the branches of which lead an analyst to crop labels. The specific decision logic is designed to maximize the objectivity of the identification process and to promote the possibility of future automation. Significant results are summarized.

  6. An efficient temporal logic for robotic task planning

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey M.

    1989-01-01

    Computations required for temporal reasoning can be prohibitively expensive if fully general representations are used. Overly simple representations, such as a totally ordered sequence of time points, are inadequate for use in a nonlinear task planning system. A middle ground is identified which is general enough to support a capable nonlinear task planner, but specialized enough that the system can support online task planning in real time. A Temporal Logic System (TLS) was developed during the Intelligent Task Automation (ITA) project to support robotic task planning. TLS is also used within the ITA system to support plan execution, monitoring, and exception handling.

  7. Exploring the Feasibility of a DNA Computer: Design of an ALU Using Sticker-Based DNA Model.

    PubMed

    Sarkar, Mayukh; Ghosal, Prasun; Mohanty, Saraju P

    2017-09-01

    Since its inception, DNA computing has advanced to offer an extremely powerful, energy-efficient emerging technology for solving hard computational problems with its inherent massive parallelism and extremely high data density. It would be much more powerful and general-purpose when combined with the well-known algorithmic solutions that exist for conventional computing architectures, via a suitable ALU. Thus, a specifically designed DNA Arithmetic and Logic Unit (ALU) that can address operations suitable for both domains can bridge the gap between the two. An ALU must be able to perform all possible logic operations (NOT, OR, AND, XOR, NOR, NAND, and XNOR), comparisons and shifts, and integer and floating-point arithmetic operations (addition, subtraction, multiplication, and division). In this paper, the design of such an ALU is proposed using the sticker-based DNA model, with an experimental feasibility analysis. The novelties of this paper are manifold. First, the integer arithmetic operations use 2's-complement arithmetic, and the floating-point operations follow the IEEE 754 floating-point format, closely resembling a conventional ALU. Also, the output of each operation can be reused for any subsequent operation, so any algorithm or program logic that users can think of can be implemented directly on the DNA computer without modification. Second, once the basic operations of the sticker model are automated, the implementations proposed in this paper become highly suitable for designing a fully automated ALU. Third, the proposed approaches are easy to implement. Finally, they can work on sufficiently large binary numbers.
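    The conventional semantics the abstract says this DNA ALU reproduces (bitwise logic on fixed-width words, 2's-complement negation, IEEE 754 bit patterns) can be sketched in ordinary software. This is an illustration of the target behaviour only, not the sticker-model implementation; the 8-bit word width is an arbitrary choice.

    ```python
    import struct

    WIDTH = 8
    MASK = (1 << WIDTH) - 1  # keep results within the fixed word width

    def twos_complement(x):
        """2's-complement negation on a fixed-width word."""
        return (~x + 1) & MASK

    def alu_logic(a, b):
        """The bitwise logic operations an ALU must provide."""
        return {"AND": a & b, "OR": a | b, "XOR": a ^ b,
                "NAND": ~(a & b) & MASK, "NOR": ~(a | b) & MASK,
                "XNOR": ~(a ^ b) & MASK, "NOT_A": ~a & MASK}

    def float_bits(x):
        """IEEE 754 single-precision bit pattern of x."""
        return struct.unpack(">I", struct.pack(">f", x))[0]
    ```

    For example, `twos_complement(1)` yields `0xFF` on an 8-bit word, and `float_bits(1.0)` yields `0x3F800000`, the IEEE 754 binary32 encoding of 1.0.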

  8. An Invitation to Open Innovation in Malaria Drug Discovery: 47 Quality Starting Points from the TCAMS.

    PubMed

    Calderón, Félix; Barros, David; Bueno, José María; Coterón, José Miguel; Fernández, Esther; Gamo, Francisco Javier; Lavandera, José Luís; León, María Luisa; Macdonald, Simon J F; Mallo, Araceli; Manzano, Pilar; Porras, Esther; Fiandor, José María; Castro, Julia

    2011-10-13

    In 2010, GlaxoSmithKline published the structures of 13533 chemical starting points for antimalarial lead identification. By using an agglomerative structural clustering technique followed by computational filters such as antimalarial activity, physicochemical properties, and dissimilarity to known antimalarial structures, we have identified 47 starting points for lead optimization. Their structures are provided. We invite potential collaborators to work with us to discover new clinical candidates.

  9. The service-driven service company.

    PubMed

    Schlesinger, L A; Heskett, J L

    1991-01-01

    For more than 40 years, service companies like McDonald's prospered with organizations designed according to the principles of traditional mass-production manufacturing. Today that model is obsolete. It inevitably degrades the quality of service a company can provide by setting in motion a cycle of failure that produces dissatisfied customers, unhappy employees, high turnover among both--and so lower profits and lower productivity overall. The cycle starts with human resource policies that minimize the contributions frontline workers can make: jobs are designed to be idiot-proof. Technology is used largely for monitoring and control. Pay is poor. Training is minimal. Performance expectations are abysmally low. Today companies like Taco Bell, Dayton Hudson, and ServiceMaster are reversing the cycle of failure by putting workers with customer contact first and designing the business system around them. As a result, they are developing a model that replaces the logic of industrialization with a new service-driven logic. This logic values investments in people as much as investments in technology, and sometimes more; uses technology to support the efforts of workers on the front lines, not just to monitor or replace them; makes recruitment and training crucial for everyone; and links compensation to performance for employees at every level. To justify these investments, the new logic draws on innovative data such as the incremental profits of loyal customers and the total costs of lost employees. Its benefits are becoming clear in higher profits and higher pay--results that competitors bound to the old industrial model will not be able to match.

  10. Value innovation: the strategic logic of high growth.

    PubMed

    Kim, W C; Mauborgne, R

    1997-01-01

    Why are some companies able to sustain high growth in revenues and profits--and others are not? To answer that question, the authors, both of INSEAD, spent five years studying more than 30 companies around the world. They found that the difference between the high-growth companies and their less successful competitors was in each group's assumptions about strategy. Managers of the less successful companies followed conventional strategic logic. Managers of the high-growth companies followed what the authors call the logic of value innovation. Conventional strategic logic and value innovation differ along the basic dimensions of strategy. Many companies take their industry's conditions as given; value innovators don't. Many companies let competitors set the parameters of their strategic thinking; value innovators do not use rivals as benchmarks. Rather than focus on the differences among customers, value innovators look for what customers value in common. Rather than view opportunities through the lens of existing assets and capabilities, value innovators ask, What if we start anew? The authors tell the story of the French hotelier Accor, which discarded the notion of what a hotel is supposed to look like in order to offer what most customers want: a good night's sleep at a low price. And Virgin Atlantic challenged industry conventions by eliminating first-class service and channeling savings into innovations for business-class passengers. Those companies didn't set out to build advantages over the competition, but they ended up achieving the greatest competitive advantages.

  11. Structural Area Inspection Frequency Evaluation (SAIFE). Volume III. Demonstration Input, Inspection Survey, and MRR Data

    DTIC Science & Technology

    1978-04-01

    1.7 Production Rate Change Time; 1.8 Time of Fatigue Test Start; 1.9 Fatigue Test Acceleration Factor; 1.10 Corrosion ... simulation logic. SAIFE accounts for the following factors: (1) aircraft design analysis; (2) component and full-scale fatigue testing; (3) production reliability; production, service, and corrosion defects; crack or corrosion detection probability; crack ...

  12. Direct Digital Control of HVAC (Heating, Ventilating, and Air Conditioning).

    DTIC Science & Technology

    1985-01-01

    ... controller functions such as time-of-day, economizer cycles, reset, load shedding, chiller optimization, VAV fan synchronization, and optimum start/stop ... control system such as that illustrated in Figure 4. Data on setpoints, reset schedules, and event timing, such as that presented in Figure 6, are ... program code (Figure 7). In addition to the control logic, setpoint and other data are readily available. Program logic, setpoint and schedule data, and ...

  13. Symposium Proceedings: Productivity Enhancement: Personnel Performance Assessment in Navy Systems, held October 12-14, 1977,

    DTIC Science & Technology

    1977-01-01

    ... principles apply; however, special attention has to be given early in analysis to the number and kinds of discriminations required of the human observer ... demands, to store, or to output desired information. Typically, these are not insurmountable problems, but they have to receive their due attention ... attention to calibration, data identification, noise, drift, and measurement start/stop logic. Manual systems require special attention to the reliability of ...

  14. Performance bounds on parallel self-initiating discrete-event

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The use of massively parallel architectures to execute discrete-event simulations of what are termed self-initiating models is considered. A logical process in a self-initiating model schedules its own state re-evaluation times, independently of any other logical process, and sends its new state to other logical processes following the re-evaluation. Of interest are the effects of that communication on synchronization. The performance of various synchronization protocols is considered by deriving upper and lower bounds on optimal performance, upper bounds on Time Warp's performance, and lower bounds on the performance of a new conservative protocol. The analysis of Time Warp includes the overhead costs of state-saving and rollback. The analysis points out sufficient conditions for the conservative protocol to outperform Time Warp, and quantifies the sensitivity of performance to message fan-out, lookahead ability, and the probability distributions underlying the simulation.
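    The self-initiating model itself is straightforward to sketch sequentially: each logical process draws its own next re-evaluation time, independently of the others, and fans its new state out to its neighbours. The sketch below is illustrative only (a sequential event loop with assumed exponential inter-event times and a fan-out of two on a ring), not the paper's parallel synchronization protocols or performance bounds.

    ```python
    import heapq
    import random

    def simulate(num_lps, end_time, mean_step=1.0, seed=0):
        """Sequential sketch of self-initiating logical processes (LPs)."""
        rng = random.Random(seed)
        state = [0] * num_lps
        inbox = [[] for _ in range(num_lps)]  # messages received by each LP
        # Event queue of (time, lp_id): each LP schedules its own events.
        queue = [(rng.expovariate(1 / mean_step), lp) for lp in range(num_lps)]
        heapq.heapify(queue)
        events = 0
        while queue:
            t, lp = heapq.heappop(queue)
            if t > end_time:
                break
            state[lp] += 1  # re-evaluate this LP's own state
            # Fan the new state out to the two ring neighbours.
            for nb in ((lp - 1) % num_lps, (lp + 1) % num_lps):
                inbox[nb].append((t, state[lp]))
            # Independently schedule this LP's next re-evaluation.
            heapq.heappush(queue, (t + rng.expovariate(1 / mean_step), lp))
            events += 1
        return state, events
    ```

    In a parallel execution, the fan-out messages are exactly what a conservative or optimistic (Time Warp) protocol must synchronize; here they are simply recorded.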

  15. Autonomous vehicle motion control, approximate maps, and fuzzy logic

    NASA Technical Reports Server (NTRS)

    Ruspini, Enrique H.

    1993-01-01

    Progress on research on the control of actions of autonomous mobile agents using fuzzy logic is presented. The innovations described encompass theoretical and applied developments. At the theoretical level, results are presented of research leading to the combined utilization of conventional artificial planning techniques with fuzzy logic approaches for the control of local motion and perception actions. Formulations of dynamic programming approaches to optimal control in the context of the analysis of approximate models of the real world are examined, and a new approach to goal conflict resolution that does not require specification of numerical values representing relative goal importance is reviewed. Applied developments include the introduction of the notion of an approximate map: a fuzzy relational database structure for the representation of vague and imprecise information about the robot's environment is proposed. The central notions of control point and control structure are also discussed.

  16. Heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer.

    PubMed

    Evans, Jonathan St B T; Over, David E

    2010-05-01

    Marewski, Gaissmaier and Gigerenzer (2009) present a review of research on fast and frugal heuristics, arguing that complex problems are best solved by simple heuristics, rather than the application of knowledge and logical reasoning. We argue that the case for such heuristics is overrated. First, we point out that heuristics can often lead to biases as well as effective responding. Second, we show that the application of logical reasoning can be both necessary and relatively simple. Finally, we argue that the evidence for a logical reasoning system that co-exists with simpler heuristic forms of thinking is overwhelming. Not only is it implausible a priori that we would have evolved such a system that is of no use to us, but extensive evidence from the literature on dual processing in reasoning and judgement shows that many problems can only be solved when this form of reasoning is used to inhibit and override heuristic thinking.

  17. Competing Logics and Healthcare

    PubMed Central

    Saks, Mike

    2018-01-01

    This paper offers a short commentary on the editorial by Mannion and Exworthy. The paper highlights the positive insights offered by their analysis into the tensions between the competing institutional logics of standardization and customization in healthcare, in part manifested in the conflict between managers and professionals, and endorses the plea of the authors for further research in this field. However, the editorial is criticized for its lack of a strong societal reference point, the comparative absence of focus on hybridization, and its failure to highlight structural factors impinging on the opposing logics in a broader neo-institutional framework. With reference to the Procrustean metaphor, it is argued that greater stress should be placed on the healthcare user in future health policy. Finally, the case of complementary and alternative medicine is set out which – while not explicitly mentioned in the editorial – most effectively concretizes the tensions at the heart of this analysis of healthcare. PMID:29626406

  18. [A functional analysis of healthcare auditors' skills in Venezuela, 2008].

    PubMed

    Chirinos-Muñoz, Mónica S

    2010-10-01

    Functional analysis was used to identify the basic, working, specific and generic skills and values which a health service auditor must have. The functional analysis technique was implemented with 10 experts, identifying specific, basic and generic skills and values by means of deductive logic. A functional map was obtained, starting from a key purpose based on improving healthcare and service quality, from which three key functions emerged. The main functions and skill units were then broken down into the competency elements defining what a health service auditor is able to do. This functional map (following functional analysis methodology) shows in detail the simple and complex tasks which a healthcare auditor should perform in the workplace, adopting a forward-looking management approach for improving healthcare and health service quality. The methodology, based on logical-deductive reasoning, provides consensual expert information validating each element of the overall skill set.

  19. The fundamental downscaling limit of field effect transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mamaluy, Denis, E-mail: mamaluy@sandia.gov; Gao, Xujiao

    2015-05-11

    We predict that within the next 15 years a fundamental down-scaling limit for CMOS technology and other Field-Effect Transistors (FETs) will be reached. Specifically, we show that at room temperature all FETs, irrespective of their channel material, will start experiencing an unacceptable level of thermally induced errors around 5-nm gate lengths. These findings were confirmed by performing quantum mechanical transport simulations for a variety of 6-, 5-, and 4-nm gate length Si devices, optimized to satisfy the high-performance logic specifications of the ITRS. Different channel materials and wafer/channel orientations have also been studied; it is found that altering channel-source-drain materials achieves only an insignificant increase in switching energy, which overall cannot sufficiently delay the approaching downscaling limit. Alternative possibilities are discussed to continue the increase of logic element densities for room temperature operation below the said limit.
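    The thermal-error argument can be illustrated with a textbook Boltzmann estimate (a sketch of the underlying physics, not the authors' quantum transport simulations): the probability that thermal agitation carries a device over a switching energy barrier E_b falls off as exp(-E_b/kT), so as shrinking gate lengths force smaller barriers, the per-switching error rate climbs.

    ```python
    import math

    KT_300K_EV = 0.02585  # thermal energy kT at room temperature (300 K), in eV

    def thermal_error_probability(barrier_ev):
        """Boltzmann estimate of a thermally induced switching error."""
        return math.exp(-barrier_ev / KT_300K_EV)
    ```

    A barrier of about 1 eV (roughly 40 kT) gives an error probability near 1e-17 per switching event, while a barrier of only a few kT gives error rates far too high for reliable logic.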

  1. Vapor cycle energy system for implantable circulatory assist devices. Annual progress report, Jul 1975--May 1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watelet, R.P.; Ruggles, A.E.; Hagen, K.G.

    1976-05-01

    The development status of a heart assist system driven by a nuclear fueled, electronically controlled vapor cycle engine termed the tidal regenerator engine (TRE) is described. The TRE pressurization is controlled by a torque motor coupled to a displacer. The electrical power for the sensor, electronic logic and actuator is provided by thermoelectric modules interposed between the engine superheater and boiler. The TRE is direct-coupled to an assist blood pump which also acts as a blood-cooled heat exchanger, pressure-volume transformer and sensor for the electronic logic. Engine cycle efficiency in excess of 14% has been demonstrated routinely. Overall system efficiency of over 9% at 33 watts has been demonstrated. A binary version of this engine in the annular configuration is now being tested. The preliminary tests demonstrated 10% cycle efficiency on the first buildup, which ran well and started easily.

  2. Activity theory as a tool to address the problem of chemistry's lack of relevance in secondary school chemical education

    NASA Astrophysics Data System (ADS)

    van Aalsvoort, Joke

    In a previous article, the problem of chemistry's lack of relevance in secondary chemical education was analysed using logical positivism as a tool. This article starts with the hypothesis that the problem can be addressed by means of activity theory, one of the important theories within the sociocultural school. The reason for this expectation is that, while logical positivism creates a divide between science and society, activity theory offers a model of society in which science and society are related. With the use of this model, a new course for grade nine has been constructed. This results in a confirmation of the hypothesis, at least at a theoretical level. A comparison with the Salters' approach is made in order to demonstrate the relative merits of a mediated way of dealing with the problem of the lack of relevance of chemistry in chemical education.

  3. Floating point only SIMD instruction set architecture including compare, select, Boolean, and alignment operations

    DOEpatents

    Gschwind, Michael K [Chappaqua, NY

    2011-03-01

    Mechanisms for implementing a floating point only single instruction multiple data instruction set architecture are provided. A processor is provided that comprises an issue unit, an execution unit coupled to the issue unit, and a vector register file coupled to the execution unit. The execution unit has logic that implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA). The floating point vector registers of the vector register file store both scalar and floating point values as vectors having a plurality of vector elements. The processor may be part of a data processing system.

  4. Physics and operation oriented activities in preparation of the JT-60SA tokamak exploitation

    NASA Astrophysics Data System (ADS)

    Giruzzi, G.; Yoshida, M.; Artaud, J. F.; Asztalos, Ö.; Barbato, E.; Bettini, P.; Bierwage, A.; Boboc, A.; Bolzonella, T.; Clement-Lorenzo, S.; Coda, S.; Cruz, N.; Day, Chr.; De Tommasi, G.; Dibon, M.; Douai, D.; Dunai, D.; Enoeda, M.; Farina, D.; Figini, L.; Fukumoto, M.; Galazka, K.; Galdon, J.; Garcia, J.; Garcia-Muñoz, M.; Garzotti, L.; Gil, C.; Gleason-Gonzalez, C.; Goodman, T.; Granucci, G.; Hayashi, N.; Hoshino, K.; Ide, S.; Imazawa, R.; Innocente, P.; Isayama, A.; Itami, K.; Joffrin, E.; Kamada, Y.; Kamiya, K.; Kawano, Y.; Kawashima, H.; Kobayashi, T.; Kojima, A.; Kubo, H.; Lang, P.; Lauber, Ph.; de la Luna, E.; Maget, P.; Marchiori, G.; Mastrostefano, S.; Matsunaga, G.; Mattei, M.; McDonald, D. C.; Mele, A.; Miyata, Y.; Moriyama, S.; Moro, A.; Nakano, T.; Neu, R.; Nowak, S.; Orsitto, F. P.; Pautasso, G.; Pégourié, B.; Pigatto, L.; Pironti, A.; Platania, P.; Pokol, G. I.; Ricci, D.; Romanelli, M.; Saarelma, S.; Sakurai, S.; Sartori, F.; Sasao, H.; Scannapiego, M.; Shimizu, K.; Shinohara, K.; Shiraishi, J.; Soare, S.; Sozzi, C.; Stępniewski, W.; Suzuki, T.; Suzuki, Y.; Szepesi, T.; Takechi, M.; Tanaka, K.; Terranova, D.; Toma, M.; Urano, H.; Vega, J.; Villone, F.; Vitale, V.; Wakatsuki, T.; Wischmeier, M.; Zagórski, R.

    2017-08-01

    The JT-60SA tokamak, being built under the Broader Approach agreement jointly by Europe and Japan, is due to start operation in 2020 and is expected to give substantial contributions to both ITER and DEMO scenario optimisation. A broad set of preparation activities for an efficient start of the experiments on JT-60SA is being carried out, involving elaboration of the Research Plan, advanced modelling in various domains, feasibility and conception studies of diagnostics and other sub-systems in connection with the priorities of the scientific programme, development and validation of operation tools. The logic and coherence of this approach, as well as the most significant results of the main activities undertaken are presented and summarised.

  5. New Security and Justice Sector Partnership Models: Implications of the Arab Uprisings

    DTIC Science & Technology

    2014-01-01

    ... clear boiling point that even before the Arab uprisings erupted, Clinton warned regional regimes that they needed to change or risk “sinking into the ... but without any clear operational definition of security capacity, no consistent logic for allocating funds and determining appropriate expenditure ... reference point for gauging performance and determining whether and how program implementation needs to be altered. While this approach is most ...

  6. An Empirical Analysis of the Effectiveness of Design-Build Construction Contracts

    DTIC Science & Technology

    1993-08-01

    ... features and benefits with traditional design/bid/build. Several types of design/build organizations are examined, with their relative advantages and ... benefits transferable to government contracting? Perhaps the most logical approach would be to compare point by point the various advantages and ... consider to be critical to their competitive advantage in the market. This paper opens with a review of the development of design/build, then contrasts its ...

  7. A preliminary study of molecular dynamics on reconfigurable computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolinski, C.; Trouw, F. R.; Gokhale, M.

    2003-01-01

    In this paper we investigate the performance of platform FPGAs on a compute-intensive, floating-point-intensive supercomputing application, Molecular Dynamics (MD). MD is a popular simulation technique to track interacting particles through time by integrating their equations of motion. One part of the MD algorithm was implemented using the Fabric Generator (FG) [11] and mapped onto several reconfigurable logic arrays. FG is a Java-based toolset that greatly accelerates construction of the fabrics from an abstract, technology-independent representation. Our experiments used technology-independent IEEE 32-bit floating point operators so that the design could be easily re-targeted. Experiments were performed using both non-pipelined and pipelined floating point modules. We present results for the Altera Excalibur ARM System on a Programmable Chip (SoPC), the Altera Stratix EP1S80, and the Xilinx Virtex-II Pro 2VP50. The best results obtained were 5.69 GFlops at 80 MHz (Altera Stratix EP1S80) and 4.47 GFlops at 82 MHz (Xilinx Virtex-II Pro 2VP50). Assuming a 10 W power budget, these results compare very favorably to a 4 GFlops/40 W processing/power rate for a modern Pentium, suggesting that reconfigurable logic can achieve high performance at low power on floating-point-intensive applications.
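    The MD loop the abstract refers to, integrating the particles' equations of motion, is commonly realized with the velocity-Verlet integrator. The following is an illustrative software sketch of one such integration step (not the paper's FPGA pipeline), using plain Python lists for one-dimensional per-particle quantities:

    ```python
    def velocity_verlet(pos, vel, force, mass, dt, force_fn):
        """One velocity-Verlet step for particles at positions pos."""
        acc = [f / mass for f in force]
        # Advance positions using current velocities and accelerations.
        new_pos = [x + v * dt + 0.5 * a * dt * dt
                   for x, v, a in zip(pos, vel, acc)]
        # Recompute forces at the new positions.
        new_force = force_fn(new_pos)
        new_acc = [f / mass for f in new_force]
        # Advance velocities using the average of old and new accelerations.
        new_vel = [v + 0.5 * (a + na) * dt
                   for v, a, na in zip(vel, acc, new_acc)]
        return new_pos, new_vel, new_force
    ```

    The force evaluation `force_fn` dominates the cost in real MD codes, which is why it is the natural candidate for the floating-point FPGA fabric described above.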

  8. Physics Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1978

    1978-01-01

    Discusses some high school physics demonstrations and experiments on a variety of topics, such as uses of the dipole, the vapour-liquid critical point, the velocity of sound in metallic rods, the flux density near a bar magnet, and a different style of logic gate using basic units. (GA)

  9. Impacts of visuomotor sequence learning methods on speed and accuracy: Starting over from the beginning or from the point of error.

    PubMed

    Tanaka, Kanji; Watanabe, Katsumi

    2016-02-01

    The present study examined whether sequence learning leads to more accurate and faster performance if people learning a sequence start over from the beginning when they make an error (i.e., practice the whole sequence) or only from the point of error (i.e., practice part of the sequence). We used a visuomotor sequence learning paradigm with a trial-and-error procedure. In Experiment 1, we found fewer errors and shorter performance times for those who restarted their performance from the beginning of the sequence, as compared to those who restarted from the point at which an error occurred, indicating better learning of spatial and motor representations of the sequence. This might be because the learned elements were repeated whenever the next performance started over from the beginning. In subsequent experiments, we increased the occasions for repetition of learned elements by modulating the number of fresh start points in the sequence after errors. The results showed that fewer fresh start points led to fewer errors and shorter performance times, indicating that the repetition of learned elements enabled participants to develop stronger spatial and motor representations of the sequence. Thus, one or two fresh start points in the sequence (i.e., starting over only from the beginning, or from the beginning or midpoint of the sequence after errors) is likely to lead to more accurate and faster performance. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. IPACS Electronics: Comments on the Original Design and Current Efforts at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Gowdey, J. C.

    1983-01-01

    The development of the integrated power and attitude control system (IPACS) is described. The power bridge was fabricated, and all major parts are in hand. The bridge was tested with a 1/4 HP motor for another program. The PWM, control logic, and upper bridge driver power supply are breadboarded and debugged prior to the start of testing on a passive load. The Hall sensor circuit for detecting rotor position is in design.

  11. Direct Digital Control of HVAC (Heating, Ventilating, and Air Conditioning Equipment (User’s Guide)

    DTIC Science & Technology

    1985-01-01

    ... reset, load shedding, chiller optimization, VAV fan synchronization, and optimum start/stop. The prospective buyer of a DDC system should investigate ... current and accurate drawings for a conventional, built-up control system such as that illustrated in Figure 4. Data on setpoints, reset schedules, and ... are always available in the form of the computer program code (Figure 7). In addition to the control logic, setpoint and other data are readily ...

  12. Flexible electronics enters the e-reader market

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2010-02-01

    A company that was spun off from the physics department at the University of Cambridge in the UK 10 years ago released its first product last month. Plastic Logic, founded by Henning Sirringhaus and Richard Friend, launched an electronic reader that can display books, magazines and newspapers on a flexible, lightweight plastic display. The reader commercializes pioneering work first started over 20 years ago at the lab by the two physicists, who are based in the department's optoelectronics group.

  13. Using Fuzzy Logic for Performance Evaluation in Reinforcement Learning

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Khedkar, Pratap S.

    1992-01-01

    Current reinforcement learning algorithms require long training periods, which generally limits their applicability to small problems. A new architecture is described which uses fuzzy rules to initialize its two neural networks: a neural network for performance evaluation and another for action selection. This architecture is applied to the control of dynamic systems, and it is demonstrated that it is possible to start with approximate prior knowledge and learn to refine it through experiments using reinforcement learning.

  14. Strategies for Effective Eating Development-SEEDS: Design of an Obesity Prevention Program to Promote Healthy Food Preferences and Eating Self-Regulation in Children From Low-Income Families.

    PubMed

    Hughes, Sheryl O; Power, Thomas G; Beck, Ashley; Betz, Drew; Calodich, Shirley; Goodell, L Suzanne; Hill, Laura G; Hill, Rachael; Jaramillo, J Andrea; Johnson, Susan L; Lanigan, Jane; Lawrence, Adair; Martinez, AnaMaria Diaz; Nesbitt, Merrianneeta; Overath, Irene; Parker, Louise; Ullrich-French, Sarah

    2016-06-01

    To develop a scientifically based childhood obesity prevention program supporting child eating self-regulation and taste preferences. This article describes the research methods for the Strategies for Effective Eating Development program. A logic model is provided that depicts a visual presentation of the activities that will be used to guide the development of the prevention program. Randomized, controlled prevention program, pretest, posttest, 6 months, and 12 months. Two sites: Houston, TX and Pasco, WA. Each trial will last 7 weeks with 8-10 mother-child dyads in each arm (prevention and control). Recruitment at Head Start districts (Texas; n = 160) and Inspire Child Development Center including Early Childhood Education and Head Start (Washington; n = 160). Sixteen trials with 16-20 parent-child dyads per trial will provide adequate power to detect moderate effects. Multicomponent family-based prevention program incorporating a dialogue approach to adult learning and self-determination theory. Child assessments will include observed taste preferences, caloric compensation, and eating in the absence of hunger. Parent assessments will include parent-reported feeding, feeding emotions, acculturation, child eating behaviors, child food preferences, and child dietary intake. Heights and weights will be measured for parent and child. A multilevel growth modeling analysis will be employed to consider the nested nature of the data: time points (level 1) within families (level 2) within trials (level 3). Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  15. Threats: power, family mealtimes, and social influence.

    PubMed

    Hepburn, Alexa; Potter, Jonathan

    2011-03-01

    One of the most basic topics in social psychology is the way one agent influences the behaviour of another. This paper will focus on threats, which are an intensified form of attempted behavioural influence. Despite their centrality to the project of social psychology, little attention has been paid to threats. This paper will start to rectify this oversight. It reviews early examples of the way social psychology handles threats and highlights key limitations and presuppositions about the nature and role of threats. By contrast, we subject them to a programme of empirical research. Data comprise video records of a collection of family mealtimes that include preschool children. Threats are recurrent in this material. A preliminary conceptualization of features of candidate threats from this corpus will be used as an analytic starting point. A series of examples is used to explicate basic features and dimensions that build the action of threatening. The basic structure of the threats uses a conditional logic: if the recipient continues the problem action, or does not initiate the required action, then negative consequences will be produced by the speaker. Further analysis clarifies how threats differ from warnings and admonishments. Sequential analysis suggests threats set up basic response options of compliance or defiance. However, recipients of threats can evade these options by, for example, reworking the unpleasant upshot specified in the threat, or producing barely minimal compliance. The implications for broader social psychological concerns are explored in a discussion of power, resistance, and asymmetry; the paper ends by reconsidering the way social influence can be studied in social psychology. ©2010 The British Psychological Society.

  16. Taylorism and the Logic of Learning Outcomes

    ERIC Educational Resources Information Center

    Stoller, Aaron

    2015-01-01

    This essay examines the shared philosophical foundations of Fredrick W. Taylor's scientific management principles and the contemporary learning outcomes movement (LOM). It analyses the shared philosophical ground between the focal point of Taylor's system--"the task"--and the conceptualization and deployment of "learning…

  17. A Numerical Optimization Approach for Tuning Fuzzy Logic Controllers

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Garg, Devendra P.

    1998-01-01

    This paper develops a method to tune fuzzy controllers using numerical optimization. The main attribute of this approach is that it allows fuzzy logic controllers to be tuned to achieve global performance requirements. Furthermore, this approach allows design constraints to be implemented during the tuning process. The method tunes the controller by parameterizing the membership functions for error, change-in-error and control output. The resulting parameters form a design vector which is iteratively changed to minimize an objective function. The minimal objective function results in an optimal performance of the system. A spacecraft mounted science instrument line-of-sight pointing control is used to demonstrate results.
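    The tuning loop described above can be sketched in a few lines. Everything concrete here is an illustrative assumption rather than the paper's actual design: a single-input controller, triangular membership functions, a proportional reference response, and a grid search standing in for the numerical optimizer:

```python
def tri(x, center, width):
    """Triangular membership function centered at `center` (assumed shape)."""
    return max(0.0, 1.0 - abs(x - center) / width)

def controller_output(error, width):
    """Toy one-input fuzzy controller: three rules (NEG, ZERO, POS) with
    centroid defuzzification over singleton outputs."""
    rules = [(-1.0, -1.0), (0.0, 0.0), (1.0, 1.0)]  # (input center, output singleton)
    num = sum(tri(error, c, width) * out for c, out in rules)
    den = sum(tri(error, c, _out := width) if False else tri(error, c, width) for c, _ in rules)
    return num / den if den else 0.0

def objective(width, samples):
    """Squared tracking error against a proportional reference u = e."""
    return sum((controller_output(e, width) - e) ** 2 for e in samples)

samples = [i / 10 for i in range(-10, 11)]
# The membership width plays the role of the design vector; iterate over
# candidate values to minimize the objective.
best_cost, best_width = min((objective(w, samples), w) for w in [0.5, 1.0, 1.5, 2.0])
```

    Replacing the grid search with a constrained gradient-based optimizer over all membership parameters (error, change-in-error, and output) gives the flavor of the method in the paper.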

  18. A Spiking Neural Network Based Cortex-Like Mechanism and Application to Facial Expression Recognition

    PubMed Central

    Fu, Si-Yao; Yang, Guo-Sheng; Kuai, Xin-Kai

    2012-01-01

    In this paper, we present a quantitative, highly structured cortex-simulated model, which can be simply described as a feedforward, hierarchical simulation of the ventral stream of the visual cortex using a biologically plausible, computationally convenient spiking neural network system. The motivation comes directly from recent pioneering works on detailed functional decomposition analysis of the feedforward pathway of the ventral stream of the visual cortex and developments on artificial spiking neural networks (SNNs). By combining the logical structure of the cortical hierarchy and the computing power of the spiking neuron model, a practical framework has been presented. As a proof of principle, we demonstrate our system on several facial expression recognition tasks. The proposed cortical-like feedforward hierarchy framework has the merit of dealing with complicated pattern recognition problems. This suggests that, by combining cognitive models with modern neurocomputational approaches, the neurosystematic approach to the study of cortex-like mechanisms has the potential to extend our knowledge of the brain mechanisms underlying cognitive analysis and to advance theoretical models of how we recognize faces or, more specifically, perceive other people's facial expressions in a rich, dynamic, and complex environment, providing a new starting point for improved models of the visual cortex-like mechanism. PMID:23193391

  19. Evolution of Government and Industrial Partnerships to Open the Space Frontier

    NASA Technical Reports Server (NTRS)

    Martin, Gary L.

    2008-01-01

    If the logical extension of the current exploration program is to develop self-sustaining settlements on the Moon and Mars over the next few centuries, then there is a path that takes civilization from its current one-planet existence to a multi-world future. By considering the far-term goal of space settlements as a desired endpoint and using the current state as a starting point, the policy drivers and potential pathways to the goal of sustainable space settlements can be explored. This paper describes a three-phased evolution of government and industrial partnerships from current-day relationships to a time when there are sustainable settlements in space. Phase I details the current state of government-led exploration, while Phase III describes a desired endpoint of self-sufficient settlements in space. Phase II is an important transition phase, which acts as a bridge between now and the future. This paper discusses the critical evolution that must take place in two key areas to ensure a thriving future in space: space transportation and the right to use space property and resources. This paper focuses on the enabling role of government necessary to achieve United States (U.S.) goals for space exploration and open the frontier.

  20. A spiking neural network based cortex-like mechanism and application to facial expression recognition.

    PubMed

    Fu, Si-Yao; Yang, Guo-Sheng; Kuai, Xin-Kai

    2012-01-01

    In this paper, we present a quantitative, highly structured cortex-simulated model, which can be simply described as a feedforward, hierarchical simulation of the ventral stream of the visual cortex using a biologically plausible, computationally convenient spiking neural network system. The motivation comes directly from recent pioneering works on detailed functional decomposition analysis of the feedforward pathway of the ventral stream of the visual cortex and developments on artificial spiking neural networks (SNNs). By combining the logical structure of the cortical hierarchy and the computing power of the spiking neuron model, a practical framework has been presented. As a proof of principle, we demonstrate our system on several facial expression recognition tasks. The proposed cortical-like feedforward hierarchy framework has the merit of dealing with complicated pattern recognition problems. This suggests that, by combining cognitive models with modern neurocomputational approaches, the neurosystematic approach to the study of cortex-like mechanisms has the potential to extend our knowledge of the brain mechanisms underlying cognitive analysis and to advance theoretical models of how we recognize faces or, more specifically, perceive other people's facial expressions in a rich, dynamic, and complex environment, providing a new starting point for improved models of the visual cortex-like mechanism.

  1. The Quantification of Consistent Subjective Logic Tree Branch Weights for PSHA

    NASA Astrophysics Data System (ADS)

    Runge, A. K.; Scherbaum, F.

    2012-04-01

    The development of quantitative models for the rate of exceedance of seismically generated ground motion parameters is the target of probabilistic seismic hazard analysis (PSHA). In regions of low to moderate seismicity, the selection and evaluation of source- and/or ground-motion models is often a major challenge to hazard analysts and affected by large epistemic uncertainties. In PSHA, this type of uncertainty is commonly treated within a logic tree framework in which the branch weights express the degree-of-belief values of an expert in the corresponding set of models. For the calculation of the distribution of hazard curves, these branch weights are subsequently used as subjective probabilities. However, the quality of the results depends strongly on the "quality" of the expert knowledge. A major challenge for experts in this context is to provide weight estimates which are logically consistent (in the sense of Kolmogorov's axioms) and to be aware of, and deal with, the multitude of heuristics and biases which affect human judgment under uncertainty. For example, people tend to give smaller weights to each branch of a logic tree the more branches it has, starting with equal weights for all branches and then adjusting this uniform distribution based on their beliefs about how the branches differ. This effect is known as pruning bias.¹ A similar unwanted effect, which may even wrongly suggest robustness of the corresponding hazard estimates, will appear in cases where all models are first judged according to some numerical quality measure and the resulting weights are subsequently normalized to sum up to one.² To address these problems, we have developed interactive graphical tools for the determination of logic tree branch weights in the form of logically consistent subjective probabilities, based on the concepts suggested in Curtis and Wood (2004).³ Instead of determining the set of weights for all the models in a single step, the computer-driven elicitation process is performed as a sequence of evaluations of relative weights for small subsets of models which are presented to the analyst. From these, the distribution of logic tree weights for the whole model set is determined as the solution of an optimization problem. The model subset presented to the analyst in each step is designed to maximize the expected information. The result of this process is a set of logically consistent weights together with a measure of confidence determined from the amount of conflicting information provided by the expert during the relative weighting process.
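    The idea of turning relative judgments into normalized, consistent weights can be sketched with a geometric-mean (log least-squares) estimate over a pairwise ratio matrix. This is a simplified stand-in for the optimization-based elicitation the abstract describes, and the judgment values below are invented for illustration:

```python
import math

def consistent_weights(ratio, n):
    """Normalized weight vector from pairwise judgments ratio[i][j] ~ w_i / w_j,
    via the geometric-mean (log least-squares) estimate."""
    logs = [sum(math.log(ratio[i][j]) for j in range(n)) / n for i in range(n)]
    raw = [math.exp(v) for v in logs]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical analyst judgments: model 1 is twice as credible as model 0,
# and model 2 is as credible as model 0.
R = [[1.0, 0.5, 1.0],
     [2.0, 1.0, 2.0],
     [1.0, 0.5, 1.0]]
w = consistent_weights(R, 3)
# The weights sum to one by construction, so they can serve as
# logically consistent subjective probabilities for the logic tree branches.
```

    Inconsistency between the elicited ratios and the fitted weights would show up as log least-squares residuals, which is one way to quantify the "conflicting information" mentioned above.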

  2. Turning Points in Even Start Programs. Occasional Paper #4.

    ERIC Educational Resources Information Center

    Rasinski, Timothy; Padak, Nancy

    To investigate the initial experiences of the various Even Start programs, a project developed a survey that was sent to program coordinators in Ohio. It asked open-ended questions to get descriptions and perceptions of situations that preceded turning point events and the turning point events themselves. Data from eight programs highlighted their…

  3. Compton suppression and event triggering in a commercial data acquisition system

    NASA Astrophysics Data System (ADS)

    Tabor, Samuel; Caussyn, D. D.; Tripathi, Vandana; Vonmoss, J.; Liddick, S. N.

    2012-10-01

    A number of groups are starting to use flash digitizer systems to directly convert the preamplifier signals of high-resolution Ge detectors to a stream of digital data. Some digitizers are also equipped with software constant fraction discriminator algorithms capable of operating on the resulting digital data stream to provide timing information. Because of the dropping cost per channel of these systems, it should now be possible to also connect outputs of the Bismuth Germanate (BGO) scintillators used for Compton suppression to other digitizer inputs so that BGO logic signals can also be available in the same system. This provides the possibility to perform all the Compton suppression and multiplicity trigger logic within the digital system, thus eliminating the need for separate timing filter amplifiers (TFA), constant fraction discriminators (CFD), logic units, and lots of cables. This talk will describe the performance of such a system, based on Pixie16 modules from XIA LLC with custom field programmable gate array (FPGA) programming, for an array of Compton-suppressed single-crystal Ge and 4-crystal "Clover" detectors along with optional particle detectors. Initial tests of the system have produced results comparable with the current traditional system of individual electronics and peak-sensing analog-to-digital converters. The advantages of the all-digital system will be discussed.

  4. The Underrepresentation of African Americans in Army Combat Arms Branches

    DTIC Science & Technology

    2014-12-04

    a starting point for the Army to determine true causality. This monograph is simply reviewing data and identifying correlation, and based on...correlation, assigning causality based on historical information and scholarly literature. These potential causes are not fact, and provide a starting ...1988 is the starting point for the commissioning statistics. Subject matter experts hypothesized that the number African American officers

  5. Efficient Web Services Policy Combination

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh; Harman, Joseph G.

    2010-01-01

    Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions do not have the ability to communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies to bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults where one organization proposes a rule on a particular feature, and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have inherent inefficiency, which heavily restricts their practical applications. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows an automatic and scalable composition of security policies between multiple organizations. It is based on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is when policies are represented in defeasible logic, and composition is based on rules for non-monotonic inference. In this system, policy writers construct meta-policies describing both the policy that they wish to enforce and annotations describing their composition preferences. These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take precedence. Meta-policies are specified in defeasible logic, a computationally efficient non-monotonic logic developed to model human reasoning. One drawback of this method is that at one point the algorithm starts an exhaustive search of all subsets of the set of conclusions of a defeasible theory. Although propositional defeasible logic has linear complexity, the set of conclusions here may be large, especially in real-life practical cases. This phenomenon leads to an inefficient exponential explosion of complexity. The current process of getting a Web security policy from the combination of two meta-policies consists of two steps. The first is generating a new meta-policy that is a composition of the input meta-policies, and the second is mapping the meta-policy onto a security policy. The new algorithm avoids the exhaustive search in the current algorithm, and provides a security policy that matches all requirements of the involved meta-policies.
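    The priority idea behind this kind of composition can be reduced to a small sketch. The rule format, feature names, and organizations below are invented for illustration; real defeasible logic additionally handles defeaters and non-monotonic inference, which this omits:

```python
def combine(policies):
    """Priority-based rule combination: for each feature, keep the assertion
    with the highest priority; rules with no opposing assertion pass through.
    A simplified stand-in for defeasible policy composition."""
    decided = {}
    for rules in policies:
        for feature, (value, priority) in rules.items():
            if feature not in decided or priority > decided[feature][1]:
                decided[feature] = (value, priority)
    return {feature: value for feature, (value, _) in decided.items()}

# Hypothetical meta-policies from two organizations: each maps a feature
# to (asserted value, priority of the assertion).
org_a = {"encryption": ("AES-256", 2), "logging": ("on", 1)}
org_b = {"encryption": ("AES-128", 1), "audit": ("weekly", 1)}
policy = combine([org_a, org_b])
# encryption resolves to the higher-priority assertion (AES-256);
# logging and audit are unopposed and carry over unchanged.
```

    The hard part the abstract addresses is doing this soundly when assertions interact non-monotonically, rather than feature-by-feature as in this sketch.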

  6. Knowing Who Your Friends Are: Aspects of the Politics of Logical Empiricism

    ERIC Educational Resources Information Center

    Uebel, Thomas

    2009-01-01

    This paper comments on Reisch's book "How the Cold War Transformed Philosophy of Science." Overall supportive of Reisch's project and perspective, it raises certain points where the data appear inconclusive and either provides additional support or briefly explores some interpretative alternatives.

  7. Reconciling Pairs of Concurrently Used Clinical Practice Guidelines Using Constraint Logic Programming

    PubMed Central

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken

    2011-01-01

    This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack. PMID:22195153

  8. Performing Testicular Self-Examination, Driving Automobiles, and Anxiety: What Is the Logical Link?

    PubMed

    Rovito, Michael J

    2018-05-01

    The debate of whether testicular self-examination (TSE) should be promoted among males generally centers on a harm-benefit corollary. The benefits of TSE include improving health outcomes, inclusive of an increase in both quality of life and knowledge/awareness of potential health concerns, as well as promoting proactivity in achieving wellness. The harms include claims that false-positive results can increase anxiety and produce costs via unnecessary treatments and therapies. Further claims point to the lack of evidence suggesting TSE decreases testicular cancer mortality. This commentary primarily discusses the anxiety portion of this debate from a logic-based perspective. The argument that TSE should not be promoted among males due to the risk of inciting false-positive anxiety appears to be flawed. A 5-point perspective is presented on why discouraging TSE over theorized levels of false-positive anxiety is illogical, particularly when existing evidence suggests that late-stage testicular cancer itself is associated with anxiety and depression.

  9. Apparatus and method for implementing power saving techniques when processing floating point values

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Young Moon; Park, Sang Phill

    An apparatus and method are described for reducing power when reading and writing graphics data. For example, one embodiment of an apparatus comprises: a graphics processor unit (GPU) to process graphics data including floating point data; a set of registers, at least one of the registers of the set partitioned to store the floating point data; and encode/decode logic to reduce a number of binary 1 values being read from the at least one register by causing a specified set of bit positions within the floating point data to be read out as 0s rather than 1s.
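    A software analogue of the read-out trick (causing specified bit positions to be read as 0s rather than 1s) is to clear low mantissa bits of a float32 value. The `keep` parameter and the choice of bit positions below are assumptions for illustration, not the patented encode/decode logic itself:

```python
import struct

def mask_low_mantissa(x, keep=16):
    """Zero out the low (23 - keep) mantissa bits of a float32 value, so
    those bit positions always read out as 0s. `keep` is an assumed
    precision knob; IEEE 754 single precision has a 23-bit mantissa."""
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    mask = ~((1 << (23 - keep)) - 1) & 0xFFFFFFFF
    return struct.unpack("<f", struct.pack("<I", bits & mask))[0]

y = mask_low_mantissa(3.14159265, keep=16)
# y approximates pi to ~16 mantissa bits; the cleared positions hold no 1s.
```

    Fewer stored 1s means fewer bit lines driven high on a register read, at the cost of a bounded, controllable precision loss.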

  10. High resolution time interval meter

    DOEpatents

    Martin, A.D.

    1986-05-09

    Method and apparatus are provided for measuring the time interval between two events to a higher resolution than is reliably available from conventional circuits and components. An internal clock pulse is provided at a frequency compatible with conventional component operating frequencies for reliable operation. Lumped-constant delay circuits are provided for generating outputs at delay intervals corresponding to the desired high resolution. An initiation START pulse is input to generate first high resolution data. A termination STOP pulse is input to generate second high resolution data. Internal counters count at the low-frequency internal clock pulse rate between the START and STOP pulses. The first and second high resolution data are logically combined to directly provide high resolution data to one counter and correct the count in the low resolution counter to obtain a high resolution time interval measurement.
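    The arithmetic of combining a low-frequency coarse count with delay-line interpolation can be illustrated in a few lines. The 10 MHz clock, 5 ns tap spacing, and the sign convention for the fine terms are invented example choices, not the patent's actual parameters:

```python
def measure_interval(coarse_counts, fine_start, fine_stop,
                     clock_ns=100.0, tap_ns=5.0):
    """High-resolution interval from a coarse clock counter plus delay-line
    tap indices latched at the START and STOP pulses."""
    return coarse_counts * clock_ns + (fine_start - fine_stop) * tap_ns

# START latched 3 taps after a clock edge, STOP 1 tap after,
# with 7 coarse clock periods counted in between.
t = measure_interval(7, fine_start=3, fine_stop=1)  # 710.0 ns
```

    The fine terms correct the coarse count, giving tap-level (5 ns) resolution from a 100 ns clock, which is the essence of the scheme described above.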

  11. Static Frequency Converter System Installed and Tested

    NASA Technical Reports Server (NTRS)

    Brown, Donald P.; Sadhukhan, Debashis

    2003-01-01

    A new Static Frequency Converter (SFC) system has been installed and tested at the NASA Glenn Research Center's Central Air Equipment Building to provide consistent, reduced motor start times and improved reliability for the building's 14 large exhausters and compressors. The operational start times have been consistent around 2 min, 20 s per machine. This is at least a 3-min improvement (per machine) over the old variable-frequency motor generator sets. The SFC was designed and built by Asea Brown Boveri (ABB) and installed by Encompass Design Group (EDG) as part of a Construction of Facilities project managed by Glenn (Robert Scheidegger, project manager). The authors designed the Central Process Distributed Control Systems interface and control between the programmable logic controller, solid-state exciter, and switchgear, which was constructed by Gilcrest Electric.

  12. Size Determination of Y2O3 Crystallites in MgO Composite Using Mie Scattering

    DTIC Science & Technology

    2017-11-07

    particle size, and the path length through the material to generate an expected light transmission spectrum. These calculated curves were compared to...materials. In the current work, light transmission data are compared to the theoretical curves generated by the Mie scattering model in an attempt to...Since the authors wanted to compare the model's predictions to the experimental %T values, it seemed logical to start with Beer's Law.

  13. Electronic computers and telephone exchanges

    NASA Astrophysics Data System (ADS)

    Flowers, T. H.

    1980-01-01

    A retrospective on the telephone, with emphasis on development of digital methods, is presented. Starting with its invention in 1876, major breakthroughs in transmission and switching circuitry are reviewed. The thermionic valve (1917), the Eccles-Jordan trigger circuit (1921), copper oxide rectifiers (1920's), and the gas-tube binary counter (1931) are highlighted. The evolution of logic design in telephone exchanges and the interaction this had with electronic computers is then traced up to the appearance of COLOSSUS, a specialized electronic computer used for cryptanalysis (1943).

  14. A Template Engine for Parsing Objects from Textual Representations

    NASA Astrophysics Data System (ADS)

    Rajković, Milan; Stanković, Milena; Marković, Ivica

    2011-09-01

    Template engines are widely used for separation of business and presentation logic. They are commonly used in web applications for clean rendering of HTML pages. Another area of usage is message formatting in distributed applications where they transform objects to appropriate representations. This paper explores the possibility of using templates for a reverse process—for creating objects starting from their representations. We present the prototype of engine that we have developed, and describe benefits and drawbacks of this approach.
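    The reverse process the paper explores can be sketched with regular expressions: each placeholder in a template becomes a named capture group, and matching the rendered text recovers the object's fields. The `{name}` placeholder syntax below is an assumption for illustration, not necessarily the engine's own:

```python
import re

def parse(template, text):
    """Reverse-apply a template: split on '{name}' placeholders, turn each
    into a named regex group, and extract a dict from the rendered text."""
    parts = re.split(r"\{(\w+)\}", template)  # alternating literal, name, ...
    pattern = "".join(
        f"(?P<{part}>.+?)" if i % 2 else re.escape(part)
        for i, part in enumerate(parts)
    )
    m = re.fullmatch(pattern, text)
    return m.groupdict() if m else None

obj = parse("Order {id} shipped to {city}", "Order 42 shipped to Oslo")
# obj == {'id': '42', 'city': 'Oslo'}
```

    This works for unambiguous templates; a real reverse engine must also cope with loops, conditionals, and adjacent placeholders whose boundaries are ambiguous, which is where the benefits and drawbacks discussed in the paper arise.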

  15. Brainstorming Themes that Connect Art and Ideas across the Curriculum

    ERIC Educational Resources Information Center

    Walling, Donovan R.

    2006-01-01

    Ideas are starting points: for thought, discussion, reading, viewing, writing, and making. The two "brainstorms on paper" presented in this article illustrate how taking an idea and examining it from an artistic point of view can generate thematic starting points to help teachers and students connect the visual arts to ideas that ripple across the…

  16. A label-free and enzyme-free platform with a visible output for constructing versatile logic gates using caged G-quadruplex as the signal transducer.

    PubMed

    Chen, Junhua; Pan, Jiafeng; Chen, Shu

    2018-01-14

    A complete set of binary basic logic gates (OR, AND, NOR, NAND, INHIBIT, IMPLICATION, XOR and XNOR) is realized on a label-free and enzyme-free sensing platform using caged G-quadruplex as the signal transducer. In the presence of an appropriate input, the temporarily blocked G-rich sequence in the hairpin DNA is released through cleavage by the synergetically-stabilized Mg²⁺-dependent DNAzyme, which can be made to function via the input-guided cooperative conjunction of the DNAzyme subunits. In the presence of hemin, the unblocked G-quadruplex DNAzyme catalyzes the oxidation of 3,3',5,5'-tetramethylbenzidine (TMB) by H₂O₂ to generate a colored readout signal which can be readily distinguished by the naked eye. This strategy is quite versatile and straightforward for logic operations. Two combinatorial gates (XOR + AND and XOR + NOR) are also successfully fabricated to demonstrate the modularity and scalability of the computing elements. The distinctive advantage of this logic system is that molecular events in aqueous solution can be translated into a color change which can be directly observed by the naked eye without resorting to any analytical instrumentation. Moreover, this work reveals a new route for the design of molecular logic gates that can be executed without any labeling and immobilization procedure or separation and washing step, which holds great promise for intelligent point-of-care diagnostics and in-field applications.
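    Abstracting the chemistry away, the gate set realized on this platform corresponds to the standard two-input Boolean functions, with a colored (naked-eye) output standing for logical 1. A truth-table sketch of that correspondence, purely for orientation:

```python
# Boolean abstractions of the reported gates; True models the colored
# readout (G-quadruplex released and TMB oxidized), False a blank output.
GATES = {
    "OR":          lambda a, b: a or b,
    "AND":         lambda a, b: a and b,
    "NOR":         lambda a, b: not (a or b),
    "NAND":        lambda a, b: not (a and b),
    "INHIBIT":     lambda a, b: a and not b,
    "IMPLICATION": lambda a, b: (not a) or b,
    "XOR":         lambda a, b: a != b,
    "XNOR":        lambda a, b: a == b,
}

def truth_table(gate):
    """Enumerate the gate's output over all four input combinations."""
    return {(a, b): bool(GATES[gate](a, b)) for a in (0, 1) for b in (0, 1)}
```

    The paper's combinatorial gates (e.g. XOR + AND) correspond to evaluating two of these functions on the same input pair.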

  17. Multi-layered reasoning by means of conceptual fuzzy sets

    NASA Technical Reports Server (NTRS)

    Takagi, Tomohiro; Imura, Atsushi; Ushida, Hirohide; Yamaguchi, Toru

    1993-01-01

    The real world consists of a very large number of instances of events and continuous numeric values. On the other hand, people represent and process their knowledge in terms of abstracted concepts derived from generalization of these instances and numeric values. Logic-based paradigms for knowledge representation use symbolic processing both for concept representation and inference. Their underlying assumption is that a concept can be defined precisely. However, as this assumption hardly holds for natural concepts, it follows that symbolic processing cannot deal with such concepts. Thus symbolic processing has essential problems from a practical point of view of applications in the real world. In contrast, fuzzy set theory can be viewed as a stronger and more practical notation than formal, logic-based theories because it supports both symbolic processing and numeric processing, connecting the logic-based world and the real world. In this paper, we propose multi-layered reasoning by using conceptual fuzzy sets (CFS). The general characteristics of CFS are discussed along with upper layer supervision and context dependent processing.

  18. Floating-Point Numerical Function Generators Using EVMDDs for Monotone Elementary Functions

    DTIC Science & Technology

    2009-01-01

    Villa, R. K. Brayton, and A. L. Sangiovanni-Vincentelli, “Multi-valued decision diagrams: Theory and applications,” Multiple-Valued Logic: An...Shmerko, and R. S. Stankovic, Decision Diagram Techniques for Micro- and Nanoelectronic Design, CRC Press, Taylor & Francis Group, 2006. Appendix

  19. Aspects and the Overlap Function.

    ERIC Educational Resources Information Center

    Levine, Marilyn M.; Levine, Leonard P.

    1984-01-01

    Presents system for automatic handling of ordered sets, states based on these sets, and differing points of view regarding Universe of Discourse. Aspects are represented by new logical "overlap" function with examples taken from Ranganathan's horse and carriage parable and several books involving four main concepts (history, geography,…

  20. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Transport Protocol (Transmission Control Protocol/User Datagram Protocol [TCP/UDP]) Analysis

    DTIC Science & Technology

    2015-09-01

    the network Mac8 Medium Access Control (Mac) (Ethernet) address observed as destination for outgoing packets subsessionid8 Zero-based index of...15. SUBJECT TERMS tactical networks, data reduction, high-performance computing, data analysis, big data 16. SECURITY CLASSIFICATION OF: 17...Integer index of row cts_deid Device (instrument) Identifier where observation took place cts_collpt Collection point or logical observation point on

  1. Toward a formal verification of a floating-point coprocessor and its composition with a central processing unit

    NASA Technical Reports Server (NTRS)

    Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.

    1991-01-01

    Discussed here is work to formally specify and verify a floating point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit used to communicate with the CPU and the arithmetic processing unit used to perform the actual calculation. Reasoning about the interaction and synchronization among processes using higher order logic is demonstrated.

  2. Extended- and Point-Source Radiometric Program

    DTIC Science & Technology

    1962-08-08

    aircraft of the U.S. Geological Survey (USGS). Because many sites involved in nuclear activities exist and more are coming into existence, the need of...GZ in Fig. 1.3 was the Ground Zero point of an old nuclear detonation and, unfortunately, was still highly radioactive. The detail of the source...measurements are the most dependable since the instrument was calibrated with Cs-137, Co-60, and radium at a distance that gave a scattering component

  3. Logic of discovery or psychology of invention?

    NASA Astrophysics Data System (ADS)

    Woodward, James F.

    1992-02-01

    It is noted that Popper separates the creation of concepts, conjectures, hypotheses and theories—the context of invention—from the testing thereof—the context of justification—arguing that only the latter is susceptible of rigorous logical analysis. Efforts on the part of others to shift or eradicate the demarcation established by this distinction are discussed and the relationship of these considerations to the claims of “strong artificial intelligence” is pointed out. It is argued that the mode of education of scientists, as well as reports of celebrated scientists, support Popper's judgement in this matter. An historical episode from Faraday's later career is used to illustrate the historiographical strength of Lakatos' “methodology of research programs.”

  4. Average output polarization dataset for signifying the temperature influence for QCA designed reversible logic circuits.

    PubMed

    Abdullah-Al-Shafi, Md; Bahar, Ali Newaz; Bhuiyan, Mohammad Maksudur Rahman; Shamim, S M; Ahmed, Kawser

    2018-08-01

    Quantum-dot cellular automata (QCA) is a promising nanotechnology contender with the potential to substitute complementary metal-oxide-semiconductor (CMOS) because of superior features such as extremely high device density and minimal power dissipation with rapid operating speed. In this study, the dataset of average output polarization (AOP) for fundamental reversible logic circuits is organized as presented in (Abdullah-Al-Shafi and Bahar, 2017; Bahar et al., 2016; Abdullah-Al-Shafi et al., 2015; Abdullah-Al-Shafi, 2016) [1-4]. QCADesigner version 2.0.3 has been utilized to survey the AOP of reversible circuits at separate temperature points in Kelvin (K).

  5. The Swarm Archiving Payload Data Facility, an Instance Configuration of the ESA Multi-Mission Facility

    NASA Astrophysics Data System (ADS)

    Pruin, B.; Martini, A.; Shanmugam, P.; Lopes, C.

    2015-04-01

    The Swarm mission consists of 3 satellites, each carrying an identical set of instruments. The scientific algorithms for processing are organized in 11 separate processing steps including automated product quality control. In total, the mission data consists of data products of several hundred distinct types from raw to level 2 product types and auxiliary data. The systematic production for Swarm within the ESA Archiving and Payload Data Facility (APDF) is performed up to level 2. The production up to L2 (CAT2-mature algorithm) is performed completely within the APDF. A separate systematic production chain from L1B to L2 (CAT1-evolving algorithm) is performed by an external facility (L2PS) with output files archived within the APDF as well. The APDF also performs re-processing exercises. Re-processing may start directly from the acquired data or from any other intermediate level resulting in the need for a refined product version and baseline management. Storage, dissemination and circulation functionality is configurable in the ESA generic multi-mission elements and does not require any software coding. The control of the production is more involved. While the interface towards the algorithmic entities is standardized due to the introduction of a generic IPF interface by ESA, the orchestration of the individual IPFs into the overall workflows is distinctly mission-specific and not as amenable to standardization. The ESA MMFI production management system provides extension points to integrate additional logical elements for the build-up of complex orchestrated workflows. These extension points have been used to inject the Swarm-specific production logic into the system. A noteworthy fact about the APDF is that the dissemination elements are hosted in a high bandwidth infrastructure procured as a managed service, thus affording users a considerable access bandwidth. This paper gives an overview of the Swarm APDF data flows. 
It describes the elements of the solution with particular focus on how the available generic multi-mission functionality of the ESA MMFI was utilized and where there was a need to implement mission-specific extensions and plug-ins. The paper concludes with some statistics on the system output during commissioning and early operational phases as well as some general considerations on the utilization of a framework like the ESA MMFI, discussing benefits and pitfalls of the approach.

  6. Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology

    NASA Astrophysics Data System (ADS)

    Serinaldi, Francesco; Kilsby, Chris G.; Lombardo, Federico

    2018-01-01

    The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as 'deterministic components' or 'trends' even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of some null hypothesis significance tests (NHSTs) for slowly-varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, minima, etc.). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of NHST rationale, and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling.
We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of records. Moreover, even though adjusting procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while some others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.

  7. N7 logic via patterning using templated DSA: implementation aspects

    NASA Astrophysics Data System (ADS)

    Bekaert, J.; Doise, J.; Gronheid, R.; Ryckaert, J.; Vandenberghe, G.; Fenger, G.; Her, Y. J.; Cao, Y.

    2015-07-01

    In recent years, major advancements have been made in the directed self-assembly (DSA) of block copolymers (BCP). Insertion of DSA for IC fabrication is seriously considered for the 7 nm node. At this node the DSA technology could alleviate costs for multiple patterning and limit the number of masks that would be required per layer. At imec, multiple approaches for inserting DSA into the 7 nm node are considered. One of the most straightforward approaches for implementation would be for via patterning through templated DSA; a grapho-epitaxy flow using cylindrical phase BCP material resulting in contact hole multiplication within a litho-defined pre-pattern. To be implemented for 7 nm node via patterning, not only the appropriate process flow needs to be available, but also DSA-aware mask decomposition is required. In this paper, several aspects of the imec approach for implementing templated DSA will be discussed, including experimental demonstration of density effect mitigation, DSA hole pattern transfer, double DSA patterning, and the creation of a compact DSA model. Using an actual 7 nm node logic layout, we derive DSA-friendly design rules in a logical way from a lithographer's viewpoint. A concrete assessment is provided on how DSA-friendly design could potentially reduce the number of Via masks for a place-and-routed N7 logic pattern.

  8. Fuzzy Logic based Handoff Latency Reduction Mechanism in Layer 2 of Heterogeneous Mobile IPv6 Networks

    NASA Astrophysics Data System (ADS)

    Anwar, Farhat; Masud, Mosharrof H.; Latif, Suhaimi A.

    2013-12-01

    Mobile IPv6 (MIPv6) is one of the pioneer standards that support mobility in IPv6 environments. It has been designed to support different types of technologies for providing seamless communications in next generation networks. However, MIPv6 and subsequent standards have some limitations due to handoff latency. In this paper, a fuzzy logic based mechanism is proposed to reduce the handoff latency of MIPv6 for Layer 2 (L2) by scanning the Access Points (APs) while the Mobile Node (MN) is moving among different APs. Handoff latency occurs when the MN switches from one AP to another in L2. A heterogeneous network is considered in this research in order to reduce the delays in L2. Received Signal Strength Indicator (RSSI) and velocity of the MN are considered as the inputs of the fuzzy logic technique. This technique helps a fast-moving MN to measure optimum signal quality from APs based on fuzzy logic input rules and to make a list of interfaces. A suitable interface, such as WiFi, WiMAX or GSM, can then be selected from the list of available interfaces. Simulation results show 55% handoff latency reduction and 50% packet loss improvement in L2 compared to standard MIPv6.
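    The RSSI-and-velocity decision described in this abstract can be sketched as a tiny Mamdani-style inference. The membership ranges, rule weights, and the `handoff_score` helper below are illustrative assumptions, not the paper's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def handoff_score(rssi_dbm, speed_kmh):
    """Combine RSSI and MN velocity into a handoff urgency score in [0, 1].

    Membership ranges and rule outputs are assumed for illustration.
    """
    weak = tri(rssi_dbm, -100, -90, -75)    # weak signal -> handoff likely
    strong = tri(rssi_dbm, -80, -60, -40)   # strong signal -> stay on this AP
    fast = tri(speed_kmh, 20, 60, 120)      # fast MN -> hand off earlier
    # Rule firing strengths paired with crisp outputs, defuzzified
    # by a weighted average (a simple Mamdani-style centroid stand-in).
    rules = [(min(weak, fast), 1.0),        # weak AND fast -> hand off now
             (weak, 0.8),                   # weak alone    -> hand off soon
             (strong, 0.1)]                 # strong        -> stay
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

    A weak signal on a fast-moving node scores near 1 (urgent handoff), while a strong signal on a slow node scores near 0, which is the qualitative behavior the abstract describes.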

  9. Clinical Outcome Prediction in Aneurysmal Subarachnoid Hemorrhage Using Bayesian Neural Networks with Fuzzy Logic Inferences

    PubMed Central

    Lo, Benjamin W. Y.; Macdonald, R. Loch; Baker, Andrew; Levine, Mitchell A. H.

    2013-01-01

    Objective. The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). Methods. The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Results. Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odds ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. Discussion. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication. PMID:23690884

  10. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

    In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens each with a unique gold nanoparticle size distribution. Research Highlights: (1) a fuzzy logic analysis technique capable of characterizing AFM images of thin films; (2) the technique is applicable to different surfaces regardless of their densities; (3) it does not require manual adjustment of the algorithm parameters; (4) it can quantitatively capture differences between surfaces; (5) it yields more realistic structure boundaries compared to other methods.
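    The four-way point classification named in this abstract (top, bottom, uphill, downhill) can be sketched with simple fuzzy memberships over normalized height and local slope. The thresholds and membership shapes below are illustrative assumptions; the paper's FIE uses its own rule base:

```python
def classify_point(height_norm, slope):
    """Classify one AFM sample as top/bottom/uphill/downhill.

    height_norm is the height normalized to [0, 1]; slope is the local
    gradient. Membership ramps and cut-offs are assumed for illustration.
    """
    clip = lambda v: max(0.0, min(1.0, v))
    high = clip((height_norm - 0.5) / 0.3)   # membership in "high region"
    low = clip((0.5 - height_norm) / 0.3)    # membership in "low region"
    rising = clip(slope / 0.2)               # membership in "rising"
    falling = clip(-slope / 0.2)             # membership in "falling"
    flat = max(0.0, 1.0 - rising - falling)  # membership in "flat"
    # Rule strengths for the four labels used in the paper
    scores = {
        "top": min(high, flat),
        "bottom": min(low, flat),
        "uphill": rising,
        "downhill": falling,
    }
    return max(scores, key=scores.get)
```

    Grouping contiguous "top" points then yields structure candidates whose size and shape can be measured, which is the downstream step the abstract describes.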

  11. Competing Logics and Healthcare Comment on "(Re) Making the Procrustean Bed? Standardization and Customization as Competing Logics in Healthcare".

    PubMed

    Saks, Mike

    2017-08-20

    This paper offers a short commentary on the editorial by Mannion and Exworthy. The paper highlights the positive insights offered by their analysis into the tensions between the competing institutional logics of standardization and customization in healthcare, in part manifested in the conflict between managers and professionals, and endorses the plea of the authors for further research in this field. However, the editorial is criticized for its lack of a strong societal reference point, the comparative absence of focus on hybridization, and its failure to highlight structural factors impinging on the opposing logics in a broader neo-institutional framework. With reference to the Procrustean metaphor, it is argued that greater stress should be placed on the healthcare user in future health policy. Finally, the case of complementary and alternative medicine is set out which - while not explicitly mentioned in the editorial - most effectively concretizes the tensions at the heart of this analysis of healthcare. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  12. A Manual for Evaluating School Facilities.

    ERIC Educational Resources Information Center

    Reida, G.W.

    This survey manual evaluates the important points of functionality of school facilities in logical order. Instructions are given for the use of the manual, and separate sections present guidelines for evaluation of the following--(1) site, (2) building structure, (3) administrative spaces, (4) classrooms, (5) special rooms, (6) general service…

  13. (E)pistemological Awareness, Instantiation of Methods, and Uninformed Methodological Ambiguity in Qualitative Research Projects

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka; Yendol-Hoppey, Diane; Smith, Jason Jude; Hayes, Sharon B.

    2009-01-01

    This article explores epistemological awareness and instantiation of methods, as well as uninformed ambiguity, in qualitative methodological decision making and research reporting. The authors argue that efforts should be made to make the research process, epistemologies, values, methodological decision points, and argumentative logic open,…

  14. The Potential of Statement-Posing Tasks

    ERIC Educational Resources Information Center

    Yang, Kai-Lin

    2010-01-01

    This communication aims at revealing the potential of statement-posing tasks to facilitate students' thinking and strategies of understanding proof. Besides outlining the background of statement-posing tasks, four points were advanced as potential benefits of the tasks: (1) focusing on the logic of arguments in addition to the meaning of…

  15. The Programmable Calculator in the Classroom.

    ERIC Educational Resources Information Center

    Stolarz, Theodore J.

    The uses of programmable calculators in the mathematics classroom are presented. A discussion of the "microelectronics revolution" that has brought programmable calculators into our society is also included. It is pointed out that the logical or mental processes used to program the programmable calculator are identical to those used to program…

  16. Academic Discourses on School-Based Teacher Collaboration: Revisiting the Arguments

    ERIC Educational Resources Information Center

    Lavie, Jose Manuel

    2006-01-01

    Purpose: After decades arguing the necessity of transforming schools into collaborative workplaces, teacher collaboration has been taken up by various discursive logics offering different viewpoints of the concept. This article reviews some of these discourses and looks at their main arguments, pointing to the contradictions and tensions between…

  17. Intellectual College Development Related to Alumni Perceptions of Personal Growth

    ERIC Educational Resources Information Center

    Erwin, T. Dary

    2012-01-01

    Alumni self-ratings of their personal growth were linked to their intellectual development during college four to seven years earlier. Graduates that were satisfied with their personal growth in the arts, creative thinking, making logical inferences, learning independently, exercising initiative, and tolerating other points of view had higher…

  18. Hilbert's axiomatic method and Carnap's general axiomatics.

    PubMed

    Stöltzner, Michael

    2015-10-01

    This paper compares the axiomatic method of David Hilbert and his school with Rudolf Carnap's general axiomatics that was developed in the late 1920s, and that influenced his understanding of logic of science throughout the 1930s, when his logical pluralism developed. The distinct perspectives become visible most clearly in how Richard Baldus, along the lines of Hilbert, and Carnap and Friedrich Bachmann analyzed the axiom system of Hilbert's Foundations of Geometry—the paradigmatic example for the axiomatization of science. Whereas Hilbert's axiomatic method started from a local analysis of individual axiom systems in which the foundations of mathematics as a whole entered only when establishing the system's consistency, Carnap and his Vienna Circle colleague Hans Hahn instead advocated a global analysis of axiom systems in general. A primary goal was to evade, or formalize ex post, mathematicians' 'material' talk about axiom systems for such talk was held to be error-prone and susceptible to metaphysics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs

    NASA Astrophysics Data System (ADS)

    Purba, J. H.

    2018-02-01

    Initiating events of a nuclear power plant being evaluated need to be identified before applying probabilistic safety assessment to that plant. Various types of master logic diagrams (MLDs) have been proposed for searching initiating events of the next generation of nuclear power plants, which have limited data and operating experience. Those MLDs differ in the number of steps or levels and in the basis for developing them. This study proposed another type of MLD approach to find high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps starting from the top event, representing the final objective of the safety functions, to the basic event, representing the goal of the MLD development, which is an initiating event. The application of the proposed approach to search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirmed that the proposed MLD is feasible for finding HTGR initiating events.

  20. Vapor cycle energy system for implantable circulatory assist devices. Final summary May--Oct 1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watelet, R.P.; Ruggles, A.E.; Hagen, K.G.

    1977-03-01

    The report describes the development status of a heart assist system driven by a nuclear-fueled, electronically controlled vapor cycle engine termed the tidal regenerator engine (TRE). The TRE pressurization is controlled by a torque motor coupled to a displacer. The electrical power for the sensor, electronic logic and actuator is provided by thermoelectric modules interposed between the engine superheater and boiler. The TRE is direct-coupled to an assist blood pump which also acts as a blood-cooled heat exchanger, pressure-volume transformer and sensor for the electronic logic. Engine cycle efficiency in excess of 14% has been demonstrated routinely. Overall system efficiency of over 9% on 33 watts has been demonstrated (implying 13% engine cycle efficiency). A binary version of this engine in the annular configuration is now being tested. The preliminary tests demonstrated 10% cycle efficiency on the first buildup, which ran well and started easily.

  1. Stress in dilute suspensions

    NASA Technical Reports Server (NTRS)

    Passman, Stephen L.

    1989-01-01

    Generally, two types of theory are used to describe the field equations for suspensions. The so-called postulated equations are based on the kinetic theory of mixtures, which logically should give reasonable equations for solutions. The basis for the use of such theory for suspensions is tenuous, though it at least gives a logical path for mathematical arguments. It has the disadvantage that it leads to a system of equations which is underdetermined, in a sense that can be made precise. On the other hand, the so-called averaging theory starts with a determined system, but the very process of averaging renders the resulting system underdetermined. A third type of theory is proposed in which the kinetic theory of gases is used to motivate continuum equations for the suspended particles. This entails an interpretation of the stress in the particles that is different from the usual one. Classical theory is used to describe the motion of the suspending medium. The result is a determined system for a dilute suspension. Extension of the theory to more concentrated systems is discussed.

  2. Identifying Environmental and Social Factors Predisposing to Pathological Gambling Combining Standard Logistic Regression and Logic Learning Machine.

    PubMed

    Parodi, Stefano; Dosi, Corrado; Zambon, Antonella; Ferrari, Enrico; Muselli, Marco

    2017-12-01

    Identifying potential risk factors for problem gambling (PG) is of primary importance for planning preventive and therapeutic interventions. We illustrate a new approach based on the combination of standard logistic regression and an innovative method of supervised data mining (Logic Learning Machine or LLM). Data were taken from a pilot cross-sectional study to identify subjects with PG behaviour, assessed by two internationally validated scales (SOGS and Lie/Bet). Information was obtained from 251 gamblers recruited in six betting establishments. Data on socio-demographic characteristics, lifestyle and cognitive-related factors, and type, place and frequency of preferred gambling were obtained by a self-administered questionnaire. The following variables associated with PG were identified: instant gratification games, alcohol abuse, cognitive distortion, illegal behaviours and having started gambling with a relative or a friend. Furthermore, the combination of LLM and LR indicated the presence of two different types of PG, namely: (a) daily gamblers, more prone to illegal behaviour, with poor money management skills and who started gambling at an early age, and (b) non-daily gamblers, characterised by superstitious beliefs and a higher preference for immediate reward games. Finally, instant gratification games were strongly associated with the number of games usually played. Studies on gamblers who habitually frequent betting shops are rare. The finding of different types of PG by habitual gamblers deserves further analysis in larger studies. Advanced data mining algorithms, like LLM, are powerful tools and potentially useful in identifying risk factors for PG.

  3. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  4. Thou shalt not take sides: Cognition, Logic and the need for changing how we believe

    NASA Astrophysics Data System (ADS)

    Martins, Andre

    2016-03-01

    We believe in many different ways. One very common one is by supporting ideas we like. We label them correct and we act to dismiss doubts about them. We take sides about ideas and theories as if that was the right thing to do. And yet, from a rational point of view, this type of support and belief is not justifiable. The best we can hope when describing the real world, as far as we know today, is to have probabilistic knowledge. In practice, estimating a real probability can be too hard to achieve but that just means we have more uncertainty, not less. There are ideas we defend that define, in our minds, our own identity. And recent experiments have been showing that we stop being able to analyze competently those propositions we hold so dearly. In this paper, I gather the evidence we have about taking sides and present the obvious but unseen conclusion that these facts combined mean that we should actually never believe in anything about the real world, except in a probabilistic way. We must actually never take sides, since taking sides compromises our ability to seek the most correct description of the world. That means we need to start reformulating the way we debate ideas, from our teaching to our political debates. Here, I will show the logical and experimental basis of this conclusion. I will also show, by presenting new models for the evolution of opinions, that our desire to have something to believe is probably behind the emergence of extremism in debates. And we will see how this problem can even have an impact in the reliability of whole scientific fields. The crisis around p-values is discussed and much better understood in light of this paper's results. Finally, I will debate possible consequences and ideas on how to deal with this problem.

  5. Meeting the Deadline: Why, When and How

    NASA Technical Reports Server (NTRS)

    Dignum, Frank; Broersen, Jan; Dignum, Virginia; Meyer, John-Jules

    2004-01-01

    A normative system is defined as any set of interacting agents whose behavior can usefully be regarded as norm-directed. Most organizations, and more specifically institutions, fall under this definition. Interactions in these normative systems are regulated by normative templates that describe desired behavior in terms of deontic concepts (obligations, prohibitions and permissions), deadlines, violations and sanctions. Agreements between agents, and between an agent and the society, can then be specified by means of contracts. Contracts provide flexible but verifiable means to integrate society requirements and agent autonomy. and are an adequate means for the explicit specification of interactions. From the society perspective, it is important that these contracts adhere to the specifications described in the model of the organization. If we want to automate such verifications, we have to formalize the languages used for contracts and for the specification of organizations. The logic LCR is based on deontic temporal logic. LCR is an expressive language for describing interaction in multi-agent systems, including obligations with deadlines. Deadlines are important norms in most interactions between agents. Intuitively, a deadline states that an agent should perform an action before a certain point in time. The obligation to perform the action starts at the moment the deadline becomes active. E.g. when a contract is signed or approved. If the action is not performed in time a violation of the deadline occurs. It can be specified independently what measure has to be taken in this case. In this paper we investigate the deadline concept in more detail. The paper is organized as follows. Section 2 defines the variant of CTL we use. In section 3, we discuss the basic intuitions of deadlines. Section 4 presents a first intuitive formalization for deadlines. In section 5, we look at a more complex model for deadlines trying to catch some more practical aspects. 
Finally, in section 6 we present issues for future work and our conclusions.
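    The deadline intuition in this abstract (an obligation to perform an action becomes active at some moment, and a violation occurs if the action is not performed before the deadline) can be sketched as a trace checker. This is a hypothetical illustration on finite traces, not the LCR/CTL semantics developed in the paper:

```python
def deadline_violations(trace, obligations):
    """Detect deadline violations on a finite trace of (time, action) events.

    Each obligation is (action, activation_time, deadline): the action must
    occur at some time t with activation_time <= t <= deadline; otherwise
    a violation (action, deadline) is reported. Names are illustrative.
    """
    violations = []
    for action, start, deadline in obligations:
        met = any(a == action and start <= t <= deadline for t, a in trace)
        if not met:
            violations.append((action, deadline))
    return violations
```

    For instance, if a contract obliges "pay" by time 4 but the trace only records "pay" at time 5, the checker flags the violation, after which a separately specified sanction could be applied, as the abstract notes.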

  6. Logical optical line terminal technologies towards flexible and highly reliable metro- and access-integrated networks

    NASA Astrophysics Data System (ADS)

    Okamoto, Satoru; Sato, Takehiro; Yamanaka, Naoaki

    2017-01-01

    In this paper, flexible and highly reliable metro and access integrated networks with network virtualization and software defined networking technologies will be presented. Logical optical line terminal (L-OLT) technologies and active optical distribution networks (ODNs) are the key to introduce flexibility and high reliability into the metro and access integrated networks. In the Elastic Lambda Aggregation Network (EλAN) project which was started in 2012, a concept of the programmable optical line terminal (P-OLT) has been proposed. A role of the P-OLT is providing multiple network services that have different protocols and quality of service requirements by single OLT box. Accommodated services will be Internet access, mobile front-haul/back-haul, data-center access, and leased line. L-OLTs are configured within the P-OLT box to support the functions required for each network service. Multiple P-OLTs and programmable optical network units (P-ONUs) are connected by the active ODN. Optical access paths which have flexible capacity are set on the ODN to provide network services from L-OLT to logical ONUs (L-ONUs). The L-OLT to L-ONU path on the active ODN provides a logical connection. Therefore, introducing virtualization technologies becomes possible. One example is moving an L-OLT from one P-OLT to another P-OLT like a virtual machine. This movement is called L-OLT migration. The L-OLT migration provides flexible and reliable network functions such as energy saving by aggregating L-OLTs to a limited number of P-OLTs, and network wide optical access path restoration. Other L-OLT virtualization technologies and experimental results will be also discussed in the paper.

  7. Stock and option portfolio using fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Sumarti, Novriana; Wahyudi, Nanang

    2014-03-01

    Fuzzy Logic in decision-making processes has been widely implemented in various problems in industry. It is a theory of imprecision and uncertainty that is not based on probability theory. Fuzzy Logic adds values of degree between absolute true and absolute false. It starts with and builds on a set of human language rules supplied by the user. The fuzzy systems convert these rules to their mathematical equivalents. This could simplify the job of the system designer and the computer, and results in much more accurate representations of the way systems behave in the real world. In this paper we examine the decision making process of stock and option trading by the usage of MACD (Moving Average Convergence Divergence) technical analysis and Option Pricing with a Fuzzy Logic approach. MACD technical analysis is for the prediction of the trends of underlying stock prices, such as bearish (going downward), bullish (going upward), and sideways. By using the Fuzzy C-Means technique and a Mamdani Fuzzy Inference System, we define the decision output such that when the value of MACD is high the decision is "Strong Sell", and when the value of MACD is low the decision is "Strong Buy". We also implement the fuzzification of the Black-Scholes option-pricing formula. The stock and options methods are implemented on a portfolio of one stock and its options. Even though the values of input data, such as interest rates, stock price and its volatility, cannot be obtained accurately, these fuzzy methods can give a belief degree for the calculated Black-Scholes price so we can make decisions on option trading. The results show the good capability of the methods in the prediction of stock price trends. The performance of the simulated portfolio for a particular period of time also shows good return.
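    The MACD-to-signal pipeline described in this abstract can be sketched as follows. The EMA lengths are the conventional 12/26; the membership scale and the 0.5 cut-offs in `fuzzy_signal` are illustrative assumptions, not the paper's Mamdani rule base:

```python
def macd(prices, fast=12, slow=26):
    """MACD line: fast EMA minus slow EMA of a price series."""
    def ema(xs, n):
        k = 2.0 / (n + 1)
        e = xs[0]
        out = []
        for x in xs:
            e = k * x + (1 - k) * e
            out.append(e)
        return out
    return [f - s for f, s in zip(ema(prices, fast), ema(prices, slow))]

def fuzzy_signal(macd_value, scale=1.0):
    """Map a MACD value to a trading signal via fuzzy memberships.

    Following the abstract's rules (high MACD -> "Strong Sell", low MACD ->
    "Strong Buy"); scale and cut-offs here are assumed for illustration.
    """
    x = macd_value / scale
    high = max(0.0, min(1.0, x))    # membership in "MACD is high"
    low = max(0.0, min(1.0, -x))    # membership in "MACD is low"
    if high > 0.5:
        return "Strong Sell"
    if low > 0.5:
        return "Strong Buy"
    return "Hold"
```

    A flat price series gives a MACD of zero and a "Hold" signal; strongly positive or negative MACD values cross the membership cut-offs and trigger the sell or buy decisions.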

  8. Memristor-based programmable logic array (PLA) and analysis as Memristive networks.

    PubMed

    Lee, Kwan-Hee; Lee, Sang-Jin; Kim, Seok-Man; Cho, Kyoungrok

    2013-05-01

    The memristor, theorized by Chua in 1971, has the potential to dramatically influence the way electronic circuits are designed. It is a two-terminal device whose resistance state depends on the history of charge flow brought about by the voltage applied across its terminals, and can therefore be thought of as a special case of a reconfigurable resistor. Nanoscale devices using dense and regular fabrics such as the memristor cross-bar are a promising new architecture for System-on-Chip (SoC) implementations, in terms not only of the integration density the technology can offer but also of improved performance and reduced power dissipation. A memristor can switch between high and low resistance states in a cross-bar circuit configuration. The cross-bars are formed from an array of vertical conductive nano-wires crossing a second array of horizontal conductive wires. Memristors are realized at the intersections of the two wire arrays through appropriate processing technology, such that any particular wire in the vertical array can be connected to a wire in the horizontal array by switching the resistance of that intersection to a low state while the other cross-points remain in a high resistance state. However, the approach introduces a number of challenges. The lack of voltage gain prevents logic from being cascaded, and voltage-level degradation affects the robustness of operation. Moreover, the cross-bars introduce sneak current paths when two or more cross-points are connected through switched memristors. In this paper, we propose a memristor-based programmable logic array (PLA) architecture and develop an analytical model to analyze the logic levels on the memristive networks. The proposed PLA architecture supports a maximum of 12 inputs and can be cascaded for more input variables, with a memristor R(off)/R(on) ratio in the range from 55 to 160.
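The kind of logic-level analysis the abstract describes can be illustrated with ideal nodal analysis of a single PLA output column: each input wire drives the column through a memristor at R_on (programmed) or R_off (unprogrammed), and a pull-down resistor sets the idle level. Sneak paths are ignored and all component values are hypothetical, not the paper's:

```python
def column_voltage(inputs, programmed, r_on, r_off, r_pd, vdd=1.0):
    """Ideal nodal solution for one crossbar output column.

    inputs[i] is the logic value driving input wire i (0 or 1);
    programmed[i] selects R_on (True) or R_off (False) for that
    crosspoint; r_pd is a pull-down on the column. Sneak currents
    through other columns are deliberately ignored in this sketch.
    """
    g_sum = 1.0 / r_pd          # pull-down conductance to ground
    i_sum = 0.0                 # conductance-weighted input sum
    for v_in, prog in zip(inputs, programmed):
        g = 1.0 / (r_on if prog else r_off)
        g_sum += g
        i_sum += g * (vdd if v_in else 0.0)
    return i_sum / g_sum        # V_out = (sum G_i V_i) / (sum G_i + G_pd)
```

With a large R_off/R_on ratio the high and low output levels stay well separated; as the ratio shrinks, leakage through unprogrammed crosspoints narrows that margin, which is why the achievable ratio bounds the usable fan-in.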

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanchurin, Vitaly, E-mail: vvanchur@d.umn.edu

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape, and CM machines take a CO's Turing number (also known as its description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO Turing numbers as input, but output one if the CO machines are in the same equivalence class and zero otherwise. We argue that CS machines are more fundamental than CM machines and thus should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine that discriminates between two classes of CO machines: mortal machines, which halt in finite time, and immortal machines, which run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines that compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.

  10. The Strengths and Weaknesses of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2002-01-01

    The increasing complexity of many safety-critical systems poses new problems for mishap analysis. Techniques developed in the sixties and seventies cannot easily scale up to analyze incidents involving tightly integrated software and hardware components. Similarly, the realization that many failures have systemic causes has widened the scope of many mishap investigations. Organizations, including NASA and the NTSB, have responded by starting research and training initiatives to ensure that their personnel are well equipped to meet these challenges. One strand of research has identified a range of mathematically based techniques that can be used to reason about the causes of complex, adverse events. The proponents of these techniques have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Mathematical proofs can reduce the bias that is often perceived to affect the interpretation of adverse events. Others have opposed the introduction of these techniques by identifying social and political aspects of incident investigation that cannot easily be reconciled with a logic-based approach. Traditional theorem proving mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators routinely use in their analysis of adverse events. This paper summarizes some of the benefits that logics provide, describes their weaknesses, and proposes a number of directions for future research.

  11. A method of inferring collision ratio based on maneuverability of own ship under critical collision conditions

    NASA Astrophysics Data System (ADS)

    You, Youngjun; Rhee, Key-Pyo; Ahn, Kyoungsoo

    2013-06-01

    In constructing a collision avoidance system, it is important to determine the time to start the collision avoidance maneuver. Many researchers have attempted to formulate various indices using a range of techniques. Among these indices, the collision risk obtained by combining Distance to the Closest Point of Approach (DCPA) and Time to the Closest Point of Approach (TCPA) information with fuzzy theory is the most widely used. However, this collision risk has a limitation in that the membership functions of DCPA and TCPA are determined empirically. In addition, the collision risk cannot account for several critical collision conditions in which the target ship fails to take appropriate action. It is therefore necessary to design a new concept based on logical approaches. In this paper, a collision ratio is proposed: the expected ratio of unavoidable paths to total paths under suitably characterized operating conditions. Total paths are determined by considering categories such as action space and avoidance methodology. The International Regulations for Preventing Collisions at Sea (1972) and collision avoidance rules (2001) are considered to resolve the slower ship's dilemma. Different methods based on a constant-speed model and a simulated-speed model are used to calculate the relative positions of own ship and the target ship. In the simulated-speed model, fuzzy control is applied to the determination of the command rudder angle. For various encounter situations, the time histories of the collision ratio based on the simulated-speed model are compared with those based on the constant-speed model.

  12. 34 CFR 200.16 - Starting points.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Using data from the 2001-2002 school year, each State must establish starting points in reading/language... of proficient students in the school that represents 20 percent of the State's total enrollment among all schools ranked by the percentage of students at the proficient level. The State must determine...

  13. Starting geometry creation and design method for freeform optics.

    PubMed

    Bauer, Aaron; Schiesser, Eric M; Rolland, Jannick P

    2018-05-01

    We describe a method for designing freeform optics based on the aberration theory of freeform surfaces that guides the development of a taxonomy of starting-point geometries with an emphasis on manufacturability. An unconventional approach to the optimization of these starting designs wherein the rotationally invariant 3rd-order aberrations are left uncorrected prior to unobscuring the system is shown to be effective. The optimal starting-point geometry is created for an F/3, 200 mm aperture-class three-mirror imager and is fully optimized using a novel step-by-step method over a 4 × 4 degree field-of-view to exemplify the design method. We then optimize an alternative starting-point geometry that is common in the literature but was quantified here as a sub-optimal candidate for optimization with freeform surfaces. A comparison of the optimized geometries shows the performance of the optimal geometry is at least 16× better, which underscores the importance of the geometry when designing freeform optics.

  14. Good and Bad Public Prose.

    ERIC Educational Resources Information Center

    Cockburn, Stewart

    1969-01-01

    The basic requirements of all good prose are clarity, accuracy, brevity, and simplicity. Especially in public prose--in which the meaning is the crux of the article or speech--concise, vigorous English demands a minimum of adjectives, a maximum use of the active voice, nouns carefully chosen, a logical argument with no labored or obscure points,…

  15. Religion, Liberalism and Education: A Response to Roger Trigg

    ERIC Educational Resources Information Center

    Carr, David

    2008-01-01

    Although he shares many of Professor Roger Trigg's views about the logical character and human significance of religion and religious discourse, including the view that religious claims are matters for rational understanding and appraisal, the author expresses difficulties with key points in Trigg's diagnosis and critique of what he takes to be…

  16. Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach

    ERIC Educational Resources Information Center

    Stevenson, Glenn A.

    2012-01-01

    For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…

  17. Wilderness Crisis Management. Explore Magazine Technical Series No. 11.

    ERIC Educational Resources Information Center

    Raffan, James

    This paper deals with managing a crisis in a wilderness situation. The terms "crisis" and "turning point" are used to describe what is more traditionally called an accident. Using these terms introduces the idea that crisis events occur as logical consequences of preceding decisions, errors, or omissions, not as the result of…

  18. Educational Imperatives of the Evolution of Consciousness: The Integral Visions of Rudolf Steiner and Ken Wilber

    ERIC Educational Resources Information Center

    Gidley, Jennifer M.

    2007-01-01

    Rudolf Steiner and Ken Wilber claim that human consciousness is evolving beyond the "formal", abstract, intellectual mode toward a "post-formal", integral mode. Wilber calls this "vision-logic" and Steiner calls it "consciousness/spiritual soul". Both point to the emergence of more complex, dialectical,…

  19. Security risk assessment: applying the concepts of fuzzy logic.

    PubMed

    Bajpai, Shailendra; Sachdeva, Anish; Gupta, J P

    2010-01-15

    Chemical process industries (CPI) handling hazardous chemicals in bulk can be attractive targets for deliberate adversarial actions by terrorists, criminals and disgruntled employees. It is therefore imperative to have a comprehensive security risk management programme, including effective security risk assessment techniques. In an earlier work, it was shown that security risk assessment can be done by conducting threat and vulnerability analysis or by developing a Security Risk Factor Table (SRFT). HAZOP-type vulnerability assessment sheets can be developed that are scenario based. The SRFT model uses important security-risk-bearing factors such as location, ownership, visibility, inventory, etc. In this paper, the earlier SRFT model is modified using the concepts of fuzzy logic. In the modified SRFT model, two linguistic fuzzy scales (three-point and four-point) are devised based on trapezoidal fuzzy numbers. The human subjectivity of the different experts associated with the previous SRFT model is tackled by mapping their scores onto the newly devised fuzzy scales. Finally, the fuzzy score thus obtained is defuzzified to get the results. A test case of a refinery is used to explain the method, and the results are compared with the earlier work.
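Centroid defuzzification of a trapezoidal fuzzy number, the final step in the modified SRFT model, has a standard closed form. The breakpoints of the paper's three- and four-point scales are not reproduced here; any values below are illustrative:

```python
def trapezoid_centroid(a, b, c, d):
    """Centroid (x-coordinate) of a trapezoidal fuzzy number.

    Support is [a, d] and core (membership 1) is [b, c], with
    a <= b <= c <= d. This is the standard closed-form centroid
    used in centroid defuzzification.
    """
    den = 3.0 * ((c + d) - (a + b))
    if den == 0:                 # degenerate (singleton) fuzzy number
        return (a + d) / 2.0
    num = (c**2 + d**2 + c*d) - (a**2 + b**2 + a*b)
    return num / den
```

For a symmetric trapezoid the centroid lands at the midpoint of the support, which is a quick sanity check on any implementation.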

  20. Control Strategies for Smoothing of Output Power of Wind Energy Conversion Systems

    NASA Astrophysics Data System (ADS)

    Pratap, Alok; Urasaki, Naomitsu; Senju, Tomonobu

    2013-10-01

    This article presents a control method for output power smoothing of a wind energy conversion system (WECS) with a permanent magnet synchronous generator (PMSG), using the inertia of the wind turbine and pitch control. The WECS used in this article adopts an AC-DC-AC converter system. The generator-side converter controls the torque of the PMSG, while the grid-side inverter controls the DC-link and grid voltages. For the generator-side converter, the torque command is determined using fuzzy logic. The inputs to the fuzzy logic are the operating point of the rotational speed of the PMSG and the difference between the wind turbine torque and the generator torque. With the proposed method, the generator torque is smoothed, and the kinetic energy stored in the inertia of the wind turbine can be utilized to smooth the output power fluctuations of the PMSG. In addition, the wind turbine's shaft stress is mitigated compared with conventional maximum power point tracking control. The effectiveness of the proposed method is verified by numerical simulations.

  1. PLL Based Energy Efficient PV System with Fuzzy Logic Based Power Tracker for Smart Grid Applications.

    PubMed

    Rohini, G; Jamuna, V

    This work aims at improving the dynamic performance of the available photovoltaic (PV) system and maximizing the power obtained from it through the use of cascaded converters with intelligent control techniques. A fuzzy logic based maximum power point tracking technique is embedded in the first conversion stage to obtain the maximum power from the available PV array. The cascading of a second converter is needed to maintain the terminal voltage at grid potential. The soft-switching region of the three-stage converter is increased with the proposed phase-locked-loop based control strategy. The proposed strategy leads to reductions in ripple content, component ratings, and switching losses. The PV array is mathematically modeled, the system is simulated, and the results are analyzed. The performance of the system is compared with existing maximum power point tracking algorithms. The authors have endeavored to accomplish maximum power and improved reliability for the same insolation of the PV system. Hardware results for the system are also discussed to confirm the validity of the simulation results.
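A generic fuzzy-scaled hill-climbing update gives the flavor of a fuzzy MPPT stage. This is not the paper's rule base; the direction is driven by the power-voltage slope, and a coarse small/medium/large magnitude classification (with illustrative thresholds) scales the duty-cycle step so it shrinks near the maximum power point:

```python
def fuzzy_mppt_step(dP, dV, step_max=0.01):
    """One fuzzy-scaled hill-climbing update of the converter duty cycle.

    dP and dV are the changes in PV power and voltage since the last
    sample. The sign of dP/dV picks the perturbation direction; a
    crude singleton "rule base" (thresholds are hypothetical) picks
    the step magnitude, so steps shrink as the slope flattens near
    the maximum power point.
    """
    if dV == 0:
        return 0.0
    slope = dP / dV
    mag = abs(slope)
    if mag < 0.1:        # near the peak: small step
        scale = 0.1
    elif mag < 1.0:      # moderate slope: medium step
        scale = 0.5
    else:                # far from the peak: full step
        scale = 1.0
    return step_max * scale * (1.0 if slope > 0 else -1.0)
```

The adaptive step size is the usual advantage claimed for fuzzy MPPT over fixed-step perturb-and-observe: large steps far from the peak, small steps (hence less ripple) near it.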

  2. PLL Based Energy Efficient PV System with Fuzzy Logic Based Power Tracker for Smart Grid Applications

    PubMed Central

    Rohini, G.; Jamuna, V.

    2016-01-01

    This work aims at improving the dynamic performance of the available photovoltaic (PV) system and maximizing the power obtained from it through the use of cascaded converters with intelligent control techniques. A fuzzy logic based maximum power point tracking technique is embedded in the first conversion stage to obtain the maximum power from the available PV array. The cascading of a second converter is needed to maintain the terminal voltage at grid potential. The soft-switching region of the three-stage converter is increased with the proposed phase-locked-loop based control strategy. The proposed strategy leads to reductions in ripple content, component ratings, and switching losses. The PV array is mathematically modeled, the system is simulated, and the results are analyzed. The performance of the system is compared with existing maximum power point tracking algorithms. The authors have endeavored to accomplish maximum power and improved reliability for the same insolation of the PV system. Hardware results for the system are also discussed to confirm the validity of the simulation results. PMID:27294189

  3. Application of fuzzy logic to the control of wind tunnel settling chamber temperature

    NASA Technical Reports Server (NTRS)

    Gwaltney, David A.; Humphreys, Gregory L.

    1994-01-01

    The application of fuzzy logic controllers (FLCs) to the control of nonlinear processes, typically controlled by a human operator, is a topic of much study. Recent application of a microprocessor-based FLC to the control of temperature processes in several wind tunnels has proven very successful. Controlling temperature processes in the wind tunnels requires the ability to monitor temperature feedback from several points and to accommodate varying operating conditions. The FLC has an intuitive and easily configurable structure that provides the flexibility required for this task. The design and implementation of the FLC are presented, along with process data from the wind tunnels under automatic control.

  4. Programmable DNA switches and their applications.

    PubMed

    Harroun, Scott G; Prévost-Tremblay, Carl; Lauzon, Dominic; Desrosiers, Arnaud; Wang, Xiaomeng; Pedro, Liliana; Vallée-Bélisle, Alexis

    2018-03-08

    DNA switches are ideally suited for numerous nanotechnological applications, and increasing efforts are being directed toward their engineering. In this review, we discuss how to engineer these switches starting from the selection of a specific DNA-based recognition element, to its adaptation and optimisation into a switch, with applications ranging from sensing to drug delivery, smart materials, molecular transporters, logic gates and others. We provide many examples showcasing their high programmability and recent advances towards their real life applications. We conclude with a short perspective on this exciting emerging field.

  5. Approaches to Plant Hydrogen and Oxygen Isoscapes Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, Jason B.; Kreuzer-Martin, Helen W.; Ehleringer, James

    2009-12-01

    Plant hydrogen and oxygen isoscapes have been utilized to address important and somewhat disparate research goals. The isotopic composition of leaf water affects the isotopic composition of atmospheric CO2 and O2 and is a logical starting point for understanding the isotopic composition of plant organic compounds since photosynthesis occurs in the leaf water environment. Leaf water isoscapes have been produced largely as part of efforts to understand atmospheric gas isotopic composition. The isotopic composition of plant organic matter has also been targeted for its potential to serve as a proxy for past environmental conditions. Spatially distributed sampling and modeling of modern plant H & O isoscapes can improve our understanding of the controls of the isotope ratios of compounds such as cellulose or n-alkanes from plants and therefore their utility for paleoreconstructions. Spatially varying plant hydrogen and oxygen isotopes have promise for yielding geographic origin information for a variety of plant products, including objects of criminal forensic interest or food products. The future has rich opportunities for the continued development of mechanistic models, methodologies for the generation of hydrogen and oxygen isoscapes, and cross-disciplinary interactions as these tools for understanding are developed, shared, and utilized to answer large-scale questions.

  6. The case for multimodal analysis of atypical interaction: questions, answers and gaze in play involving a child with autism.

    PubMed

    Muskett, Tom; Body, Richard

    2013-01-01

    Conversation analysis (CA) continues to accrue interest within clinical linguistics as a methodology that can enable elucidation of structural and sequential orderliness in interactions involving participants who produce ostensibly disordered communication behaviours. However, it can be challenging to apply CA to re-examine clinical phenomena that have initially been defined in terms of linguistics, as a logical starting point for analysis may be to focus primarily on the organisation of language ("talk") in such interactions. In this article, we argue that CA's methodological power can only be fully exploited in this research context when a multimodal analytic orientation is adopted, where due consideration is given to participants' co-ordinated use of multiple semiotic resources including, but not limited to, talk (e.g., gaze, embodied action, object use and so forth). To evidence this argument, a two-layered analysis of unusual question-answer sequences in a play episode involving a child with autism is presented. It is thereby demonstrated that only when the scope of enquiry is broadened to include gaze and other embodied action can an account be generated of orderliness within these sequences. This finding has important implications for CA's application as a research methodology within clinical linguistics.

  7. Distribution of immunodeficiency fact files with XML--from Web to WAP.

    PubMed

    Väliaho, Jouni; Riikonen, Pentti; Vihinen, Mauno

    2005-06-26

    Although biomedical information is growing rapidly, it is difficult to find and retrieve validated data, especially for rare hereditary diseases. There is an increased need for services capable of integrating and validating information as well as providing it in a logically organized structure. An XML-based language enables the creation of open source databases for storage, maintenance and delivery across different platforms. Here we present a new data model called the fact file and an XML-based specification, the Inherited Disease Markup Language (IDML), that were developed to facilitate disease information integration, storage and exchange. The data model was applied to primary immunodeficiencies, but it can be used for any hereditary disease. Fact files integrate biomedical, genetic and clinical information related to hereditary diseases. IDML and fact files were used to build a comprehensive Web- and WAP-accessible knowledge base, the ImmunoDeficiency Resource (IDR), available at http://bioinf.uta.fi/idr/. A fact file is a user-oriented interface that serves as a starting point for exploring information on hereditary diseases. IDML enables the seamless integration and presentation of genetic and disease information resources on the Internet, and can be used to build information services for all kinds of inherited diseases. The open source specification and related programs are available at http://bioinf.uta.fi/idml/.
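A toy fact-file fragment illustrates the idea of an XML data model that integrates genetic and clinical facts and can be parsed with the Python standard library. The element and attribute names below are hypothetical, not the actual IDML vocabulary defined at bioinf.uta.fi/idml:

```python
import xml.etree.ElementTree as ET

# Hypothetical fact-file fragment; real IDML element names may differ.
FACT_FILE = """
<factfile disease="X-linked agammaglobulinemia">
  <genetics gene="BTK" inheritance="X-linked"/>
  <clinical>
    <feature>recurrent bacterial infections</feature>
  </clinical>
</factfile>
"""

def gene_of(xml_text):
    """Parse a fact-file document and return the associated gene symbol."""
    root = ET.fromstring(xml_text)
    return root.find("genetics").get("gene")
```

Because the facts live in a declarative XML structure rather than in page markup, the same document can be rendered for the Web, for WAP, or consumed programmatically, which is the portability argument the abstract makes.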

  8. Extraction Optimization for Obtaining Artemisia capillaris Extract with High Anti-Inflammatory Activity in RAW 264.7 Macrophage Cells

    PubMed Central

    Jang, Mi; Jeong, Seung-Weon; Kim, Bum-Keun; Kim, Jong-Chan

    2015-01-01

    Plant extracts have been used as herbal medicines to treat a wide variety of human diseases. We used response surface methodology (RSM) to optimize the Artemisia capillaris Thunb. extraction parameters (extraction temperature, extraction time, and ethanol concentration) for obtaining an extract with high anti-inflammatory activity at the cellular level. The optimum ranges for the extraction parameters were predicted by superimposing 4-dimensional response surface plots of the lipopolysaccharide- (LPS-) induced PGE2 and NO production and by cytotoxicity of A. capillaris Thunb. extracts. The ranges of extraction conditions used for determining the optimal conditions were extraction temperatures of 57–65°C, ethanol concentrations of 45–57%, and extraction times of 5.5–6.8 h. On the basis of the results, a model with a central composite design was considered to be accurate and reliable for predicting the anti-inflammation activity of extracts at the cellular level. These approaches can provide a logical starting point for developing novel anti-inflammatory substances from natural products and will be helpful for the full utilization of A. capillaris Thunb. The crude extract obtained can be used in some A. capillaris Thunb.-related health care products. PMID:26075271

  9. Informational analysis involving application of complex information system

    NASA Astrophysics Data System (ADS)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied to internal audit work that integrates the accounting field with the information systems field. Technological advancements can improve the work performed by internal audit. Thus we aim to find, in the complex information system, priorities for the internal audit work of a highly important private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative variables via matrix intersection. By means of a case study, in which data were collected through an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we conclude that, starting from these information systems, audit can identify priorities in its work program. Together with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward attaining the objectives of the organization.

  10. A logical starting point for developing priorities for lizard and snake ecotoxicology: a review of available data.

    PubMed

    Campbell, Kym Rouse; Campbell, Todd S

    2002-05-01

    Reptiles, specifically lizards and snakes, usually are excluded from environmental contamination studies and ecological risk assessments. This brief summary of available lizard and snake environmental contaminant data is presented to assist in the development of priorities for lizard and snake ecotoxicology. Most contaminant studies were not conducted recently, list animals found dead or dying after pesticide application, report residue concentrations after pesticide exposure, compare contaminant concentrations in animals from different areas, compare residue concentrations found in different tissues and organs, or compare changes in concentrations over time. The biological significance of the contaminant concentrations is rarely studied. A few recent studies, especially those conducted on modern pesticides, link the contaminant effects with exposure concentrations. Nondestructive sampling techniques for determining organic and inorganic contaminant concentrations in lizards and snakes recently have been developed. Studies that relate exposure, concentration, and effects of all types of environmental contaminants on lizards and snakes are needed. Because most lizards eat insects, studies on the exposure, effects, and accumulation of insecticides in lizards, and their predators, should be a top priority. Because all snakes are upper-trophic-level carnivores, studies on the accumulation and effects of contaminants that are known to bioaccumulate or biomagnify up the food chain should be the top priority.

  11. A Mnemonic for the Inositols

    NASA Astrophysics Data System (ADS)

    Painter, Terence J.

    1996-10-01

    The mnemonic derives from the mythical tale of Scylla and Charybdis in Homer's Odyssey (chapter 12). It takes the form of an imaginary headline in a newspaper: SCYLLA MEETS CHARYBDIS - EPIC NEWS MUCH ALARMS SICILY. The first two or three letters in each of these eight words remind the user that the nine configurational prefixes are scyllo-, meso-, (or myo-), chiro- [(+) and (-)], epi-, neo-, muco-, allo-, and cis-, respectively. The mnemonic also arranges the prefixes in an order that allows the configurations to be derived in a logical manner by performing a defined sequence of imaginary configurational inversions (epimerizations) around a cyclohexane ring. The all-equatorial, chair conformation of scyllo-inositol is selected as the starting point, and the sequence of inversions is defined by a systematic permutation of possibilities for performing one, two or three inversions in succession (1; 1 and 2; 1 and 3; 1 and 4; 1, 2 and 3; 1, 2 and 4; and finally 1, 3 and 5). In the case of the two chiro-inositols, the enantiomeric form is determined simply by the direction (clockwise or counterclockwise) around the ring in which the imaginary inversions are performed. This also applies formally to allo-inositol, but in that case the two optical enantiomers are isoenergetic chair conformers in rapid equilibrium.
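The inversion bookkeeping in the mnemonic can be checked combinatorially: starting from the all-equatorial scyllo ring and applying the stated inversion sequences yields eight distinct axial/equatorial patterns. This sketch models only the equatorial/axial flips; chirality, which distinguishes the two chiro enantiomers, is not modeled:

```python
# Start from the all-equatorial scyllo chair: six 'e' positions.
SCYLLO = ("e",) * 6

# The mnemonic's inversion sequences: none (scyllo itself), then
# 1; 1,2; 1,3; 1,4; 1,2,3; 1,2,4; and finally 1,3,5.
SEQUENCES = [(), (1,), (1, 2), (1, 3), (1, 4), (1, 2, 3), (1, 2, 4), (1, 3, 5)]

def invert(ring, positions):
    """Flip equatorial <-> axial at the given 1-based ring positions."""
    ring = list(ring)
    for p in positions:
        ring[p - 1] = "a" if ring[p - 1] == "e" else "e"
    return tuple(ring)

patterns = [invert(SCYLLO, seq) for seq in SEQUENCES]
```

Each of the eight resulting patterns corresponds, in the mnemonic's order, to one configurational prefix (scyllo, meso/myo, chiro, epi, neo, muco, allo, cis); the clockwise-vs-counterclockwise choice the text describes supplies the ninth, enantiomeric, chiro form.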

  12. Influence of Femoral Component Design on Retrograde Femoral Nail Starting Point.

    PubMed

    Service, Benjamin C; Kang, William; Turnbull, Nathan; Langford, Joshua; Haidukewych, George; Koval, Kenneth J

    2015-10-01

    Our experience with retrograde femoral nailing after periprosthetic distal femur fractures was that femoral components with deep trochlear grooves posteriorly displace the nail entry point resulting in recurvatum deformity. This study evaluated the influence of distal femoral prosthetic design on the starting point. One hundred lateral knee images were examined. The distal edge of Blumensaat's line was used to create a ratio of its location compared with the maximum anteroposterior condylar width called the starting point ratio (SPR). Femoral trials from 6 manufacturers were analyzed to determine the location of simulated nail position in the sagittal plane compared with the maximum anteroposterior prosthetic width. These measurements were used to create a ratio, the femoral component ratio (FCR). The FCR was compared with the SPR to determine if a femoral component would be at risk for retrograde nail starting point posterior to the Blumensaat's line. The mean SPR was 0.392 ± 0.03, and the mean FCR was 0.416 ± 0.05, which was significantly greater (P = 0.003). The mean FCR was 0.444 ± 0.06 for the cruciate retaining (CR) trials and was 0.393 ± 0.04 for the posterior stabilized trials; this difference was significant (P < 0.001). The FCR for the femoral trials studied was significantly greater than the SPR for native knees and was significantly greater for CR femoral components compared with posterior stabilized components. These findings demonstrate that many total knee prostheses, particularly CR designs, are at risk for a starting point posterior to Blumensaat's line.

  13. Decidable and undecidable arithmetic functions in actin filament networks

    NASA Astrophysics Data System (ADS)

    Schumann, Andrew

    2018-01-01

    The plasmodium of Physarum polycephalum is very sensitive to its environment, and reacts to stimuli with appropriate motions. Both the sensory and motor stages of these reactions are explained by hydrodynamic processes, based on fluid dynamics, with the participation of actin filament networks. This paper is devoted to actin filament networks as a computational medium. The point is that actin filaments, with contributions from many other proteins like myosin, are sensitive to extracellular stimuli (attractants as well as repellents), and appear and disappear at different places in the cell to change aspects of the cell structure—e.g. its shape. By assembling and disassembling actin filaments, some unicellular organisms, like Amoeba proteus, can move in response to various stimuli. As a result, these organisms can be considered a simple reversible logic gate—extracellular signals being its inputs and motions its outputs. In this way, we can implement various logic gates on amoeboid behaviours. These networks can embody arithmetic functions within p-adic valued logic. Furthermore, within these networks we can define the so-called diagonalization for deducing undecidable arithmetic functions.

  14. [Interprofessional collaboration in the Family Health Strategy: implications for the provision of care and work management].

    PubMed

    Matuda, Caroline Guinoza; Pinto, Nicanor Rodrigues da Silva; Martins, Cleide Lavieri; Frazão, Paulo

    2015-08-01

    Interprofessional collaboration is seen as a resource for tackling problems in models of care and in the health workforce. The aim of this study was to understand the perceptions of shared work and interprofessional collaboration among professionals who work in primary health care. A qualitative study was conducted in São Paulo city. In-depth interviews were performed with professionals from distinct categories who worked in the Family Health Strategy and the Support Center for Family Health. The results highlighted the empirical categories 'professional interaction' and 'production goals'. The forms of interaction, the role of specialized matrix support and the perspective in which production goals are perceived by the professionals pointed to tensions between traditional professional logic and collaboration logic. The results also revealed tensions between a model based on specialized procedures and a more collaborative model centered on the health needs of families and of the community. The sharing of responsibilities and practices, changes in the logic of patient referral to specialized services and inadequate organizational arrangements remain major challenges to the integration of interprofessional collaboration for the development of new care practices.

  15. The structure, logic of operation and distinctive features of the system of triggers and counting signals formation for gamma-telescope GAMMA-400

    NASA Astrophysics Data System (ADS)

    Topchiev, N. P.; Galper, A. M.; Arkhangelskiy, A. I.; Arkhangelskaja, I. V.; Kheymits, M. D.; Suchkov, S. I.; Yurkin, Y. T.

    2017-01-01

    The scientific project GAMMA-400 (Gamma Astronomical Multifunctional Modular Apparatus) belongs to the new generation of space observatories intended to perform an indirect search for signatures of dark matter in cosmic-ray fluxes, as well as measurements of the characteristics of diffuse gamma-ray emission, gamma-rays from the Sun during periods of solar activity, gamma-ray bursts, extended and point gamma-ray sources, and electron/positron and cosmic-ray nuclei fluxes up to the TeV energy region. The GAMMA-400 gamma-ray telescope represents the core of the scientific complex. Its system of triggers and counting signals formation constitutes a pipelined processor structure which collects data from the gamma-ray telescope subsystems and produces the summary information used in forming the trigger decision for each event. The system design is based on state-of-the-art reconfigurable logic devices and fast data links. The basic structure, logic of operation and distinctive features of the system are presented.
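    The per-event trigger decision described above can be caricatured as a coincidence/veto combination of subsystem flags. A toy sketch only: the actual GAMMA-400 trigger logic runs in reconfigurable hardware and is far more involved, and the flag names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class EventFlags:
    tracker_hit: bool        # signal in the converter-tracker
    calorimeter_hit: bool    # energy deposit above threshold
    anticoincidence: bool    # hit in the anticoincidence shield (charged-particle veto)

def master_trigger(flags: EventFlags) -> bool:
    # Require tracker+calorimeter coincidence and no charged-particle veto
    return flags.tracker_hit and flags.calorimeter_hit and not flags.anticoincidence

gamma_like = master_trigger(EventFlags(True, True, False))   # accepted
charged_bg = master_trigger(EventFlags(True, True, True))    # vetoed
```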

  16. Representation of molecular structure using quantum topology with inductive logic programming in structure-activity relationships.

    PubMed

    Buttingsrud, Bård; Ryeng, Einar; King, Ross D; Alsberg, Bjørn K

    2006-06-01

    The requirement of aligning each individual molecule in a data set severely limits the type of molecules which can be analysed with traditional structure-activity relationship (SAR) methods. A method which solves this problem by using relations between objects is inductive logic programming (ILP). Another advantage of this methodology is its ability to include background knowledge as first-order logic. However, previous molecular ILP representations have not been effective in describing the electronic structure of molecules. We present a more unified and comprehensive representation based on Richard Bader's quantum topological atoms in molecules (AIM) theory, in which critical points in the electron density are connected through a network. AIM theory provides a wealth of chemical information about individual atoms and their bond connections, enabling a more flexible and chemically relevant representation. To obtain even more relevant rules with higher coverage, we apply manual postprocessing and interpretation of ILP rules. We have tested the usefulness of the new representation in SAR modelling on classifying compounds of low/high mutagenicity and on a set of factor Xa inhibitors of high and low affinity.

  17. ESMPy and OpenClimateGIS: Python Interfaces for High Performance Grid Remapping and Geospatial Dataset Manipulation

    NASA Astrophysics Data System (ADS)

    O'Kuinghttons, Ryan; Koziol, Benjamin; Oehmke, Robert; DeLuca, Cecelia; Theurich, Gerhard; Li, Peggy; Jacob, Joseph

    2016-04-01

    The Earth System Modeling Framework (ESMF) Python interface (ESMPy) supports analysis and visualization in Earth system modeling codes by providing access to a variety of tools for data manipulation. ESMPy started as a Python interface to the ESMF grid remapping package, which provides mature, robust, high-performance and scalable grid remapping between 2D and 3D logically rectangular and unstructured grids and sets of unconnected data points. ESMPy now also interfaces with OpenClimateGIS (OCGIS), a package that performs subsetting, reformatting, and computational operations on climate datasets. ESMPy exposes a subset of the ESMF grid remapping utilities, including bilinear, finite element patch recovery, first-order conservative, and nearest neighbor grid remapping methods. There are also options to ignore unmapped destination points, mask points on source and destination grids, and handle grid structure in the polar regions. Grid remapping on the sphere takes place in 3D Cartesian space, so the pole problem is not an issue as it can be with other grid remapping software. Remapping can be done between any combination of 2D and 3D logically rectangular and unstructured grids with overlapping domains. Grid pairs where one side of the regridding is represented by an appropriate set of unconnected data points, as is commonly found with observational data streams, are also supported. There is a developing interoperability layer between ESMPy and OCGIS, a pure Python, open source package designed for geospatial manipulation, subsetting, and computation on climate datasets stored in local NetCDF files or accessible remotely via the OPeNDAP protocol. Interfacing with OCGIS has brought GIS-like functionality to ESMPy (e.g. subsetting, coordinate transformations) as well as additional file output formats (e.g. CSV, ESRI Shapefile). ESMPy is distinguished by its strong emphasis on open source, community governance, and distributed development. The user base has grown quickly, and the package is integrating with several other software tools and frameworks, including the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), Iris, PyFerret, cf-python, and the Community Surface Dynamics Modeling System (CSDMS). ESMPy's minimum requirements are Python 2.6, NumPy 1.6.1 and an ESMF installation. Optional dependencies include NetCDF and the OCGIS-related dependencies GDAL, Shapely, and Fiona. ESMPy is regression tested nightly, and supported on Darwin, Linux and Cray systems with the GNU compiler suite and MPI communications. OCGIS is supported on Linux, and also undergoes nightly regression testing. Both packages are installable from Anaconda channels. Upcoming development plans for ESMPy include a higher-order conservative grid remapping method. Future OCGIS development will focus on mesh and location stream interoperability and streamlined access to ESMPy's MPI implementation.
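    The bilinear remapping method mentioned above interpolates each destination value from the surrounding source points. A self-contained NumPy illustration of that idea (not ESMPy's actual API, which operates on ESMF Grid and Field objects):

```python
import numpy as np

def bilinear_remap(src, x, y):
    """Bilinearly interpolate a 2D field `src` (defined on integer grid
    coordinates) at fractional destination coordinates (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    # Weighted combination of the four surrounding source points
    return ((1 - dx) * (1 - dy) * src[y0, x0] +
            dx * (1 - dy) * src[y0, x0 + 1] +
            (1 - dx) * dy * src[y0 + 1, x0] +
            dx * dy * src[y0 + 1, x0 + 1])

field = np.array([[0.0, 1.0],
                  [2.0, 3.0]])
val = bilinear_remap(field, 0.5, 0.5)  # midpoint of the 2x2 cell -> 1.5
```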

  18. C-5M Super Galaxy Utilization with Joint Precision Airdrop System

    DTIC Science & Technology

    2012-03-22

    System Notes FireFly 900-2,200 Steerable Parafoil Screamer 500-2,200 Steerable Parafoil w/additional chutes to slow touchdown Dragonfly...setting . This initial feasible solution provides the Nonlinear Program algorithm a starting point to continue its calculations. The model continues...provides the NLP with a starting point of 1. This provides the NLP algorithm a point within the feasible region to begin its calculations in an attempt

  19. COREnet: The Fusion of Social Network Analysis and Target Audience Analysis

    DTIC Science & Technology

    2014-12-01

    misunderstanding of MISO (PSYOP) not only in doctrine, but also in practice, is easily understood. MISO has a long history of name changes starting ...TAA does not strictly adhere to any particular theory; studying dynamics is a valid starting point for analysis, and is naturally congruent with the...provides a starting point for further analysis. The PO is a pre-approved objective by the Office of the Secretary of Defense (OSD) (JP 3–53, 2003, V-1

  20. The complex spine: the multidimensional system of causal pathways for low-back disorders.

    PubMed

    Marras, William S

    2012-12-01

    The aim of this study was to examine the logic behind our knowledge of causal pathways for low-back problems. Low-back pain and low-back disorders (LBDs) continue to represent the major musculoskeletal risk problem in the workplace, with the prevalence and costs of such disorders increasing over time. In recent years, there has been much criticism of the ability of ergonomics methods to control the risk of LBDs. The method was a logical assessment of the systems logic associated with our understanding and prevention of LBDs. Current spine loading and spine tolerance research efforts are bringing the field to the point where there is a better systems understanding of the inextricable link between the musculoskeletal system and the cognitive system. Loading is influenced by physical environment factors as well as mental demands, whereas tolerances are defined by both physical tissue tolerance and biochemically based tissue sensitivities to pain. However, the logic used in many low-back risk assessment tools may be overly simplistic, given what is understood about causal pathways. Current tools typically assess only load or position, in a very cursory manner. Efforts must address both the physical environment and the cognitive environment of the worker if one is to reliably lower the risk of low-back problems. This systems representation of LBD development may serve as a guide to identify gaps in our understanding of LBDs.

  1. An Alternative Starting Point for Fraction Instruction

    ERIC Educational Resources Information Center

    Cortina, José Luis; Višnovská, Jana; Zúñiga, Claudia

    2015-01-01

    We analyze the results of a study conducted for the purpose of assessing the viability of an alternative starting point for teaching fractions. The alternative is based on Freudenthal's insights about fraction as comparison. It involves portraying the entities that unit fractions quantify as always being apart from the reference unit, instead of…

  2. 33 CFR 165.915 - Security zones; Captain of the Port Detroit.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the starting point at 41°58.4′ N, 083°15.4′ W (NAD 83). (2) Davis Besse Nuclear Power Station. All... to the starting point 41°36.1′ N, 083°04.7′ W (NAD 83). (b) Regulations. (1) In accordance with § 165...

  3. Length, Area, and Volume--or, Just Geometry Really

    ERIC Educational Resources Information Center

    Ball, Derek

    2012-01-01

    Many delegates at "conference" relish the opportunity, and the space, to "do some mathematics". Opportunity and space help to make the experience memorable, but how often is the quality of the starting point, or question acknowledged? Here is a set of starting points or problems that invite the reader to "do some mathematics". Deliberately, no…

  4. Model-based approaches to deal with detectability: a comment on Hutto (2016)

    USGS Publications Warehouse

    Marques, Tiago A.; Thomas, Len; Kéry, Marc; Buckland, Steve T.; Borchers, David L.; Rexstad, Eric; Fewster, Rachel M.; MacKenzie, Darryl I.; Royle, Andy; Guillera-Arroita, Gurutzeta; Handel, Colleen M.; Pavlacky, David C.; Camp, Richard J.

    2017-01-01

    In a recent paper, Hutto (2016a) challenges the need to account for detectability when interpreting data from point counts. A number of issues with model-based approaches to deal with detectability are presented, and an alternative suggested: surveying an area around each point over which detectability is assumed certain. The article contains a number of false claims and errors of logic, and we address these here. We provide suggestions about appropriate uses of distance sampling and occupancy modeling, arising from an intersection of design- and model-based inference.

  5. The influence of plan modulation on the interplay effect in VMAT liver SBRT treatments.

    PubMed

    Hubley, Emily; Pierce, Greg

    2017-08-01

    Volumetric modulated arc therapy (VMAT) uses multileaf collimator (MLC) leaves, gantry speed, and dose rate to modulate beam fluence, producing the highly conformal doses required for liver radiotherapy. When targets that move with respiration are treated with a dynamic fluence, there exists the possibility of interplay between the target and leaf motions. This study employs a novel motion simulation technique to determine whether VMAT liver SBRT plans with increased MLC leaf modulation are more susceptible to dosimetric differences in the GTV due to interplay effects. For ten liver SBRT patients, two VMAT plans with different amounts of MLC leaf modulation were created. Motion was simulated using a random starting point in the respiratory cycle for each fraction. To isolate the interplay effect, motion was also simulated using four specific starting points in the respiratory cycle. The dosimetric differences caused by different starting points were examined by subtracting the resultant dose distributions from each other. When motion was simulated using random starting points for each fraction, or with specific starting points, there were significantly larger dose differences in the GTV (maximum 100 cGy) for more highly modulated plans, but the overall plan quality was not adversely affected. Plans with more MLC leaf modulation are more susceptible to interplay effects, but the dose differences in the GTV are clinically negligible in magnitude. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
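    The fraction-by-fraction randomization of the respiratory starting point can be sketched in a few lines. A toy 1-D model only, assuming a sinusoidal respiratory trace; the period and amplitude are invented, and this is not the authors' simulation code:

```python
import math
import random

PERIOD_S = 4.0       # assumed respiratory period
AMPLITUDE_MM = 10.0  # assumed target excursion amplitude

def target_position(t: float, phase_offset: float) -> float:
    """Superior-inferior target position (mm) at time t for a given starting phase."""
    return AMPLITUDE_MM * math.sin(2 * math.pi * (t / PERIOD_S) + phase_offset)

random.seed(0)
n_fractions = 3
# One random starting point in the respiratory cycle per fraction
phases = [random.uniform(0, 2 * math.pi) for _ in range(n_fractions)]
# Motion trace each fraction superimposes on the dynamic delivery
traces = [[target_position(0.1 * i, p) for i in range(40)] for p in phases]
```

Isolating the interplay effect then amounts to recomputing dose for a few fixed phases and subtracting the resulting dose distributions.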

  6. Economical launching and accelerating control strategy for a single-shaft parallel hybrid electric bus

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Song, Jian; Li, Liang; Li, Shengbo; Cao, Dongpu

    2016-08-01

    This paper presents an economical launching and accelerating mode, comprising four ordered phases: pure electric driving, clutch engagement and engine start-up, engine active charging, and engine driving, which can adapt to alternating conditions and improve the fuel economy of a hybrid electric bus (HEB) in typical city-bus driving scenarios. By utilizing the fast response of the electric motor (EM), an adaptive controller for the EM is designed to meet the power demand during the pure electric driving, engine starting, and engine active charging modes. Concurrently, the smoothness issue induced by the sequential mode transitions is solved with coordinated control logic for the engine, EM and clutch. Simulation and experimental results show that the proposed launching and accelerating mode and its control methods are effective in improving fuel economy and ensuring drivability during fast transitions between the operation modes of the HEB.
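    The four ordered phases can be viewed as a simple state machine stepping through the launch sequence. A hedged sketch: the state names follow the abstract, but the transition signals and thresholds are invented for illustration, not taken from the paper.

```python
# Toy mode sequencer for a single-shaft parallel HEB launch/acceleration.
MODES = ["pure_electric", "clutch_engage_engine_start",
         "engine_active_charging", "engine_driving"]

def next_mode(mode: str, speed_kph: float, soc: float, engine_running: bool) -> str:
    if mode == "pure_electric" and speed_kph > 20.0:           # assumed speed threshold
        return "clutch_engage_engine_start"
    if mode == "clutch_engage_engine_start" and engine_running:
        return "engine_active_charging"
    if mode == "engine_active_charging" and soc > 0.6:         # assumed SOC target
        return "engine_driving"
    return mode  # otherwise stay in the current phase

mode = "pure_electric"
mode = next_mode(mode, speed_kph=25.0, soc=0.4, engine_running=False)
mode = next_mode(mode, speed_kph=30.0, soc=0.4, engine_running=True)
```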

  7. 33 CFR 165.911 - Security Zones; Captain of the Port Buffalo Zone.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...′ N, 076°23.2′ W; and then following the shoreline back to the point of origin (NAD 83). (2) Ginna... 43°16.7′ N, 077°18.3′ W; then following the shoreline back to starting point (NAD 83). (3) Moses... northwest to 45°00.36′ N, 074°48.16′ W; then northeast back to the starting point (NAD 83). (4) Long Sault...

  8. Transnational Discourses of Knowledge and Learning in Professional Work: Examples from Computer Engineering

    ERIC Educational Resources Information Center

    Nerland, Monika

    2010-01-01

    Taking a Foucauldian framework as its point of departure, this paper discusses how transnational discourses of knowledge and learning operate in the profession of computer engineering and form a certain logic through which modes of being an engineer are regulated. Both the knowledge domain of computer engineering and its related labour market is…

  9. Where Have All the Teachers Gone: A Case Study in Transitioning

    ERIC Educational Resources Information Center

    Potgieter, Amanda S.

    2016-01-01

    This paper reports the autobiographical narrative of Mr. L., as case-in-point example of the thresholding moment and the process of transitioning into Academia. The role of the lecturer-mentor and the multi-logic space that facilitates the process are clarified. I use hermeneutic phenomenology and interpretivism as methodological tools. This ex…

  10. The Use of Visual Approach in Teaching and Learning the Epsilon-Delta Definition of Continuity

    ERIC Educational Resources Information Center

    Pešic, Duška; Pešic, Aleksandar

    2015-01-01

    In this paper we introduce a new collaborative technique in teaching and learning the epsilon-delta definition of a continuous function at the point from its domain, which connects mathematical logic, combinatorics and calculus. This collaborative approach provides an opportunity for mathematical high school students to engage in mathematical…

  11. Dialogism: Feminist Revision of Argumentative Writing Instruction

    ERIC Educational Resources Information Center

    Kerkhoff, Shea N.

    2015-01-01

    According to the Common Core State Standards (CCSS), to be college and career ready students must be able to construct logical arguments using facts and reason. A feminist perspective provides an alternative point of view on the value of argumentation. The purpose of this study was to question the theories that frame the current CCSS 9-12 English…

  12. 75 FR 54219 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... microprocessor-based systems. NJT proposes to verify and test signal locking systems controlled by microprocessor... interlocking, controlled points and other locations are controlled by solid-state vital microprocessor-based... components for control of both vital and non-vital functions. The logic does not change once a microprocessor...

  13. Audit Cultures, Commodification, and Class and Race Strategies in Education

    ERIC Educational Resources Information Center

    Apple, Michael W.

    2005-01-01

    The author discusses some of the ways in which certain elements of conservative modernization have had an impact on education at multiple levels. He points to the growth of commodifying logics and the audit culture that accompanies them. In the process, he highlights a number of dangers currently being faced. However, he urges us not to assume…

  14. Mitigation of adverse interactions in pairs of clinical practice guidelines using constraint logic programming.

    PubMed

    Wilk, Szymon; Michalowski, Wojtek; Michalowski, Martin; Farion, Ken; Hing, Marisela Mainegra; Mohapatra, Subhra

    2013-04-01

    We propose a new method to mitigate (identify and address) adverse interactions (drug-drug or drug-disease) that occur when a patient with comorbid diseases is managed according to two concurrently applied clinical practice guidelines (CPGs). A lack of methods to facilitate the concurrent application of CPGs severely limits their use in clinical practice and the development of such methods is one of the grand challenges for clinical decision support. The proposed method responds to this challenge. We introduce and formally define logical models of CPGs and other related concepts, and develop the mitigation algorithm that operates on these concepts. In the algorithm we combine domain knowledge encoded as interaction and revision operators using the constraint logic programming (CLP) paradigm. The operators characterize adverse interactions and describe revisions to logical models required to address these interactions, while CLP allows us to efficiently solve the logical models - a solution represents a feasible therapy that may be safely applied to a patient. The mitigation algorithm accepts two CPGs and available (likely incomplete) patient information. It reports whether mitigation has been successful or not, and on success it gives a feasible therapy and points at identified interactions (if any) together with the revisions that address them. Thus, we consider the mitigation algorithm as an alerting tool to support a physician in the concurrent application of CPGs that can be implemented as a component of a clinical decision support system. We illustrate our method in the context of two clinical scenarios involving a patient with duodenal ulcer who experiences an episode of transient ischemic attack. Copyright © 2013 Elsevier Inc. All rights reserved.
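    At its core, the mitigation step is a constraint-satisfaction search over candidate combined therapies, with interaction and revision operators pruning and repairing infeasible combinations. A much-simplified pure-Python sketch (the drug names, interaction, and revision are hypothetical, and real CLP systems are far more general):

```python
from itertools import product

# Candidate actions each guideline allows for its condition (hypothetical)
cpg_ulcer = ["PPI", "H2_blocker"]
cpg_tia = ["aspirin", "clopidogrel"]

# Interaction operator: pairs constituting an adverse interaction (hypothetical)
adverse = {("H2_blocker", "clopidogrel")}

# Revision operator: substitution that addresses a known interaction (hypothetical)
revisions = {("H2_blocker", "clopidogrel"): ("PPI", "clopidogrel")}

def mitigate(options_a, options_b):
    """Return a feasible combined therapy, applying a revision where needed."""
    for pair in product(options_a, options_b):
        if pair in adverse:
            pair = revisions.get(pair)  # try to repair the interaction
        if pair is not None and pair not in adverse:
            return pair
    return None  # mitigation failed: no safe combined therapy found

therapy = mitigate(cpg_ulcer, cpg_tia)
```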

  15. Context recognition for a hyperintensional inference machine

    NASA Astrophysics Data System (ADS)

    Duží, Marie; Fait, Michal; Menšík, Marek

    2017-07-01

    The goal of this paper is to introduce the algorithm of context recognition in the functional programming language TIL-Script, which is a necessary condition for the implementation of the TIL-Script inference machine. The TIL-Script language is an operationally isomorphic syntactic variant of Tichý's Transparent Intensional Logic (TIL). From the formal point of view, TIL is a hyperintensional, partial, typed λ-calculus with procedural semantics. Hyperintensional, because TIL λ-terms denote procedures (defined as TIL constructions) producing set-theoretic functions rather than the functions themselves; partial, because TIL is a logic of partial functions; and typed, because all the entities of TIL ontology, including constructions, receive a type within a ramified hierarchy of types. These features make it possible to distinguish three levels of abstraction at which TIL constructions operate. At the highest hyperintensional level the object to operate on is a construction (though a higher-order construction is needed to present this lower-order construction as an object of predication). At the middle intensional level the object to operate on is the function presented, or constructed, by a construction, while at the lowest extensional level the object to operate on is the value (if any) of the presented function. Thus a necessary condition for the development of an inference machine for the TIL-Script language is recognizing a context in which a construction occurs, namely extensional, intensional and hyperintensional context, in order to determine the type of an argument at which a given inference rule can be properly applied. As a result, our logic does not flout logical rules of extensional logic, which makes it possible to develop a hyperintensional inference machine for the TIL-Script language.

  16. The Personal Responsibility and Work Opportunity Act of 1996: What Welfare Reform Means for Head Start.

    ERIC Educational Resources Information Center

    Shuell, Julie; Hanna, Jeff; Oterlei, Jannell; Kariger, Patricia

    This National Head Start Association booklet outlines the main provisions of the Personal Responsibility and Work Opportunity Act and describes how it may affect local Head Start Programs. The document is intended to serve as a starting point for local programs, parents, administrators and policy workers to discuss and plan how Head Start will…

  17. Controls in new construction reactors-factory testing of the non-safety portion of the Lungmen nuclear power plant distributed control system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Y. S.; Dick, J. W.; Tetirick, C. W.

    2006-07-01

    The construction permit for Taipower's Lungmen Nuclear Units 1 and 2, two ABWR plants, was issued on March 17, 1999 [1]. The construction of these units is progressing actively at site. The digital I&C system supplied by GE, designated as the Distributed Control and Information System (DCIS) in this project, is being implemented primarily at one vendor facility. To ensure the reliability, safety and availability of the DCIS, the whole DCIS must be comprehensively tested in the factory. This article describes the test requirements and acceptance criteria for functional testing of the Non-Safety Distributed Control and Information System (DCIS) for Taiwan Power's Lungmen Units 1 and 2. GE selected Invensys as the equipment supplier for this Non-Safety portion of the DCIS. The DCIS of the Lungmen Units is a physically distributed control system. Field transmitters are connected to hard I/O terminal inputs on the Invensys I/A system. Once the signal is digitized on FBMs (Field Bus Modules) in Remote Multiplexing Units (RMUs), the signal is passed into an integrated control software environment. Control is based on the concept of compounds and blocks, where each compound is a logical collection of blocks that performs a control function. Each point, identified by control compound and block, can be individually used throughout the DCIS by referencing its unique name. In the Lungmen Project, control logic and HSI (Human System Interface) requirements are divided into individual process systems called MPLs (Master Parts List). Higher-level Plant Computer System (PCS) algorithms access control compounds and blocks in these MPLs to develop functions. The test requirements and acceptance criteria for the DCIS of the Lungmen Project are divided into three general categories of verification, which in turn are divided into several specific tests:
    1. DCIS System Physical Checks. a) RMU Test - confirms that the hard I/O database is installed on the DCIS and is physically addressed correctly; a signal is injected at each DCIS hard I/O terminal boundary and correct receipt on the DCIS is verified. b) DCIS Network Stress Test - confirms system viability under extreme load conditions beyond anything the plant could experience, including alarm showers that emulate plant upsets. c) System Hardware Configuration Test - typical checks of the DCIS hardware, including fault reporting, redundancy, and normal computer functions. d) Performance Test - confirms high-level hardware and system capability attributes such as control system time response, 'cold start' reboots, and processor loading. e) Electromagnetic compatibility tests - verify the electromagnetic viability of the system and individual components.
    2. Implementation of Plant Systems and Systems Integration. a) MPL Logic Tests - confirm that control functions implemented in system logic perform as expected, and that parameters are passed correctly between system control schemes. b) Data Link (Gateway) Tests - verify third-party interfaces to the DCIS. c) Plant Computer System (PCS) Logic Tests - verify that higher-level PCS logic is correctly implemented, performs as expected, and that parameters are passed correctly between PCS sub-systems and MPL systems; the PCS sub-systems include the Safety Parameter Display System, Historian, Alarms, and maintenance monitoring.
    3. Unique Third Party Interfacing and Integration into the DCIS. The controls for Automatic Power Regulation, Feedwater, and Recirculation Flow are implemented on third-party Triple Modular Redundant (TMR) hardware, which is connected to the DCIS and tested via full simulation. The TMR system is supplied by GE Control Solutions on the Mark VIe platform. (authors)

  18. Boolean Modeling of Neural Systems with Point-Process Inputs and Outputs. Part I: Theory and Simulations

    PubMed Central

    Marmarelis, Vasilis Z.; Zanos, Theodoros P.; Berger, Theodore W.

    2010-01-01

    This paper presents a new modeling approach for neural systems with point-process (spike) inputs and outputs that utilizes Boolean operators (i.e. modulo 2 multiplication and addition that correspond to the logical AND and OR operations respectively, as well as the AND_NOT logical operation representing inhibitory effects). The form of the employed mathematical models is akin to a “Boolean-Volterra” model that contains the product terms of all relevant input lags in a hierarchical order, where terms of order higher than first represent nonlinear interactions among the various lagged values of each input point-process or among lagged values of various inputs (if multiple inputs exist) as they reflect on the output. The coefficients of this Boolean-Volterra model are also binary variables that indicate the presence or absence of the respective term in each specific model/system. Simulations are used to explore the properties of such models and the feasibility of their accurate estimation from short data-records in the presence of noise (i.e. spurious spikes). The results demonstrate the feasibility of obtaining reliable estimates of such models, with excitatory and inhibitory terms, in the presence of considerable noise (spurious spikes) in the outputs and/or the inputs in a computationally efficient manner. A pilot application of this approach to an actual neural system is presented in the companion paper (Part II). PMID:19517238
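    A minimal sketch of such a Boolean-Volterra predictor for a single spike-train input. The particular terms, lags, and binary coefficients below are invented for illustration; the paper's models select terms by estimation from data:

```python
# Toy second-order Boolean-Volterra model for one binary input sequence.
# The output is an OR over included terms, each term an AND of lagged
# inputs, with an AND_NOT term modeling an inhibitory effect.
def predict(x, n):
    def lag(k):  # x(n-k); treat samples before the record start as no spike
        return x[n - k] if n - k >= 0 else 0

    term1 = lag(1)                 # first-order (single-lag) excitatory term
    term2 = lag(2) and lag(3)      # second-order interaction term
    inhib = lag(0)                 # inhibitory term applied via AND_NOT
    return int((term1 or term2) and not inhib)

x = [0, 1, 0, 1, 1, 0, 0, 1]
y = [predict(x, n) for n in range(len(x))]
```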

  19. Forbidding Undesirable Agreements: A Dependence-Based Approach to the Regulation of Multi-agent Systems

    NASA Astrophysics Data System (ADS)

    Turrini, Paolo; Grossi, Davide; Broersen, Jan; Meyer, John-Jules Ch.

    The purpose of this contribution is to set up a language for evaluating the results of concerted action among interdependent agents against predetermined properties that we can recognise as desirable from a deontic point of view. Unlike the standard view in logics for reasoning about coalitionally rational action, the capacity of a set of agents to take a rational decision is restricted to what we call agreements, which can be seen as solution concepts for a dependence structure present in a certain game. The language identifies in concise terms those agreements that accord or conflict with the desirable properties set up at the outset, and reveals, by logical reasoning, a variety of structural properties of this type of collective action.

  20. Failure detection in high-performance clusters and computers using chaotic map computations

    DOEpatents

    Rao, Nageswara S.

    2015-09-01

    A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10^18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
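    The underlying idea is that identical deterministic chaotic computations on healthy nodes must produce identical trajectories, so chaos amplifies even a tiny computational error into a detectable divergence. A toy software analogue of that comparison (not the patented implementation):

```python
def logistic_trajectory(x0: float, steps: int, fault_at: int = -1):
    """Iterate the chaotic logistic map x -> 4x(1-x); optionally inject a fault."""
    x, traj = x0, []
    for i in range(steps):
        if i == fault_at:
            x += 1e-12  # simulate a tiny computational error on a faulty node
        x = 4.0 * x * (1.0 - x)
        traj.append(x)
    return traj

reference = logistic_trajectory(0.2, 60)          # trusted trajectory
healthy = logistic_trajectory(0.2, 60)            # identical on a healthy node
faulty = logistic_trajectory(0.2, 60, fault_at=10)

def diverged(a, b, tol=1e-6):
    """Flag a failure when two trajectories differ beyond tolerance."""
    return any(abs(u - v) > tol for u, v in zip(a, b))
```

The chaotic map's sensitivity to initial conditions guarantees that the injected 1e-12 perturbation grows to macroscopic size within the 60 iterations, while the healthy node's trajectory matches the reference exactly.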

  1. Weight for Stephen Finlay.

    PubMed

    Evers, Daan

    2013-04-01

    According to Stephen Finlay, 'A ought to X' means that X-ing is more conducive to contextually salient ends than relevant alternatives. This in turn is analysed in terms of probability. I show why this theory of 'ought' is hard to square with a theory of a reason's weight which could explain why 'A ought to X' logically entails that the balance of reasons favours that A X-es. I develop two theories of weight to illustrate my point. I first look at the prospects of a theory of weight based on expected utility theory. I then suggest a simpler theory. Although neither allows that 'A ought to X' logically entails that the balance of reasons favours that A X-es, this price may be accepted. For there remains a strong pragmatic relation between these claims.

  2. Japanese project aims at supercomputer that executes 10 gflops

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burskey, D.

    1984-05-03

    Dubbed Supercom by its multicompany design team, the decade-long project's goal is an engineering supercomputer that can execute 10 billion floating-point operations/s, about 20 times faster than today's supercomputers. The project, guided by Japan's Ministry of International Trade and Industry (MITI) and the Agency of Industrial Science and Technology, encompasses three parallel research programs, all aimed at some aspect of the supercomputer. One program should lead to superfast logic and memory circuits, another to a system architecture that will afford the best performance, and the last to the software that will ultimately control the computer. The work on logic and memory chips is based on GaAs circuits, Josephson junction devices, and high-electron-mobility transistor structures. The architecture will involve parallel processing.

  3. An Examination of the Starting Point Approach to Design and Technology

    ERIC Educational Resources Information Center

    Good, Keith; Jarvinen, Esa-Matti

    2007-01-01

    This study examines the Starting Point Approach (SPA) to design and technology, which is intended to maximize creativity while being manageable for the teacher. The purpose of the study was to examine whether the children could do what the approach requires and in particular whether it promoted their innovative thinking. Data were collected during…

  4. The Use of Mixed Methods in Randomized Control Trials

    ERIC Educational Resources Information Center

    White, Howard

    2013-01-01

    Evaluations should be issues driven, not methods driven. The starting point should be priority programs to be evaluated or policies to be tested. From this starting point, a list of evaluation questions is identified. For each evaluation question, the task is to identify the best available method for answering that question. Hence it is likely…

  5. In the Heart of Teaching: A Two-Dimensional Conception of Teachers' Relational Competence

    ERIC Educational Resources Information Center

    Aspelin, Jonas

    2017-01-01

    Research reveals that teachers' relational competence is crucial for successful education. However, the field is still small and largely unexplored, and arguably needs a better and more precise theoretical starting point. This article seeks to help establish such a starting point, aiming to outline a relational framework based on the philosophies…

  6. Computers Don't Byte. A Starting Point for Teachers Using Computers. A Resource Booklet.

    ERIC Educational Resources Information Center

    Lieberman, Michael; And Others

    Designed to provide a starting point for the teacher without computer experience, this booklet deals with both the "how" and the "when" of computers in education. Educational applications described include classroom uses with the student as a passive or an active user and programs for the handicapped; the purpose of computers…

  7. Discrete Event Simulation Model of the Polaris 2.1 Gamma Ray Imaging Radiation Detection Device

    DTIC Science & Technology

    2016-06-01

    …points out China, Pakistan, and India as having a minimalist point of view with regard to nuclear weapons. For those in favor of this approach, he does… Referee event graph: the referee listens to the starts and stops of the mover and determines whether or not the Polaris has entered or exited the… The following are highlighted in Figure 17: Polaris start point; Polaris end point; Polaris original waypoints; Polaris ad hoc waypoints; number of…

  8. The Recovery of TOMS-EP

    NASA Technical Reports Server (NTRS)

    Robertson, Brent; Sabelhaus, Phil; Mendenhall, Todd; Fesq, Lorraine

    1998-01-01

    On December 13th 1998, the Total Ozone Mapping Spectrometer - Earth Probe (TOMS-EP) spacecraft experienced a Single Event Upset which caused the system to reconfigure and enter a Safe Mode. This incident occurred two and a half years after the launch of the spacecraft, which was designed for a two-year life. A combination of factors, including changes in component behavior due to age and extended use, very unfortunate initial conditions, and the safe mode processing logic, prevented the spacecraft from entering its nominal long-term storage mode. The spacecraft remained in a high fuel consumption mode designed for temporary use. By the time the onboard fuel was exhausted, the spacecraft was Sun pointing in a high-rate flat spin. Although the uncontrolled spacecraft was initially in a power and thermal safe orientation, it would not stay in this state indefinitely due to a slow precession of its momentum vector. A recovery team was immediately assembled to determine if there was time to develop a method of despinning the vehicle and returning it to normal science data collection. A three-stage plan was developed that used the onboard magnetic torque rods as actuators. The first stage was designed to reduce the high spin rate to within the linear range of the gyros. The second stage transitioned the spacecraft from sun pointing to orbit reference pointing. The final stage returned the spacecraft to normal science operation. The entire recovery scenario was simulated with a wide range of initial conditions to establish the expected behavior. The recovery sequence was started on December 28th 1998 and completed by December 31st. TOMS-EP was successfully returned to science operations by the beginning of 1999. This paper describes the TOMS-EP Safe Mode design and the factors which led to the spacecraft anomaly and loss of fuel. The recovery and simulation efforts are described. Flight data are presented which show the performance of the spacecraft during its return to science. Finally, lessons learned are presented.

  9. Fuzzy Logic-based Intelligent Scheme for Enhancing QoS of Vertical Handover Decision in Vehicular Ad-hoc Networks

    NASA Astrophysics Data System (ADS)

    Azzali, F.; Ghazali, O.; Omar, M. H.

    2017-08-01

    The design of next generation networks across various technologies under the “Anywhere, Anytime” paradigm offers seamless connectivity across different coverage areas. A conventional algorithm such as the RSSThreshold algorithm, which uses only the received signal strength (RSS) as a metric, degrades handover performance in terms of handover latency, delay, packet loss, and handover failure probability. Moreover, the RSS-based algorithm is suitable only for horizontal handover decisions; it cannot adequately assess quality of service (QoS) for vertical handover decisions in advanced technologies. In next generation networks, a vertical handover can be initiated based on the user’s convenience or choice rather than connectivity reasons. This study proposes a vertical handover decision algorithm based on Fuzzy Logic (FL) to increase QoS performance in heterogeneous vehicular ad-hoc networks (VANET). The study uses network simulator 2.29 (NS 2.29) along with a mobility traffic network and generator to implement simulation scenarios and topologies, which helps the simulation achieve a realistic VANET mobility scenario so that the required analysis of QoS performance in vertical handover can be conducted. The proposed Fuzzy Logic algorithm improves on the conventional (RSSThreshold) algorithm in the average percentage of handover QoS, achieving 20%, 21%, and 13% improvements in handover latency, delay, and packet loss, respectively. This is achieved by triggering a process in layers two and three that enhances the handover performance.
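    A fuzzy-logic handover decision of this kind can be sketched minimally as follows; the membership functions, rule base, and crisp inputs below are invented for illustration and are not the rule base used in the study:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def handover_score(rss_dbm, latency_ms):
        """Fuzzy desirability of a vertical handover, in [0, 1]."""
        weak   = tri(rss_dbm, -100, -90, -75)   # current signal is weak
        strong = tri(rss_dbm, -85, -60, -40)    # current signal is strong
        high   = tri(latency_ms, 50, 150, 250)  # latency is high
        low    = tri(latency_ms, 0, 20, 80)     # latency is low
        # Rules: (weak OR high latency) -> hand over; (strong AND low) -> stay.
        fire_handover = max(weak, high)
        fire_stay = min(strong, low)
        total = fire_handover + fire_stay
        # Weighted-average (Sugeno-style) defuzzification to a crisp score.
        return 0.5 if total == 0.0 else fire_handover / total

    print(handover_score(-95, 200))  # 1.0: trigger the handover
    print(handover_score(-55, 10))   # 0.0: stay on the current network
    ```

    Weighing several QoS metrics instead of an RSS threshold alone is what lets such a controller avoid unnecessary handovers when the signal dips briefly but latency remains good.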

  10. Training Software in Artificial-Intelligence Computing Techniques

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna; Rogstad, Eric; Chalfant, Eugene

    2005-01-01

    The Artificial Intelligence (AI) Toolkit is a computer program for training scientists, engineers, and university students in three soft-computing techniques (fuzzy logic, neural networks, and genetic algorithms) used in artificial-intelligence applications. The program provides an easily understandable tutorial interface, including an interactive graphical component through which the user can gain hands-on experience in soft-computing techniques applied to realistic example problems. The tutorial provides step-by-step instructions on the workings of soft-computing technology, whereas the hands-on examples allow interaction and reinforcement of the techniques explained throughout the tutorial. In the fuzzy-logic example, a user can interact with a robot and an obstacle course to see how fuzzy logic is used to command a rover traverse from an arbitrary start to the goal location. For the genetic-algorithm example, the problem is to determine the minimum-length path for visiting a user-chosen set of planets in the solar system. For the neural-network example, the problem is to decide, on the basis of input data on physical characteristics, whether a person is a man, woman, or child. The AI Toolkit is compatible with the Windows 95, 98, ME, NT 4.0, 2000, and XP operating systems. A computer having a processor speed of at least 300 MHz and random-access memory of at least 56 MB is recommended for optimal performance. The program can be run on a slower computer having less memory, but some functions may not be executed properly.

  11. Evolving Digital Ecological Networks

    PubMed Central

    Wagner, Aaron P.; Ofria, Charles

    2013-01-01

    “It is hard to realize that the living world as we know it is just one among many possibilities” [1]. Evolving digital ecological networks are webs of interacting, self-replicating, and evolving computer programs (i.e., digital organisms) that experience the same major ecological interactions as biological organisms (e.g., competition, predation, parasitism, and mutualism). Despite being computational, these programs evolve quickly in an open-ended way, and starting from only one or two ancestral organisms, the formation of ecological networks can be observed in real time by tracking interactions between the constantly evolving organism phenotypes. These phenotypes may be defined by combinations of logical computations (hereafter tasks) that digital organisms perform and by expressed behaviors that have evolved. The types and outcomes of interactions between phenotypes are determined by task overlap for logic-defined phenotypes and by responses to encounters in the case of behavioral phenotypes. Biologists use these evolving networks to study active and fundamental topics within evolutionary ecology (e.g., the extent to which the architecture of multispecies networks shapes coevolutionary outcomes, and the processes involved). PMID:23533370

  12. Fuzzy Logic Based Autonomous Parallel Parking System with Kalman Filtering

    NASA Astrophysics Data System (ADS)

    Panomruttanarug, Benjamas; Higuchi, Kohji

    This paper presents an emulation of fuzzy logic control schemes for an autonomous parallel parking system in a backward maneuver. Four infrared sensors send distance data to a microcontroller for generating an obstacle-free parking path. Two of them, mounted on the front and rear wheels on the parking side, are used as inputs to the fuzzy rules to calculate a proper steering angle while backing. The other two, attached to the front and rear ends, serve to avoid collision with other cars along the parking space. At the end of the parking process, the vehicle will be in line with other parked cars and positioned in the middle of the free space. The fuzzy rules are designed based upon a wall-following process. Performance of the infrared sensors is improved using Kalman filtering. The design method uses extra information from ultrasonic sensors: starting from a 1-D state-space model of the ultrasonic sensor, the infrared reading is used as the measurement that updates the predicted values. Experimental results demonstrate the effectiveness of the sensor improvement.
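    The fusion step can be illustrated with a minimal 1-D Kalman filter in which the ultrasonic model supplies the prediction and the infrared sensor supplies the measurement; all noise variances and readings below are invented for illustration:

    ```python
    def kalman_1d(pred_us, meas_ir, q=0.05, r=0.4):
        """Fuse ultrasonic-model predictions with infrared measurements.

        q: process-noise variance (trust in the ultrasonic motion model)
        r: measurement-noise variance (infrared sensor noise)
        """
        p = 1.0  # initial estimate covariance
        fused = []
        for pred, meas in zip(pred_us, meas_ir):
            x, p = pred, p + q          # predict from the ultrasonic model
            k = p / (p + r)             # Kalman gain
            x = x + k * (meas - x)      # update with the infrared reading
            p = (1.0 - k) * p
            fused.append(x)
        return fused

    us = [1.00, 0.95, 0.90, 0.85, 0.80]  # model-predicted wall distance (m)
    ir = [1.08, 0.91, 0.93, 0.82, 0.79]  # noisy infrared readings (m)
    print([round(v, 3) for v in kalman_1d(us, ir)])
    ```

    Each fused estimate lies between the model prediction and the sensor reading, with the Kalman gain deciding how far to trust the noisy infrared measurement over the motion model.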

  13. Directional learning, but no spatial mapping by rats performing a navigational task in an inverted orientation

    PubMed Central

    Valerio, Stephane; Clark, Benjamin J.; Chan, Jeremy H. M.; Frost, Carlton P.; Harris, Mark J.; Taube, Jeffrey S.

    2010-01-01

    Previous studies have identified neurons throughout the rat limbic system that fire as a function of the animal's head direction (HD). This HD signal is particularly robust when rats locomote in the horizontal and vertical planes, but is severely attenuated when locomoting upside-down (Calton & Taube, 2005). Given the hypothesis that the HD signal represents an animal's sense of its directional heading, we evaluated whether rats could accurately navigate in an inverted (upside-down) orientation. The task required the animals to find an escape hole while locomoting inverted on a circular platform suspended from the ceiling. In experiment 1, Long-Evans rats were trained to navigate to the escape hole by locomoting from either one or four start points. Interestingly, no animals from the 4-start point group reached criterion, even after 30 days of training. Animals in the 1-start point group reached criterion after about 6 training sessions. In Experiment 2, probe tests revealed that animals navigating from either 1- or 2-start points utilized distal visual landmarks for accurate orientation. However, subsequent probe tests revealed that their performance was markedly attenuated when required to navigate to the escape hole from a novel starting point. This absence of flexibility while navigating upside-down was confirmed in experiment 3 where we show that the rats do not learn to reach a place, but instead learn separate trajectories to the target hole(s). Based on these results we argue that inverted navigation primarily involves a simple directional strategy based on visual landmarks. PMID:20109566

  14. Molecular Library Synthesis Using Complex Substrates: Expanding the Framework of Triterpenoids

    PubMed Central

    Ignatenko, Vasily A.; Han, Yong; Tochtrop, Gregory P.

    2013-01-01

    The remodelling of a natural product core framework by means of diversity-oriented synthesis (DOS) is a valuable approach to access diverse, biologically relevant chemical space and to overcome the limitations of combinatorial-type compounds. Here we provide proof of principle and a thorough conformational analysis for a general strategy whereby the inherent complexity of a starting material is used to define the regio- and stereochemical outcomes of reactions in chemical library construction. This is in contrast to the traditional DOS logic employing reaction development and catalysis to drive library diversity. PMID:23245400

  15. Making sense out of spinal cord somatosensory development

    PubMed Central

    Seal, Rebecca P.

    2016-01-01

    The spinal cord integrates and relays somatosensory input, leading to complex motor responses. Research over the past couple of decades has identified transcription factor networks that function during development to define and instruct the generation of diverse neuronal populations within the spinal cord. A number of studies have now started to connect these developmentally defined populations with their roles in somatosensory circuits. Here, we review our current understanding of how neuronal diversity in the dorsal spinal cord is generated and we discuss the logic underlying how these neurons form the basis of somatosensory circuits. PMID:27702783

  16. InGaAs/InAlAs Double Quantum Wells as Starting Structures for Quantum Logic Gates

    NASA Astrophysics Data System (ADS)

    Marchewka, M.; Sheregii, E. M.

    2011-12-01

    The detection of both symmetric and anti-symmetric electron states in DQWs by an optical method is described in this paper. Values of the symmetric and anti-symmetric splitting (SAS-gap) determined in this way are used for interpretation of the beating effect in the SdH oscillations observed at low temperatures in the external magnetic field. SAS-splitting of electron states in DQWs clearly exists at room temperature and electrons in symmetric and anti-symmetric states have different statistics so these states can be identified in electron transport.

  17. Designing a Feasibility Study: A Starting Point for Considering New Management Initiatives for Working Parents.

    ERIC Educational Resources Information Center

    Friedman, Dana E.

    This brief paper was prepared as a starting point for employers considering the adoption of a new management initiative for working parents. It is not an exhaustive outline of all considerations in the decision-making process, nor does it provide solutions to all the known pitfalls. It does, however, suggest the potential scope and complexity of…

  18. An Investigation of Starting Point Preferences in Human Performance on Traveling Salesman Problems

    ERIC Educational Resources Information Center

    MacGregor, James N.

    2014-01-01

    Previous studies have shown that people start traveling sales problem tours significantly more often from boundary than from interior nodes. There are a number of possible reasons for such a tendency: first, it may arise as a direct result of the processes involved in tour construction; second, boundary points may be perceptually more salient than…

  19. Screening the Medicines for Malaria Venture Pathogen Box across Multiple Pathogens Reclassifies Starting Points for Open-Source Drug Discovery

    PubMed Central

    Sykes, Melissa L.; Jones, Amy J.; Shelper, Todd B.; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E.

    2017-01-01

    Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection comprises compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases, and ChEMBL numbers with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. PMID:28674055

  20. Screening the Medicines for Malaria Venture Pathogen Box across Multiple Pathogens Reclassifies Starting Points for Open-Source Drug Discovery.

    PubMed

    Duffy, Sandra; Sykes, Melissa L; Jones, Amy J; Shelper, Todd B; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E; Avery, Vicky M

    2017-09-01

    Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection comprises compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases, and ChEMBL numbers with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. Copyright © 2017 Duffy et al.

  1. JumpStart III Final Report.

    ERIC Educational Resources Information Center

    Cohen, Arthur M.; Brawer, Florence B.; Kozeracki, Carol A.

    This final report for the JumpStart III program presents a summary of the entrepreneurship training programs developed by each of the four JumpStart III partners selected in March 1997. Grants for the colleges totaled $354,546 over 2 years. The Jumpstart funding has been only a starting point for these and the other 12 Jumpstart partners in…

  2. How I Learned to Love Athletic Recruits

    ERIC Educational Resources Information Center

    Sacken, Mike

    2008-01-01

    The author does not think of himself as a logical candidate to help first-generation college athletes graduate. He is 59 and middle class, not a former athlete or a first-generation college graduate, and obviously not hip. More to the point, he is white and Texas-born, and he attended segregated schools his whole student life. He was even at the…

  3. School Leadership Practices That Promote Effective Whole School Behaviour Management: A Study of Australian Primary Schools

    ERIC Educational Resources Information Center

    De Nobile, John; El Baba, Mariam; London, Teola

    2016-01-01

    When considering the management of student behaviour issues, a substantial body of literature, as well as logical common sense, points to the advantages of whole school policy over the individual efforts of teachers. Less is known, however, about the direct or indirect role school leadership plays in the development of well-implemented whole…

  4. Nuclear Hardness Evaluation Procedures for the Preliminary Assessment of the FLEETSATCOM Attitude and Velocity Control Subsystem.

    DTIC Science & Technology

    1979-12-01

    …processing holding register upset times. Therefore … these transient response times will not significantly affect pointing … SS7-20 … error … change so that the requirements of SS7-20 are not met. Command Logic and Power Switching: Transients shall not cause mode changes to occur in the CEA

  5. Strategies to crack well-guarded markets.

    PubMed

    Bryce, David J; Dyer, Jeffrey H

    2007-05-01

    How can companies break into attractive markets, where incumbents erect many barriers to entry? To answer this question, the authors studied organizations that successfully entered the most profitable industries in the United States between 1990 and 2000. When they dissected the strategies that worked best, one common theme stood out: indirect assault. Smart newcomers don't duplicate existing business models, compete for crowded distribution channels, or go after mainstream customers right away. Instead, they attack the enemy at its weakest points; then gain competitive advantage; and later, if doing so meets their objectives, go after its strongholds. Recent battles in the soft drink industry--where brands, bottling and distribution capabilities, and shelf space are incumbents' main advantages--are a case in point. When Virgin Drinks entered the U.S. cola market in 1998, it advertised heavily and immediately tried to get into the retail outlets that stock the leading brands. Virgin has never garnered more than a 1% share of the market. Red Bull, by contrast, came on the scene in 1997 with a niche product: a carbonated energy drink. The company started by selling the drink at bars and nightclubs. After gaining a loyal following through these outlets, Red Bull elbowed its way into the corner store. In 2005 it enjoyed a 65% share of the $650 million energy drink market. Successful entrants use three basic approaches in their indirect attacks. They leverage their existing assets and resources, reconfigure their value chains, and create niches. These approaches may appear to be simple, but their magic lies in their combination. By mixing and matching them, Bryce and Dyer say, enterprises can defy half a century of economic logic and make money entering highly profitable industries. The authors use Skype, Costco, Skechers, and many other companies to illustrate their argument.

  6. Research in the design of high-performance reconfigurable systems

    NASA Technical Reports Server (NTRS)

    Slotnick, D. L.; Mcewan, S. D.; Spry, A. J.

    1984-01-01

    An initial design for the Bit Processor (BP), referred to in prior reports as the Processing Element or PE, has been completed. Eight BPs, together with their supporting random-access memory, a 64 k x 9 ROM to perform addition, routing logic, and some additional logic, constitute the components of a single stage. An initial stage design is given. Stages may be combined to perform high-speed fixed- or floating-point arithmetic. Stages can be configured into a range of arithmetic modules that includes bit-serial one- or two-dimensional arrays; one- or two-dimensional arrays of fixed- or floating-point processors; and specialized uniprocessors, such as long-word arithmetic units. One to eight BPs represent a likely initial chip level. The stage would then correspond to a first-level pluggable module. As both this project and VLSI CAD/CAM progress, however, it is expected that the chip level would migrate upward to the stage and, perhaps, ultimately the box level. The BP RAM, consisting of two banks, holds only operands and indices. Programs are at the box (high-level function) and system level. At the system level, initial effort has been concentrated on specifying the tools needed to evaluate design alternatives.

  7. The evolvability of programmable hardware.

    PubMed

    Raman, Karthik; Wagner, Andreas

    2011-02-06

    In biological systems, individual phenotypes are typically adopted by multiple genotypes. Examples include protein structure phenotypes, where each structure can be adopted by a myriad individual amino acid sequence genotypes. These genotypes form vast connected 'neutral networks' in genotype space. The size of such neutral networks endows biological systems not only with robustness to genetic change, but also with the ability to evolve a vast number of novel phenotypes that occur near any one neutral network. Whether technological systems can be designed to have similar properties is poorly understood. Here we ask this question for a class of programmable electronic circuits that compute digital logic functions. The functional flexibility of such circuits is important in many applications, including applications of evolutionary principles to circuit design. The functions they compute are at the heart of all digital computation. We explore a vast space of 10^45 logic circuits ('genotypes') and 10^19 logic functions ('phenotypes'). We demonstrate that circuits that compute the same logic function are connected in large neutral networks that span circuit space. Their robustness or fault-tolerance varies very widely. The vicinity of each neutral network contains circuits with a broad range of novel functions. Two circuits computing different functions can usually be converted into one another via few changes in their architecture. These observations show that properties important for the evolvability of biological systems exist in a commercially important class of electronic circuitry. They also point to generic ways to generate fault-tolerant, adaptable and evolvable electronic circuitry.
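    The genotype-to-phenotype mapping underlying these results can be made concrete with a toy encoding; the gate set and circuit sizes here are vastly smaller than the circuit space the study actually explores:

    ```python
    from itertools import product

    GATES = {
        "AND":  lambda a, b: a & b,
        "OR":   lambda a, b: a | b,
        "NAND": lambda a, b: 1 - (a & b),
        "XOR":  lambda a, b: a ^ b,
    }

    def phenotype(genotype):
        """A genotype is a list of (gate, in1, in2) rows; wires 0 and 1 are the
        circuit inputs and each row adds a new wire. The phenotype is the truth
        table computed on the last wire."""
        table = []
        for a, b in product((0, 1), repeat=2):
            wires = [a, b]
            for gate, i, j in genotype:
                wires.append(GATES[gate](wires[i], wires[j]))
            table.append(wires[-1])
        return tuple(table)

    # Two structurally different genotypes with the same phenotype (XOR):
    g1 = [("XOR", 0, 1)]
    g2 = [("NAND", 0, 1), ("OR", 0, 1), ("AND", 2, 3)]
    print(phenotype(g1) == phenotype(g2))  # True: a 'neutral' pair
    # A single-gate mutation of g2 changes the computed function:
    g3 = [("NAND", 0, 1), ("OR", 0, 1), ("OR", 2, 3)]
    print(phenotype(g3) == phenotype(g2))  # False
    ```

    Circuits whose single-change neighbours share their phenotype form the neutral networks discussed above; walking such a network changes architecture, and hence fault-tolerance, without changing function.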

  8. ASICs Approach for the Implementation of a Symmetric Triangular Fuzzy Coprocessor and Its Application to Adaptive Filtering

    NASA Technical Reports Server (NTRS)

    Starks, Scott; Abdel-Hafeez, Saleh; Usevitch, Bryan

    1997-01-01

    This paper discusses the implementation of a fuzzy logic system using an ASICs design approach. The approach is based upon combining the inherent advantages of symmetric triangular membership functions and fuzzy singleton sets to obtain a novel structure for fuzzy logic system application development. The resulting structure utilizes a fuzzy static RAM to store the rule-base and the end-points of the triangular membership functions. This provides advantages over other approaches in which all sampled values of membership functions for all universes must be stored. The fuzzy coprocessor structure implements the fuzzification and defuzzification processes through a two-stage parallel pipeline architecture which is capable of executing complex fuzzy computations in less than 0.55us with an accuracy of more than 95%, thus making it suitable for a wide range of applications. Using the approach presented in this paper, a fuzzy logic rule-base can be directly downloaded via a host processor to an onchip rule-base memory with a size of 64 words. The fuzzy coprocessor's design supports up to 49 rules for seven fuzzy membership functions associated with each of the chip's two input variables. This feature allows designers to create fuzzy logic systems without the need for additional on-board memory. Finally, the paper reports on simulation studies that were conducted for several adaptive filter applications using the least mean squared adaptive algorithm for adjusting the knowledge rule-base.

  9. The evolvability of programmable hardware

    PubMed Central

    Raman, Karthik; Wagner, Andreas

    2011-01-01

    In biological systems, individual phenotypes are typically adopted by multiple genotypes. Examples include protein structure phenotypes, where each structure can be adopted by a myriad individual amino acid sequence genotypes. These genotypes form vast connected ‘neutral networks’ in genotype space. The size of such neutral networks endows biological systems not only with robustness to genetic change, but also with the ability to evolve a vast number of novel phenotypes that occur near any one neutral network. Whether technological systems can be designed to have similar properties is poorly understood. Here we ask this question for a class of programmable electronic circuits that compute digital logic functions. The functional flexibility of such circuits is important in many applications, including applications of evolutionary principles to circuit design. The functions they compute are at the heart of all digital computation. We explore a vast space of 10^45 logic circuits (‘genotypes’) and 10^19 logic functions (‘phenotypes’). We demonstrate that circuits that compute the same logic function are connected in large neutral networks that span circuit space. Their robustness or fault-tolerance varies very widely. The vicinity of each neutral network contains circuits with a broad range of novel functions. Two circuits computing different functions can usually be converted into one another via few changes in their architecture. These observations show that properties important for the evolvability of biological systems exist in a commercially important class of electronic circuitry. They also point to generic ways to generate fault-tolerant, adaptable and evolvable electronic circuitry. PMID:20534598

  10. Organizational Change Around an Older Workforce.

    PubMed

    Moen, Phyllis; Kojola, Erik; Schaefers, Kate

    2017-10-01

    Demographic, economic, political, and technological transformations-including an unprecedented older workforce-are challenging outdated human resource logics and practices. Rising numbers of retirement-eligible Boomers portend a loss of talent, skills, and local knowledge. We investigate organizational responses to this challenge-institutional work disrupting age-graded mindsets and policies. We focus on innovative U.S. organizations in the Minneapolis-St. Paul region in the state of Minnesota, a hub for businesses and nonprofits, conducting in-depth interviews with informants from a purposive sample of 23 for-profit, nonprofit, and government organizations. Drawing on an organizational change theoretical approach, we find organizations are leading change by developing universal policies and practices, not ones intentionally geared to older workers. Both their narratives and strategies-opportunities for greater employee flexibility, training, and scaling back time commitments-suggest deliberate disrupting of established age-graded logics, replacing them with new logics valuing older workers and age-neutral approaches. Organizations in the different sectors studied are fashioning uniform policies regardless of age, exhibiting a parallel reluctance to delineate special policies for older workers. Developing new organizational logics and practices valuing, investing in, and retaining older workers is a key 21st-century business challenge. The flexibility, training, and alternative pathways offered by the innovative organizations we studied point to fruitful possibilities for large-scale replacement of outdated age-biased templates of work, careers, and retirement. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Human action quality evaluation based on fuzzy logic with application in underground coal mining.

    PubMed

    Ionica, Andreea; Leba, Monica

    2015-01-01

    The work system is defined by its components, their roles and the relationships between them. Any work system gravitates around the human resource and the interdependencies between the human factor and the system's other components. Research in this field agrees that the human factor and its actions are difficult to quantify and predict. The objective of this paper is to apply a method of human action evaluation in order to estimate possible risks and prevent possible system faults, both at the human factor level and at the equipment level. To point out the importance of the human factor's influence on all elements of the work system, we propose a fuzzy logic based methodology for the quality evaluation of human actions. This methodology has a multidisciplinary character, as it gathers ideas and methods from quality management, ergonomics, work safety and artificial intelligence. The results presented refer to a work system with a high degree of specificity, namely underground coal mining, and provide a valuable pattern for human-resource risk evaluation. The fuzzy logic evaluation of human actions leads to early detection of possibly dangerous evolutions of the work system and can alert the persons in charge.

  12. Fuzzy logic based robotic controller

    NASA Technical Reports Server (NTRS)

    Attia, F.; Upadhyaya, M.

    1994-01-01

    Existing Proportional-Integral-Derivative (PID) robotic controllers rely on an inverse kinematic model to convert user-specified cartesian trajectory coordinates to joint variables. These joints experience friction, stiction, and gear backlash effects. Due to lack of proper linearization of these effects, modern control theory based on state space methods cannot provide adequate control for robotic systems. In the presence of loads, the dynamic behavior of robotic systems is complex and nonlinear, especially when mathematical models must be evaluated in real time. Fuzzy Logic Control is a fast emerging alternative to conventional control systems in situations where it may not be feasible to formulate an analytical model of the complex system. Fuzzy logic techniques track a user-defined trajectory without requiring the host computer to explicitly solve the nonlinear inverse kinematic equations. The goal is to provide a rule-based approach, which is closer to human reasoning. The approach used expresses end-point error, location of manipulator joints, and proximity to obstacles as fuzzy variables. The resulting decisions are based upon linguistic and non-numerical information. This paper presents an alternative to the conventional robot controller that is independent of computationally intensive kinematic equations. Computer simulation results of this approach, as obtained from a software implementation, are also discussed.

  13. Optically programmable encoder based on light propagation in two-dimensional regular nanoplates.

    PubMed

    Li, Ya; Zhao, Fangyin; Guo, Shuai; Zhang, Yongyou; Niu, Chunhui; Zeng, Ruosheng; Zou, Bingsuo; Zhang, Wensheng; Ding, Kang; Bukhtiar, Arfan; Liu, Ruibin

    2017-04-07

    We design an efficient optically controlled microdevice based on CdSe nanoplates. Two-dimensional CdSe nanoplates exhibit lighting patterns around the edges and can be realized as a new type of optically controlled programmable encoder. The light source is used to excite the nanoplates and control the logical position in vertical pumping mode through the objective lens. At each excitation point in the nanoplates, the preferred light-propagation routes are along the normal direction and perpendicular to the edges, which then emit out from the edges to form a localized lighting section. The intensity distribution around the edges of different nanoplates demonstrates that the small-scale lighting part along the edge, defined as '1', is much stronger than the dark section, defined as '0'. These '0' and '1' are the basic logic elements needed to compose logically functional devices. The observed propagation rules are consistent with theoretical simulations, meaning that the guided-light route in two-dimensional semiconductor nanoplates is regular and predictable. The same situation was also observed in regular CdS nanoplates. Basic theoretical analysis and experiments prove that the guided light and exit position follow rules mainly originating from the shape rather than the material itself.

  14. Building logical qubits in a superconducting quantum computing system

    NASA Astrophysics Data System (ADS)

    Gambetta, Jay M.; Chow, Jerry M.; Steffen, Matthias

    2017-01-01

    The technological world is in the midst of a quantum computing and quantum information revolution. Since Richard Feynman's famous `plenty of room at the bottom' lecture (Feynman, Engineering and Science23, 22 (1960)), hinting at the notion of novel devices employing quantum mechanics, the quantum information community has taken gigantic strides in understanding the potential applications of a quantum computer and laid the foundational requirements for building one. We believe that the next significant step will be to demonstrate a quantum memory, in which a system of interacting qubits stores an encoded logical qubit state longer than the incorporated parts. Here, we describe the important route towards a logical memory with superconducting qubits, employing a rotated version of the surface code. The current status of technology with regards to interconnected superconducting-qubit networks will be described and near-term areas of focus to improve devices will be identified. Overall, the progress in this exciting field has been astounding, but we are at an important turning point, where it will be critical to incorporate engineering solutions with quantum architectural considerations, laying the foundation towards scalable fault-tolerant quantum computers in the near future.

  15. Target-responsive DNA-capped nanocontainer used for fabricating universal detector and performing logic operations

    PubMed Central

    Wu, Li; Ren, Jinsong; Qu, Xiaogang

    2014-01-01

    Nucleic acids have become a powerful tool in nanotechnology because of their controllable diverse conformational transitions and adaptable higher-order nanostructure. Using single-stranded DNA probes as the pore-caps for various target recognition, here we present an ultrasensitive universal electrochemical detection system based on graphene and mesoporous silica, and achieve sensitivity with all of the major classes of analytes and simultaneously realize DNA logic gate operations. The concept is based on locking the pores and preventing the signal-reporter molecules from escaping, with release triggered by a target-induced conformational change of the tailored DNA caps. The coupling of the ‘waking up’ gatekeeper with highly specific biochemical recognition is an innovative strategy for the detection of various targets, able to compete with classical methods which need expensive instrumentation and sophisticated experimental operations. The present study introduces a new electrochemical signal-amplification concept and also adds a new dimension to the function of graphene-mesoporous materials hybrids as multifunctional nanoscale logic devices. More importantly, the development of this approach would spur further advances in important areas, such as point-of-care diagnostics or detection of specific biological contaminations, and hold promise for use in field analysis. PMID:25249622

  16. The foundation of Piaget's theories: mental and physical action.

    PubMed

    Beilin, H; Fireman, G

    1999-01-01

    Piaget's late theory of action and action implication was the realization of a long history of development. A review of that history shows the central place of action in all of his theoretical assertions, despite the waxing and waning of other important features of his theories. Action was said to be the primary source of knowledge with perception and language in secondary roles. Action is for the most part not only organized but there is logic in action. Action, which is at first physical, becomes internalized and transformed into mental action and mental representation, largely in the development of the symbolic or semiotic function in the sensorimotor period. A number of alternative theories of cognitive development place primary emphasis on mental representation. Piaget provided it with an important place as well, but subordinated it to mental action in the form of operations. In this, as Russell claims, he paralleled Schopenhauer's distinction between representation and will. Piaget's theory of action was intimately related to the gradual development of intentionality in childhood. Intentions were tied to actions by way of the conscious awareness of goals and the means to achieve them. Mental action, following the sensorimotor period, was limited in its logical form to semilogical or one-way functions. These forms were said by Piaget to lack logical reversibility, which was achieved only in the sixth or seventh year, in concrete operations. Mental action was not to be fully realized until the development of formal operations, with hypothetical reasoning, in adolescence, according to the classical Piagetian formulation. This view of the child's logical development, which relied heavily on truth-table (extensional) logic, underwent a number of changes. First from the addition of other logics: category theory and the theory of functions among them. In his last theory, however, an even more radical change occurred. With the collaboration of R. 
Garcia, he proposed a logic of meanings that would require a recasting of his earlier truth-table-based operatory logic that he claimed explained the development of logical thought and problem solving. The new logic of meanings, influenced by Anderson and Belnap's (1975) logic of entailment, placed new emphasis on inferential processes in the sensorimotor period, introduced protological forms in the actions of the very young child, and proposed that knowledge has an inferential dimension. The consequence was that the late theory shifted emphasis to intentional (qualitative) logic and meaning from the earlier extensional (quantitative) logic and truth testing. The profound changes in Piaget's late theory require a serious reevaluation of Piaget's entire corpus of research and theory; a task which is yet to be done. Seen in a new light, the late theory is much closer to intellectual currents associated with hermeneutic and semiotic traditions in their concern with meaning and interpretation and less, if at all, with truth. This, despite Piaget's couching of the new theory in a logical mode. The late theory added significant new elements to the theory of action and action-implication, and suggests that Piaget's, and his collaborator's, new research data, which were interpreted within the new theoretical framework, require corroboration and review. The question as to whether Piaget's assertions are at root metaphorical and lack psychological reality, which has followed his theories from their earliest days, arises as well with the assertions of the late theory. Possibly, even more so, since even a limited historical review of his theories points to a considerable concurrence between changes in the fundamental assumptions of his theories and intellectual currents of the times. In hindsight, Piaget's theories appear as "works in progress," down to his last theory. Yet, even in the end, he charted the direction of possible further progress.

  17. Intra-organizational dynamics as drivers of entrepreneurship among physicians and managers in hospitals of western countries.

    PubMed

    Koelewijn, Wout T; Ehrenhard, Michel L; Groen, Aard J; van Harten, Wim H

    2012-09-01

    During the past decade, entrepreneurship in the healthcare sector has become increasingly important. The aging society, the continuous stream of innovative technologies and the growth of chronic illnesses are jeopardizing the sustainability of healthcare systems. In response, many European governments started to reform healthcare during the 1990s, replacing the traditional logic of medical professionalism with business-like logics. This trend is expected to continue as many governments will have to reduce their healthcare spending in response to the current growing budget deficits. In the process, entrepreneurship is being stimulated, yet little is known about intra-hospital dynamics leading to entrepreneurial behavior. The purpose of this article is to review existing literature concerning the influence of intra-organizational dynamics on entrepreneurship among physicians and managers in hospitals of Western countries. Therefore, we conducted a theory-led, systematic review of how intra-organizational dynamics among hospital managers and physicians can influence entrepreneurship. We designed our review using the neo-institutional framework of Greenwood and Hinings (1996). We analyze these dynamics in terms of power dependencies, interest dissatisfaction and value commitments. Our search revealed that physicians' dependence on hospital management has increased along with healthcare reforms and the resulting emphasis on business logics. This has induced various types of responses by physicians. Physicians can be pushed to adopt an entrepreneurial attitude as part of a defensive value commitment toward the business-like healthcare logic, to defend their traditionally dominant position and professional autonomy. In contrast, physicians holding a transformative attitude toward traditional medical professionalism seem more prone to adopt the entrepreneurial elements of business-like healthcare, encouraged by the prospect of increased autonomy and income. 
Interest dissatisfaction and competing value commitments can also stimulate physicians' entrepreneurship and, depending on their relative importance, determine whether it is necessity-based or opportunity-driven. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Applied cartographic communication: map symbolization for atlases.

    USGS Publications Warehouse

    Morrison, J.L.

    1984-01-01

    A detailed investigation of the symbolization used on general-purpose atlas reference maps. It indicates how theories of cartographic communication can be put into practice. Two major points emerge. First, that a logical scheme can be constructed from existing cartographic research and applied to an analysis of the choice of symbolization on a map. Second, the same structure appears to allow the cartographer to specify symbolization as a part of map design. An introductory review of cartographic communication is followed by an analysis of selected maps' use of point, area and line symbols, boundaries, text and colour.-after Author

  19. Dexter: Data Extractor for scanned graphs

    NASA Astrophysics Data System (ADS)

    Demleitner, Markus

    2011-12-01

    The NASA Astrophysics Data System (ADS) now holds 1.3 million scanned pages, containing numerous plots and figures for which the original data sets are lost or inaccessible. The availability of scans of the figures can significantly ease the regeneration of the data sets. For this purpose, the ADS has developed Dexter, a Java applet that supports the user in this process. Dexter's basic functionality is to let the user manually digitize a plot by marking points and defining the coordinate transformation from the logical to the physical coordinate system. Advanced features include automatic identification of axes, tracing lines and finding points matching a template.
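
    The core calibration step can be sketched as follows (a minimal illustration of the idea, not Dexter's actual Java code); for linear axes, two user-marked reference points per axis determine the logical-to-physical mapping:

```python
# Minimal illustration of the calibration idea (not Dexter's actual Java
# code): two user-marked reference points per linear axis fix an affine
# map from pixel ("logical") to data ("physical") coordinates.

def make_axis_map(px0, val0, px1, val1):
    """Map a pixel coordinate to a data value from two reference points."""
    scale = (val1 - val0) / (px1 - px0)
    return lambda px: val0 + (px - px0) * scale

# Suppose the x-axis tick at pixel 100 reads 0.0 and the one at pixel 500
# reads 10.0; y pixel coordinates grow downwards on a scanned page.
x_map = make_axis_map(100, 0.0, 500, 10.0)
y_map = make_axis_map(400, 1.0, 40, 100.0)

marked = (300, 220)  # a digitized point, in pixel coordinates
print(x_map(marked[0]), y_map(marked[1]))  # recovered data coordinates
```

    Logarithmic axes would apply the same map to log-transformed axis values.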

  20. Compensatable muon collider calorimeter with manageable backgrounds

    DOEpatents

    Raja, Rajendran

    2015-02-17

    A method and system for reducing background noise in a particle collider, comprises identifying an interaction point among a plurality of particles within a particle collider associated with a detector element, defining a trigger start time for each of the pixels as the time taken for light to travel from the interaction point to the pixel and a trigger stop time as a selected time after the trigger start time, and collecting only detections that occur between the start trigger time and the stop trigger time in order to thereafter compensate the result from the particle collider to reduce unwanted background detection.
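
    The timing gate described in the claim can be sketched as follows (an illustrative model; units, names and the gate width are assumptions, not taken from the patent):

```python
# Illustrative model of the per-pixel timing gate (names, geometry and the
# nanosecond units are assumptions, not taken from the patent text).

C_M_PER_NS = 0.2998  # speed of light, metres per nanosecond

def travel_time_ns(interaction_point, pixel_pos):
    """Trigger start time: light travel time from interaction point to pixel."""
    dx, dy, dz = (p - q for p, q in zip(pixel_pos, interaction_point))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 / C_M_PER_NS

def in_gate(hit_time_ns, interaction_point, pixel_pos, gate_width_ns):
    """Keep only hits between the trigger start and stop times."""
    start = travel_time_ns(interaction_point, pixel_pos)
    return start <= hit_time_ns <= start + gate_width_ns

ip = (0.0, 0.0, 0.0)
pixel = (3.0, 0.0, 0.0)            # a pixel 3 m from the interaction point
start = travel_time_ns(ip, pixel)  # ~10 ns
print(in_gate(start + 0.5, ip, pixel, gate_width_ns=1.0))  # True: prompt hit
print(in_gate(start + 5.0, ip, pixel, gate_width_ns=1.0))  # False: late background
```

    Hits arriving well after the light-travel start time, the dominant machine background at a muon collider, simply never fall inside the gate.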

  1. A Flexible VHDL Floating Point Module for Control Algorithm Implementation in Space Applications

    NASA Astrophysics Data System (ADS)

    Padierna, A.; Nicoleau, C.; Sanchez, J.; Hidalgo, I.; Elvira, S.

    2012-08-01

    The implementation of control loops for space applications is an area with great potential. However, the characteristics of this kind of system, such as its wide dynamic range of numeric values, make the use of fixed-point algorithms inadequate. Because the generic chips available for the treatment of floating point data are, in general, not qualified to operate in space environments, and the possibility of using an IP module in an FPGA/ASIC qualified for space is not viable due to the small number of logic cells available in these types of devices, it is necessary to find a viable alternative. For these reasons, this paper presents a VHDL Floating Point Module. This proposal allows the design and execution of floating point algorithms with acceptable occupancy, suitable for implementation in FPGAs/ASICs qualified for space environments.

  2. Fragments of Science: Festschrift for Mendel Sachs

    NASA Astrophysics Data System (ADS)

    Ram, Michael

    1999-11-01

    The Table of Contents for the full book PDF is as follows: * Preface * Sketches at a Symposium * For Mendel Sachs * The Constancy of an Angular Point of View * Information-Theoretic Logic and Transformation-Theoretic Logic * The Invention of the Transistor and the Realization of the Hole * Mach's Principle, Newtonian Gravitation, Absolute Space, and Einstein * The Sun, Our Variable Star * The Inconstant Sun: Symbiosis of Time Variations of Sunspots, Atmospheric Radiocarbon, Aurorae, and Tree Ring Growth * Other Worlds * Super-Classical Quantum Mechanics * A Probabilistic Approach to the Phase Problem of X-Ray Crystallography * A Nonlinear Twist on Inertia Gives Unified Electroweak Gravitation * Neutrino Oscillations * On an Incompleteness in the General-Relativistic Description of Gravitation * All Truth is One * Ideas of Physics: Correspondence between Colleagues * The Influence of the Physics and Philosophy of Einstein's Relativity on My Attitudes in Science: An Autobiography

  3. A Mode of Combined ERP and KMS Knowledge Management System Construction

    NASA Astrophysics Data System (ADS)

    Yuena, Kang; Yangeng, Wen; Qun, Zhou

    The core ideas of ERP and knowledge management are quite similar: both deliver the appropriate knowledge (or goods and funds) to the right people (or positions) at the right time. It is therefore reasonable to believe that adding a knowledge management system to ERP will help companies achieve their goals better. This paper compares, at the methodology level, the logical structure of a knowledge management system with that of ERP using Hall's three-dimensional systems framework, and finds that they are very similar in the time, logic and knowledge dimensions. This lays a methodological basis for their simultaneous planning, implementation and application. A knowledge-based multi-agent ERP management system model is then proposed. Finally, the paper describes the process from planning to implementation of a knowledge-management ERP system with multi-agent interaction, from three perspectives: management thinking, software and systems.

  4. Measure Landscape Diversity with Logical Scout Agents

    NASA Astrophysics Data System (ADS)

    Wirth, E.; Szabó, G.; Czinkóczky, A.

    2016-06-01

    The Common Agricultural Policy reform of the EU focuses on three long-term objectives: viable food production, sustainable management of natural resources and climate action with balanced territorial development. To achieve these goals, the EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity. This paper introduces an agent-based method to calculate the potential of landscape diversity. The method tries to capture the nature of heterogeneity using logic and modelling as opposed to traditional statistical reasoning. The outlined Random Walk Scouting algorithm registers the land cover crossings of the scout agents in a Monte Carlo integral. The potential is proportional to the composition and the configuration (spatial character) of the landscape. Based on the measured points, a potential map is derived to give an objective and quantitative basis to the stakeholders (policy makers, farmers).
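
    A minimal version of the Random Walk Scouting idea might look like this (grid size, agent and step counts, and the per-step normalization are illustrative assumptions, not the authors' parameters):

```python
import random

# Minimal sketch of the Random Walk Scouting idea (grid size, agent and
# step counts, and the per-step normalization are illustrative assumptions).

def scout_potential(cover, n_agents=200, n_steps=50, seed=42):
    """Monte Carlo estimate: land-cover boundary crossings per agent step."""
    rng = random.Random(seed)
    rows, cols = len(cover), len(cover[0])
    crossings = 0
    for _ in range(n_agents):
        r, c = rng.randrange(rows), rng.randrange(cols)
        for _ in range(n_steps):
            dr, dc = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols):
                continue  # scout stays put at the map edge
            if cover[nr][nc] != cover[r][c]:
                crossings += 1  # register a land-cover crossing
            r, c = nr, nc
    return crossings / (n_agents * n_steps)

uniform = [[0] * 8 for _ in range(8)]                          # one class
checker = [[(i + j) % 2 for j in range(8)] for i in range(8)]  # maximal mixing
print(scout_potential(uniform), scout_potential(checker))
```

    A uniform raster scores zero; a finely mixed one approaches one crossing per step, so the statistic reflects both composition and spatial configuration.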

  5. ac propulsion system for an electric vehicle

    NASA Technical Reports Server (NTRS)

    Geppert, S.

    1980-01-01

    It is pointed out that dc drives will be the logical choice for current production electric vehicles (EV). However, by the mid-80's, there is a good chance that the price and reliability of suitable high-power semiconductors will allow for a competitive ac system. The driving force behind the ac approach is the induction motor, which has specific advantages relative to a dc shunt or series traction motor. These advantages would be an important factor in the case of a vehicle for which low maintenance characteristics are of primary importance. A description of an EV ac propulsion system is provided, taking into account the logic controller, the inverter, the motor, and a two-speed transmission-differential-axle assembly. The main barrier to the employment of the considered propulsion system in EVs is not technical, but the cost of inverter transistors.

  6. Clocked Magnetostriction-Assisted Spintronic Device Design and Simulation

    NASA Astrophysics Data System (ADS)

    Mousavi Iraei, Rouhollah; Kani, Nickvash; Dutta, Sourav; Nikonov, Dmitri E.; Manipatruni, Sasikanth; Young, Ian A.; Heron, John T.; Naeemi, Azad

    2018-05-01

    We propose a heterostructure device comprised of magnets and piezoelectrics that significantly improves the delay and the energy dissipation of an all-spin logic (ASL) device. This paper studies and models the physics of the device, illustrates its operation, and benchmarks its performance using SPICE simulations. We show that the proposed device maintains low voltage operation, non-reciprocity, non-volatility, cascadability, and thermal reliability of the original ASL device. Moreover, by utilizing the deterministic switching of a magnet from the saddle point of the energy profile, the device is more efficient in terms of energy and delay and is robust to thermal fluctuations. The results of simulations show that compared to ASL devices, the proposed device achieves 21x shorter delay and 27x lower energy dissipation per bit for a 32-bit arithmetic-logic unit (ALU).

  7. Combinatorial and High Throughput Discovery of High Temperature Piezoelectric Ceramics

    DTIC Science & Technology

    2011-10-10

    the known candidate piezoelectric ferroelectric perovskites. Unlike most computational studies on crystal chemistry, where the starting point is some form of electronic structure calculation, we use a data driven approach to initiate our ... experimental measurements reported in the literature. Given that our models are based solely on crystal and electronic structure data and did not

  8. Molecules from Space and the Origin of Life

    NASA Technical Reports Server (NTRS)

    Bernstein Max P.; Sandford, Scott A.; Allamandola, Louis J.; DeVincenzi, Donald (Technical Monitor)

    1999-01-01

    There is a growing consensus among space scientists that frozen molecules from space helped to make the Earth the pleasant place that it is today, and helped life start on Earth, and perhaps elsewhere. The chain of logic that led scientists to posit a connection between extraterrestrial molecules and the origin of life is as follows. 1) The rapidity with which life arose demands that conditions on Earth were conducive to the formation of life very early on. 2) There is reason to believe that comets and meteorites fell on the Earth from its inception. 3) We now know that comets and meteorites are replete with complex organic compounds, some of which resemble those in living systems. 4) Perhaps the input of molecules from comets and meteorites provided crucial constituents to the primordial soup and jump-started life on Earth. 5) These molecules formed out in deep space long before the Earth ever existed, by processes that we can reproduce in the laboratory. 6) The fact that organic molecules are seen by astronomers throughout our galaxy and in others makes it seem likely that they were (and are) available to help start life in other planetary systems.

  9. Re-starting smoking in the postpartum period after receiving a smoking cessation intervention: a systematic review.

    PubMed

    Jones, Matthew; Lewis, Sarah; Parrott, Steve; Wormall, Stephen; Coleman, Tim

    2016-06-01

    In pregnant smoking cessation trial participants, to estimate (1) among women abstinent at the end of pregnancy, the proportion who re-start smoking at time-points afterwards (primary analysis) and (2) among all trial participants, the proportion smoking at the end of pregnancy and at selected time-points during the postpartum period (secondary analysis). Trials identified from two Cochrane reviews plus searches of Medline and EMBASE. Twenty-seven trials were included. The included trials were randomized or quasi-randomized trials of within-pregnancy cessation interventions given to smokers who reported abstinence both at end of pregnancy and at one or more defined time-points after birth. Outcomes were validated biochemically and self-reported continuous abstinence from smoking and 7-day point prevalence abstinence. The primary random-effects meta-analysis used longitudinal data to estimate mean pooled proportions of re-starting smoking; a secondary analysis used cross-sectional data to estimate the mean proportions smoking at different postpartum time-points. Subgroup analyses were performed on biochemically validated abstinence. The pooled mean proportion re-starting at 6 months postpartum was 43% [95% confidence interval (CI) = 16-72%, I² = 96.7%] (11 trials, 571 abstinent women). The pooled mean proportion smoking at the end of pregnancy was 87% (95% CI = 84-90%, I² = 93.2%) and 94% (95% CI = 92-96%, I² = 88%) at 6 months postpartum (23 trials, 9262 trial participants). Findings were similar when using biochemically validated abstinence. In clinical trials of smoking cessation interventions during pregnancy only 13% are abstinent at term. Of these, 43% re-start by 6 months postpartum. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
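
    The pooling step can be sketched with a standard DerSimonian-Laird random-effects model on logit proportions (a generic textbook implementation with made-up data, not the authors' analysis code):

```python
import math

# Generic DerSimonian-Laird random-effects pooling of proportions on the
# logit scale (a textbook sketch with made-up data, not the authors' code).

def pooled_proportion(events, totals):
    """Pool per-trial proportions (events[i] of totals[i]) into a
    random-effects mean proportion."""
    y, v = [], []
    for e, n in zip(events, totals):
        p = (e + 0.5) / (n + 1.0)                  # continuity correction
        y.append(math.log(p / (1 - p)))            # logit transform
        v.append(1 / (e + 0.5) + 1 / (n - e + 0.5))
    w = [1 / vi for vi in v]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # heterogeneity Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-trial variance
    w_re = [1 / (vi + tau2) for vi in v]
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return 1 / (1 + math.exp(-mu))                 # back to a proportion

# Illustrative (invented) data: women re-starting / women abstinent, per trial.
print(round(pooled_proportion([20, 35, 10], [50, 70, 40]), 3))
```

    The I² values quoted in the abstract come from the same Q statistic, as I² = max(0, (Q - (k - 1)) / Q) for k trials.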

  10. Optimal Analysis of Left Atrial Strain by Speckle Tracking Echocardiography: P-wave versus R-wave Trigger.

    PubMed

    Hayashi, Shuji; Yamada, Hirotsugu; Bando, Mika; Saijo, Yoshihito; Nishio, Susumu; Hirata, Yukina; Klein, Allan L; Sata, Masataka

    2015-08-01

    Left atrial (LA) strain analysis using speckle tracking echocardiography is useful for assessing LA function. However, there is no established procedure for this method. Most investigators have determined the electrocardiographic R-wave peak as the starting point for LA strain analysis. To test our hypothesis that P-wave onset should be used as the starting point, we measured LA strain using 2 different starting points and compared the strain values with the corresponding LA volume indices obtained by three-dimensional (3D) echocardiography. We enrolled 78 subjects (61 ± 17 years, 25 males) with and without various cardiac diseases in this study and assessed global longitudinal LA strain by two-dimensional speckle tracking strain echocardiography using EchoPac software. We used either R-wave peak or P-wave onset as the starting point for determining LA strains during the reservoir (Rres, Pres), conduit (Rcon, Pcon), and booster pump (Rpump, Ppump) phases. We determined the maximum, minimum, and preatrial contraction LA volumes, and calculated the LA total, passive, and active emptying fractions using 3D echocardiography. The correlation between Pres and LA total emptying fraction was better than the correlation between Rres and LA total emptying fraction (r = 0.458 vs. 0.308, P = 0.026). Pcon and Ppump exhibited better correlation with the corresponding 3D echocardiographic parameters than Rcon (r = 0.560 vs. 0.479, P = 0.133) and Rpump (r = 0.577 vs. 0.345, P = 0.003), respectively. LA strain in any phase should be analyzed using P-wave onset as the starting point rather than R-wave peak. © 2014, Wiley Periodicals, Inc.

  11. ADS's Dexter Data Extraction Applet

    NASA Astrophysics Data System (ADS)

    Demleitner, M.; Accomazzi, A.; Eichhorn, G.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.

    The NASA Astrophysics Data System (ADS) now holds 1.3 million scanned pages, containing numerous plots and figures for which the original data sets are lost or inaccessible. The availability of scans of the figures can significantly ease the regeneration of the data sets. For this purpose, the ADS has developed Dexter, a Java applet that supports the user in this process. Dexter's basic functionality is to let the user manually digitize a plot by marking points and defining the coordinate transformation from the logical to the physical coordinate system. Advanced features include automatic identification of axes, tracing lines and finding points matching a template. This contribution both describes the operation of Dexter from a user's point of view and discusses some of the architectural issues we faced during implementation.

  12. Adiabatic Edge Channel Transport in a Nanowire Quantum Point Contact Register.

    PubMed

    Heedt, S; Manolescu, A; Nemnes, G A; Prost, W; Schubert, J; Grützmacher, D; Schäpers, Th

    2016-07-13

    We report on a prototype device geometry where a number of quantum point contacts are connected in series in a single quasi-ballistic InAs nanowire. At finite magnetic field the backscattering length is increased up to the micron-scale and the quantum point contacts are connected adiabatically. Hence, several input gates can control the outcome of a ballistic logic operation. The absence of backscattering is explained in terms of selective population of spatially separated edge channels. Evidence is provided by regular Aharonov-Bohm-type conductance oscillations in transverse magnetic fields, in agreement with magnetoconductance calculations. The observation of the Shubnikov-de Haas effect at large magnetic fields corroborates the existence of spatially separated edge channels and provides a new means for nanowire characterization.

  13. Automatic derivation of natural and artificial lineaments from ALS point clouds in floodplains

    NASA Astrophysics Data System (ADS)

    Mandlburger, G.; Briese, C.

    2009-04-01

    Water flow is one of the most important driving forces in geomorphology, and river systems have long shaped our landscapes. With increasing urbanisation, fertile flood plains were more and more cultivated, and the defence of valuable settlement areas by dikes and dams became an important issue. Today, we are dealing with landscapes built up by natural as well as man-made forces. In either case, the general shape of the terrain can be portrayed by lineaments representing discontinuities of the terrain slope. Our contribution, therefore, presents an automatic method for delineating natural and artificial structure lines based on randomly distributed point data with a high density of more than one point/m². Preferably, the last echoes of airborne laser scanning (ALS) point clouds are used, since the laser signal is able to penetrate vegetation through small gaps in the foliage. Alternatively, point clouds from (multi-)image matching can be employed, but poor ground point coverage in vegetated areas is often the limiting factor. Our approach is divided into three main steps: first, potential 2D start segments are detected by analyzing the surface curvature in the vicinity of each data point; second, the detailed 3D progression of each structure line is modelled patch-wise by intersecting surface pairs (e.g. planar patch pairs) based on the detected start segments and by performing line growing; and, finally, post-processing such as line cleaning, smoothing, and networking is carried out. For the initial detection of start segments, a best-fitting two-dimensional polynomial surface (quadric) is computed at each data point based on a set of neighbouring points, from which the minimum and maximum curvatures are derived. Patches showing high maximum and low minimum curvatures indicate linear discontinuities in the surface slope and serve as start segments for the subsequent 3D modelling. 
Based on the 2D location and orientation of the start segments, surface patches can be identified as to the left or the right of the structure line. For each patch pair the intersection line is determined by least squares adjustment. The stochastic model considers the planimetric accuracy of the start segments, and the vertical measurement errors in the data points. A robust estimation approach is embedded in the patch adjustment for elimination of off-terrain ALS last echo points. Starting from an initial patch pair, structure line modelling is continued in forward and backward direction as long as certain thresholds (e.g. minimum surface intersection angles) are fulfilled. In the final post-processing step the resulting line set is cleaned by connecting corresponding line parts, by removing short line strings of minor relevance, and by thinning the resulting line set with respect to a certain approximation tolerance in order to reduce the amount of line data. Thus, interactive human verification and editing is limited to a minimum. In a real-world example structure lines were computed for a section of the river Main (ALS, last echoes, 4 points/m2) demonstrating the high potential of the proposed method with respect to accuracy and completeness. Terrestrial control measurements have confirmed the high accuracy expectations both in planimetry (<0.4m) and height (<0.2m).
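
    The patch-pair intersection at the heart of the second step can be illustrated geometrically. The sketch below intersects two fitted planes to obtain a 3D line; the paper's actual procedure is a least-squares adjustment with a stochastic model and robust outlier rejection, which this minimal version omits:

```python
import numpy as np

def plane_intersection(n1, d1, n2, d2):
    """Intersect the planes n1·x = d1 and n2·x = d2.
    Returns (point_on_line, unit_direction)."""
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    direction = np.cross(n1, n2)
    norm = np.linalg.norm(direction)
    if norm < 1e-12:
        raise ValueError("planes are parallel")
    # Solve for the point on the line closest to the origin:
    # the two plane equations plus direction·x = 0.
    A = np.vstack([n1, n2, direction])
    b = np.array([d1, d2, 0.0])
    point = np.linalg.solve(A, b)
    return point, direction / norm

# Two roof-like patches meeting along a line parallel to the y-axis:
p, d = plane_intersection([1, 0, 1], 1.0, [-1, 0, 1], 1.0)
```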

  14. Inertial Pointing and Positioning System

    NASA Technical Reports Server (NTRS)

    Yee, Robert (Inventor); Robbins, Fred (Inventor)

    1998-01-01

    An inertial pointing and control system and method for pointing to a designated target with known coordinates from a platform to provide accurate position, steering, and command information. The system continuously receives GPS signals and corrects Inertial Navigation System (INS) dead reckoning or drift errors. An INS is mounted directly on a pointing instrument rather than in a remote location on the platform for monitoring the terrestrial position and instrument attitude, and for pointing the instrument at designated celestial targets or ground-based landmarks. As a result, the pointing instrument and the INS move independently in inertial space from the platform since the INS is decoupled from the platform. Another important characteristic of the present system is that selected INS measurements are combined with predefined coordinate transformation equations and control logic algorithms under computer control in order to generate inertial pointing commands to the pointing instrument. More specifically, the computer calculates the desired instrument angles (Phi, Theta, Psi), which are then compared to the Euler angles measured by the instrument-mounted INS, and forms the pointing command error angles as a result of the compared difference.
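
    The final step described above — comparing the desired instrument angles with the INS-measured Euler angles to form pointing command error angles — amounts to a wrapped angle difference. A minimal sketch (the angle conventions are assumed, not taken from the patent):

```python
def pointing_errors(desired, measured):
    """Form pointing command error angles (degrees) as the wrapped
    difference between desired (phi, theta, psi) and measured Euler angles."""
    def wrap(a):  # wrap to the interval [-180, 180)
        return (a + 180.0) % 360.0 - 180.0
    return tuple(wrap(d - m) for d, m in zip(desired, measured))

# A 350° vs 5° heading difference correctly wraps to -15°, not +345°:
err = pointing_errors((10.0, 45.0, 350.0), (12.5, 44.0, 5.0))
```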

  15. Talking about the City: Focus Group Discussions about the City and the Community as Developmental Grounds with Children Aged 5-17

    ERIC Educational Resources Information Center

    Lúcio, Joana

    2015-01-01

    Due to its complexity, size, diversity (internal or external) and meanings, it is possible to analyse the city from various points of view, which are, ultimately, references for the construction of knowledge about the urban space, and the logic of apprehension and appropriation used by individuals and organizations in relation to the place they…

  16. Cyber OODA: A Candidate Model for Cyberspace Engagement

    DTIC Science & Technology

    2010-06-01

    of physical and syntactic limits, or viewing a content-rich PowerPoint presentation on a blackberry (viewable at low resolution, slow speeds, and...21 This is an ideal type definition. VPNs tunnel through traditional networks, but do not exchange information other than travel...instructions. As long as the VPN tunnel remains secure, it is treated as a separate cyberspace. If security breaks down logical cyberspaces will

  17. ICASE Semiannual Report, 1 April 1990 - 30 September 1990

    DTIC Science & Technology

    1990-11-01

    underlies parallel simulation protocols that synchronize based on logical time (all known approaches). This framework describes a sufficient set of...conducted primarily by visiting scientists from universities and from industry, who have resident appointments for limited periods of time, and by consultants...wave equation with point sources and semireflecting impedance boundary conditions. For sources that are piecewise polynomial in time we get a finite

  18. A review of blepharochalasis and other causes of the lax, wrinkled eyelid.

    PubMed

    Held, J L; Schneiderman, P

    1990-02-01

    Cosmetically unappealing lax, wrinkled eyelid skin may result from various processes including connective tissue diseases, natural aging, and blepharochalasis. Since the end-stage eyelid changes due to several different processes are similar, the presence or absence of prior chronic or recurrent eyelid edema is an important differentiating point. We review blepharochalasis and provide a logical approach to its differential diagnosis.

  19. Dialoguing from a Fixed Point: How Aristotle and Pope Francis Illuminate the Promise--and Limits--of Inclusion in Catholic Higher Education

    ERIC Educational Resources Information Center

    Petrusek, Matthew Richard

    2017-01-01

    This article examines the meaning of the word "inclusion" as it relates to Catholic identity in higher education. Noting the widespread presence of this value in the mission statements of Catholic colleges, the article draws on insights from Aristotelian logic and Pope Francis's theology of encounter to argue that inclusion can only be…

  20. Synthetic alienation of microbial organisms by using genetic code engineering: Why and how?

    PubMed

    Kubyshkin, Vladimir; Budisa, Nediljko

    2017-08-01

    The main goal of synthetic biology (SB) is the creation of biodiversity applicable for biotechnological needs, while xenobiology (XB) aims to expand the framework of natural chemistries with non-natural building blocks in living cells to accomplish artificial biodiversity. Protein and proteome engineering, which overcomes the limitation of the canonical amino acid repertoire of 20 (+2) prescribed by the genetic code by using non-canonical amino acids (ncAAs), is one of the main focuses of XB research. Ideally, estranging the genetic code from its current form via systematic introduction of ncAAs should enable the development of bio-containment mechanisms in synthetic cells, potentially endowing them with a "genetic firewall", i.e. an orthogonality which prevents genetic information transfer to natural systems. Despite rapid progress over the past two decades, it is not yet possible to completely alienate an organism so that it would use and maintain different genetic code associations permanently. In order to engineer robust bio-contained life forms, the chemical logic behind the establishment of the amino acid repertoire should be considered. Starting from a recent proposal of Hartman and Smith about the establishment of the genetic code in the RNA world, here the authors map possible biotechnological invasion points for the engineering of bio-contained synthetic cells equipped with non-canonical functionalities. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. The Spatial Vision Tree: A Generic Pattern Recognition Engine- Scientific Foundations, Design Principles, and Preliminary Tree Design

    NASA Technical Reports Server (NTRS)

    Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.

    2010-01-01

    New foundational ideas are used to define a novel approach to generic visual pattern recognition. These ideas proceed from the starting point of the intrinsic equivalence of noise reduction and pattern recognition when noise reduction is taken to its theoretical limit of explicit matched filtering. This led us to think of the logical extension of sparse coding using basis function transforms for both de-noising and pattern recognition to the full pattern specificity of a lexicon of matched filter pattern templates. A key hypothesis is that such a lexicon can be constructed and is, in fact, a generic visual alphabet of spatial vision. Hence it provides a tractable solution for the design of a generic pattern recognition engine. Here we present the key scientific ideas, the basic design principles which emerge from these ideas, and a preliminary design of the Spatial Vision Tree (SVT). The latter is based upon a cryptographic approach whereby we measure a large aggregate estimate of the frequency of occurrence (FOO) for each pattern. These distributions are employed together with Hamming distance criteria to design a two-tier tree. Then using information theory, these same FOO distributions are used to define a precise method for pattern representation. Finally the experimental performance of the preliminary SVT on computer generated test images and complex natural images is assessed.
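
    The use of FOO distributions together with Hamming distance criteria to organize a two-tier tree can be sketched for binary patterns. The grouping rule below (the most frequent unassigned pattern becomes a tier-one prototype; patterns within a Hamming radius form its tier-two group) is a hypothetical simplification for illustration, not the SVT's actual design procedure:

```python
from collections import Counter

def hamming(a, b):
    """Hamming distance between two equal-length binary pattern strings."""
    return sum(x != y for x, y in zip(a, b))

def two_tier(patterns, radius):
    """Greedy tier-one grouping driven by frequency of occurrence (FOO):
    the most frequent unassigned pattern becomes a prototype, and all
    unassigned patterns within `radius` join its tier-two group."""
    foo = Counter(patterns)  # aggregate frequency-of-occurrence estimate
    tiers, assigned = {}, set()
    for p, _ in foo.most_common():
        if p in assigned:
            continue
        group = [q for q in foo if q not in assigned and hamming(p, q) <= radius]
        tiers[p] = group
        assigned.update(group)
    return tiers

tiers = two_tier(["0000", "0001", "0000", "1111", "1110", "0001"], radius=1)
```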

  2. Can data from disparate long-term fish monitoring programs be used to increase our understanding of regional and continental trends in large river assemblages?

    USGS Publications Warehouse

    Counihan, Timothy D.; Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.

    2018-01-01

    Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape-scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric multidimensional scaling (NMDS) ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape-level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers.
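
    An ordination of this kind starts from a site-by-site dissimilarity matrix computed from assemblage data. The abstract does not name the dissimilarity index; Bray-Curtis, a common choice for count data, is assumed in this sketch:

```python
import numpy as np

def bray_curtis(counts):
    """Pairwise Bray-Curtis dissimilarity for a sites x species count matrix."""
    counts = np.asarray(counts, float)
    n = len(counts)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            num = np.abs(counts[i] - counts[j]).sum()   # shared-abundance shortfall
            den = (counts[i] + counts[j]).sum()
            d[i, j] = num / den if den else 0.0
    return d

# Three sites, four species; sites 0 and 1 share most species, site 2 differs:
d = bray_curtis([[10, 0, 3, 1], [8, 1, 2, 0], [0, 12, 0, 9]])
```

The resulting matrix would then be fed to a non-metric MDS routine, which embeds the sites so that the rank order of embedded distances matches the rank order of dissimilarities.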

  3. Effect of compatibilizing agents on the interface and mechanical behaviour of polypropylene/hemp bast fiber biocomposites

    NASA Astrophysics Data System (ADS)

    Boruvka, M.; Lenfeld, P.; Brdlik, P.; Behalek, L.

    2015-07-01

    In recent years the automotive industry has given a lot of attention to biobased polymers that are sustainable and eco-friendly. Nevertheless, fully green composites are currently too expensive for most applications. A viable solution, and a logical starting point in this material revolution, lies in synthetic thermoplastics reinforced with plant-derived biodegradable fibers. Plant fibers (PF's) have the potential to reduce the weight of composite vehicle parts by up to 40% compared with the main automotive composite filler, glass fibers (GF's). Production of GF composites is much more energy-intensive and polluting than the growing, harvesting, and preparation of PF's. The main disadvantage of PF's lies in the combination of a non-polar hydrophobic polymer matrix and polar hydrophilic fibers. This combination creates a poor interface with low adhesion between the two components, which implies poor wettability of the fibres by the polymer matrix and low mechanical properties of the biocomposites. Therefore, specific compatibilizing agents (Struktol SA1012, Fusabond P353, Smart + Luperox) were used in order to enhance compatibility between reinforcement and matrix. In this paper, sets of biocomposite compounds were prepared by twin-screw extrusion, considering different types and weight percentages (wt. %) of compatibilizing agents, hemp bast fibres (HBF's) at a ratio of 20 wt. %, and a polypropylene (PP) THERMOFIL PP E020M matrix. The resulting compounds were then injection molded, and the test samples were characterized by means of scanning electron microscopy (SEM) and mechanical testing.

  4. Can data from disparate long-term fish monitoring programs be used to increase our understanding of regional and continental trends in large river assemblages?

    PubMed Central

    Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian S.; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.

    2018-01-01

    Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape-scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric multidimensional scaling (NMDS) ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape-level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers. PMID:29364953

  5. Can Moral Hazard Be Resolved by Common-Knowledge in S4n-Knowledge?

    NASA Astrophysics Data System (ADS)

    Matsuhisa, Takashi

    This article investigates the relationship between common-knowledge and agreement in multi-agent systems, and applies the agreement result by common-knowledge to the principal-agent model under non-partition information. We treat two problems: (1) how to capture, from an epistemic point of view, the fact that the agents agree on an event or reach consensus on it, and (2) how the agreement theorem can make progress towards settling a moral hazard problem in the principal-agent model under non-partition information. We propose a solution program for the moral hazard in the principal-agent model under non-partition information by common-knowledge. We start from the assumption that the agents have the knowledge structure induced from a reflexive and transitive relation associated with the multi-modal logic S4n. Each agent obtains the membership value of an event under his/her private information, so he/she considers the event as a fuzzy set. Specifically, we consider the situation where the agents commonly know all membership values of the other agents. In this circumstance we show the agreement theorem that consensus on the membership values among all agents can still be guaranteed. Furthermore, under certain assumptions we show that the moral hazard can be resolved in the principal-agent model when all the expected marginal costs are common-knowledge among the principal and agents.
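
    In the standard partition model of knowledge, an event is common knowledge at a state when it contains that state's cell in the meet (the finest common coarsening) of the agents' information partitions; the S4n setting of this article generalizes this to non-partition structures. The partition case can be computed with a small union-find sketch (names are illustrative):

```python
def meet(partitions, states):
    """Finest common coarsening of several partitions (each given as a list
    of disjoint state sets). States linked by any agent's cell are merged."""
    parent = {s: s for s in states}
    def find(s):
        while parent[s] != s:
            parent[s] = parent[parent[s]]  # path halving
            s = parent[s]
        return s
    for part in partitions:
        for cell in part:
            cell = list(cell)
            for s in cell[1:]:
                parent[find(s)] = find(cell[0])
    groups = {}
    for s in states:
        groups.setdefault(find(s), set()).add(s)
    return list(groups.values())

# Agent 1 cannot tell a/b apart; agent 2 cannot tell b/c apart, so the
# meet lumps all three states together: only trivial events are common knowledge.
cells = meet([[{"a", "b"}, {"c"}], [{"a"}, {"b", "c"}]], ["a", "b", "c"])
```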

  6. Errant life, molecular biology, and biopower: Canguilhem, Jacob, and Foucault.

    PubMed

    Talcott, Samuel

    2014-01-01

    This paper considers the theoretical circumstances that urged Michel Foucault to analyse modern societies in terms of biopower. Georges Canguilhem's account of the relations between science and the living forms an essential starting point for Foucault's own later explorations, though the challenges posed by the molecular revolution in biology and François Jacob's history of it allowed Foucault to extend and transform Canguilhem's philosophy of error. Using archival research into his 1955-1956 course on "Science and Error," I show that, for Canguilhem, it is inauthentic to treat a living being as an error, even if living things are capable of making errors in the domain of knowledge. The emergent molecular biology in the 1960s posed a grave challenge, however, since it suggested that individuals could indeed be errors of genetic reproduction. The paper discusses how Canguilhem and Foucault each responded to this by examining, among other texts, their respective reviews of Jacob's The Logic of the Living. For Canguilhem this was an opportunity to reaffirm the creativity of life in the living individual, which is not a thing to be evaluated, but the source of values. For Foucault, drawing on Jacob's work, this was the opportunity to develop a transformed account of valuation by posing biopower as the DNA of society. Despite their disagreements, the paper examines these three authors as different iterations of a historical epistemology attuned to errancy, error, and experimentation.

  7. Deterministic seismic hazard macrozonation of India

    NASA Astrophysics Data System (ADS)

    Kolathayar, Sreevalsa; Sitharam, T. G.; Vipin, K. S.

    2012-10-01

    Earthquakes are known to have occurred in the Indian subcontinent from ancient times. This paper presents the results of a seismic hazard analysis of India (6°-38°N and 68°-98°E) based on the deterministic approach using the latest seismicity data (up to 2010). The hazard analysis was done using two different source models (linear sources and point sources) and 12 well recognized attenuation relations considering the varied tectonic provinces in the region. The earthquake data obtained from different sources were homogenized and declustered, and a total of 27,146 earthquakes of moment magnitude 4 and above were listed in the study area. The seismotectonic map of the study area was prepared by considering the faults, lineaments and the shear zones which are associated with earthquakes of magnitude 4 and above. A new program was developed in MATLAB for smoothing of the point sources. For assessing the seismic hazard, the study area was divided into small grids of size 0.1° × 0.1° (approximately 10 × 10 km), and the hazard parameters were calculated at the center of each of these grid cells by considering all the seismic sources within a radius of 300 to 400 km. Rock-level peak horizontal acceleration (PHA) and spectral accelerations for periods 0.1 and 1 s have been calculated for all the grid points with a deterministic approach using a code written in MATLAB. Epistemic uncertainty in the hazard definition has been tackled within a logic-tree framework considering two types of sources and three attenuation models for each grid point. The hazard evaluation without the logic-tree approach has also been done for comparison of the results. The contour maps showing the spatial variation of hazard values are presented in the paper.
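
    The logic-tree treatment of epistemic uncertainty amounts to a weighted combination of hazard estimates over branches. The sketch below uses a placeholder attenuation function with made-up coefficients — it is not any of the 12 published attenuation relations used in the study — purely to show the branch-weighting mechanics at one grid point:

```python
import math

def pha_estimate(magnitude, distance_km, model):
    """Placeholder attenuation relation (illustrative coefficients only,
    not a published ground-motion model): ln(PHA) grows with magnitude
    and decays with distance."""
    c1, c2 = {"A": (-1.0, 0.30), "B": (-1.2, 0.34), "C": (-0.9, 0.28)}[model]
    return math.exp(c1 + c2 * magnitude - math.log(distance_km + 10.0))

def logic_tree_pha(magnitude, distance_km, weights):
    """Weighted combination over logic-tree branches; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(w * pha_estimate(magnitude, distance_km, m)
               for m, w in weights.items())

# Hypothetical branch weights over three attenuation models:
pha = logic_tree_pha(6.5, 50.0, {"A": 0.4, "B": 0.3, "C": 0.3})
```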

  8. 33 CFR 165.704 - Safety Zone; Tampa Bay, Florida.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... safety zone starts at Tampa Bay Cut “F” Channel from Lighted Buoys “3F” and “4F” and proceeds north ending at Gadsden Point Cut Lighted Buoys “3” and “4”. The safety zone starts again at Gadsden Point Cut Lighted Buoys “7” and “8” and proceeds north through Hillsborough Cut “C”, Port Sutton Entrance Channel...

  9. [Disputes and history of fetal heart monitoring].

    PubMed

    Dueñas-García, Omar Felipe; Díaz-Sotomayor, Maricela

    2011-01-01

    The concept of fetal heart monitoring to determine fetal wellbeing has been employed for almost 300 years, but in the last 50 years it has undergone drastic changes due to the incorporation of electronic devices, which have been a source of controversy since the moment of their introduction. The purpose of this article is to review the key points and controversial moments in the history of cardiotocography.

  10. Starting Points and Destinations: Negotiating Factual and Fictional Pathways: A Response to Gilbourne, Jones and Jordan

    ERIC Educational Resources Information Center

    Wellard, Ian

    2014-01-01

    This paper provides a response to questions which emerged when reading Gilbourne et al's paper, questions it is suggested which compel us to go back to the very heart of what critical social science is (or can be) about. Central to this debate is the extent to which a perceived starting point in any investigation has implications upon the…

  11. Breakout Reconnection Observed by the TESIS EUV Telescope

    NASA Astrophysics Data System (ADS)

    Reva, A. A.; Ulyanov, A. S.; Shestov, S. V.; Kuzin, S. V.

    2016-01-01

    We present experimental evidence of coronal mass ejection (CME) breakout reconnection, observed by the TESIS EUV telescope. The telescope could observe the solar corona up to 2 R⊙ from the Sun center in the Fe 171 Å line. Starting from 2009 April 8, TESIS observed an active region (AR) that had a quadrupolar structure with an X-point 0.5 R⊙ above the photosphere. A magnetic field reconstructed from the Michelson Doppler Imager data also has a multipolar structure with an X-point above the AR. At 21:45 UT on April 9, the loops near the X-point started to move away from each other with a velocity of ≈7 km s-1. At 01:15 UT on April 10, a bright stripe appeared between the loops, and the flux in the GOES 0.5-4 Å channel increased. We interpret the loops’ sideways motion and the bright stripe as evidence of breakout reconnection. At 01:45 UT, the loops below the X-point started to slowly move up. At 15:10 UT, the CME started to accelerate impulsively, while at the same time a flare arcade formed below the CME. After 15:50 UT, the CME moved with constant velocity. The CME evolution precisely followed the breakout model scenario.

  12. PowerPoint Workshop for Teachers[TM].

    ERIC Educational Resources Information Center

    Caughlin, Janet

    This guide for teachers to the Microsoft PowerPoint multimedia presentation program begins with a section that introduces what PowerPoint is and why teachers should use it, Windows 95/98 basics, Macintosh basics, getting started, PowerPoint toolbars, and presentation tips. The next section discusses learning PowerPoint, including creating a…

  13. A fuzzy Petri-net-based mode identification algorithm for fault diagnosis of complex systems

    NASA Astrophysics Data System (ADS)

    Propes, Nicholas C.; Vachtsevanos, George

    2003-08-01

    Complex dynamical systems such as aircraft, manufacturing systems, chillers, motor vehicles, submarines, etc. exhibit continuous and event-driven dynamics. These systems undergo several discrete operating modes from startup to shutdown. For example, a certain shipboard system may be operating at half load or full load or may be at start-up or shutdown. Of particular interest are extreme or "shock" operating conditions, which tend to severely impact fault diagnosis or the progression of a fault leading to a failure. Fault conditions are strongly dependent on the operating mode. Therefore, it is essential that in any diagnostic/prognostic architecture, the operating mode be identified as accurately as possible so that such functions as feature extraction, diagnostics, prognostics, etc. can be correlated with the predominant operating conditions. This paper introduces a mode identification methodology that incorporates both time- and event-driven information about the process. A fuzzy Petri net is used to represent the possible successive mode transitions and to detect events from processed sensor signals signifying a mode change. The operating mode is initialized and verified by analysis of the time-driven dynamics through a fuzzy logic classifier. An evidence combiner module is used to combine the results from both the fuzzy Petri net and the fuzzy logic classifier to determine the mode. Unlike most event-driven mode identifiers, this architecture will provide automatic mode initialization through the fuzzy logic classifier and robustness through the combining of evidence of the two algorithms. The mode identification methodology is applied to an AC Plant typically found as a component of a shipboard system.
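
    The time-driven side of such an architecture — a fuzzy logic classifier over processed sensor features, whose output is then combined with the event-driven belief — can be sketched as follows. The membership functions, mode names, and the weighted-average combiner are illustrative assumptions, not the paper's actual design:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_load(power_kw):
    """Fuzzy membership of each candidate operating mode for one feature.
    Mode names and breakpoints are made up for illustration."""
    return {"startup":   tri(power_kw, -1.0, 0.0, 30.0),
            "half_load": tri(power_kw, 20.0, 50.0, 80.0),
            "full_load": tri(power_kw, 70.0, 100.0, 130.0)}

def combine(evidence_a, evidence_b, w=0.5):
    """Combine two belief dictionaries by weighted average and pick the mode."""
    combined = {m: w * evidence_a[m] + (1.0 - w) * evidence_b[m]
                for m in evidence_a}
    return max(combined, key=combined.get), combined

# Belief from the event-driven fuzzy Petri net (hypothetical values):
petri_belief = {"startup": 0.1, "half_load": 0.8, "full_load": 0.1}
mode, belief = combine(classify_load(55.0), petri_belief)
```

Because the classifier works from instantaneous features, it can initialize the mode estimate on its own, while agreement between the two evidence sources adds robustness.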

  14. Learning a Markov Logic network for supervised gene regulatory network inference

    PubMed Central

    2013-01-01

    Background Gene regulatory network inference remains a challenging problem in systems biology despite the numerous approaches that have been proposed. When substantial knowledge on a gene regulatory network is already available, supervised network inference is appropriate. Such a method builds a binary classifier able to assign a class (Regulation/No regulation) to an ordered pair of genes. Once learnt, the pairwise classifier can be used to predict new regulations. In this work, we explore the framework of Markov Logic Networks (MLN) that combine features of probabilistic graphical models with the expressivity of first-order logic rules. Results We propose to learn a Markov Logic network, i.e. a set of weighted rules that conclude on the predicate “regulates”, starting from a known gene regulatory network involved in the proliferation/differentiation switch of keratinocyte cells, a set of experimental transcriptomic data and various descriptions of genes all encoded into first-order logic. As training data are unbalanced, we use asymmetric bagging to learn a set of MLNs. The prediction of a new regulation can then be obtained by averaging predictions of individual MLNs. As a side contribution, we propose three in silico tests to assess the performance of any pairwise classifier in various network inference tasks on real datasets. A first test consists of measuring the average performance on a balanced edge prediction problem; a second one deals with the ability of the classifier, once enhanced by asymmetric bagging, to update a given network. Finally, our main result concerns a third test that measures the ability of the method to predict regulations with a new set of genes. As expected, MLN, when provided with only numerical discretized gene expression data, does not perform as well as a pairwise SVM in terms of AUPR. 
However, when a more complete description of gene properties is provided by heterogeneous sources, MLN achieves the same performance as a black-box model such as a pairwise SVM while providing relevant insights on the predictions. Conclusions The numerical studies show that MLN achieves very good predictive performance while opening the door to some interpretability of the decisions. Besides the ability to suggest new regulations, such an approach allows to cross-validate experimental data with existing knowledge. PMID:24028533
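
    The asymmetric bagging step can be sketched independently of the MLN learner: each bag keeps every minority-class (Regulation) example, subsamples the majority class to match, trains a base classifier, and the ensemble's predictions are averaged. A trivial nearest-centroid base learner stands in for the MLN here:

```python
import random

def nearest_centroid_fit(pos, neg):
    """Trivial base learner: one centroid per class in feature space."""
    def centroid(rows):
        return [sum(col) / len(rows) for col in zip(*rows)]
    return centroid(pos), centroid(neg)

def predict_score(model, x):
    """Score in [0, 1]: closer to the positive centroid -> higher."""
    cp, cn = model
    dp = sum((a - b) ** 2 for a, b in zip(x, cp))
    dn = sum((a - b) ** 2 for a, b in zip(x, cn))
    return dn / (dp + dn) if dp + dn else 0.5

def asymmetric_bagging(pos, neg, n_bags=25, seed=0):
    """Each bag: all positives + an equal-sized subsample of negatives.
    Returns a predictor that averages the per-bag scores."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_bags):
        bag_neg = rng.sample(neg, k=len(pos))  # subsample the majority class
        models.append(nearest_centroid_fit(pos, bag_neg))
    return lambda x: sum(predict_score(m, x) for m in models) / n_bags

pos = [(1.0, 1.0), (1.2, 0.9)]
neg = [(0.0, 0.1), (0.1, 0.0), (-0.2, 0.1), (0.0, -0.1), (0.2, 0.2)]
score = asymmetric_bagging(pos, neg)((1.1, 1.0))
```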

  15. Learning a Markov Logic network for supervised gene regulatory network inference.

    PubMed

    Brouard, Céline; Vrain, Christel; Dubois, Julie; Castel, David; Debily, Marie-Anne; d'Alché-Buc, Florence

    2013-09-12

    Gene regulatory network inference remains a challenging problem in systems biology despite the numerous approaches that have been proposed. When substantial knowledge on a gene regulatory network is already available, supervised network inference is appropriate. Such a method builds a binary classifier able to assign a class (Regulation/No regulation) to an ordered pair of genes. Once learnt, the pairwise classifier can be used to predict new regulations. In this work, we explore the framework of Markov Logic Networks (MLN) that combine features of probabilistic graphical models with the expressivity of first-order logic rules. We propose to learn a Markov Logic network, i.e. a set of weighted rules that conclude on the predicate "regulates", starting from a known gene regulatory network involved in the proliferation/differentiation switch of keratinocyte cells, a set of experimental transcriptomic data and various descriptions of genes all encoded into first-order logic. As training data are unbalanced, we use asymmetric bagging to learn a set of MLNs. The prediction of a new regulation can then be obtained by averaging predictions of individual MLNs. As a side contribution, we propose three in silico tests to assess the performance of any pairwise classifier in various network inference tasks on real datasets. A first test consists of measuring the average performance on a balanced edge prediction problem; a second one deals with the ability of the classifier, once enhanced by asymmetric bagging, to update a given network. Finally, our main result concerns a third test that measures the ability of the method to predict regulations with a new set of genes. As expected, MLN, when provided with only numerical discretized gene expression data, does not perform as well as a pairwise SVM in terms of AUPR. 
However, when a more complete description of gene properties is provided by heterogeneous sources, MLN achieves the same performance as a black-box model such as a pairwise SVM while providing relevant insights on the predictions. The numerical studies show that MLN achieves very good predictive performance while opening the door to some interpretability of the decisions. Besides the ability to suggest new regulations, such an approach allows to cross-validate experimental data with existing knowledge.
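The asymmetric bagging step can be sketched in a few lines: keep every scarce positive ("Regulation") example in each bag, pair it with an equal-sized bootstrap sample of the abundant negatives, train one model per bag, and average the models' predictions. The sketch below is illustrative, with a trivial threshold learner standing in for the MLN learner; the data and names are invented.

```python
import random

def asymmetric_bagging(pos, neg, train, n_bags=5, seed=0):
    """Train one model per bag; each bag pairs ALL minority (positive)
    examples with an equal-sized bootstrap sample of the majority class."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_bags):
        bag_neg = [rng.choice(neg) for _ in pos]  # bootstrap the majority class
        models.append(train(pos + bag_neg))
    return models

def ensemble_score(models, x):
    """Average the individual models' scores, as done for the MLN ensemble."""
    return sum(m(x) for m in models) / len(models)

# Toy stand-in for an MLN learner: threshold on a single pairwise feature.
def train(bag):
    # bag items are (feature, label); learn the midpoint between class means
    pos_mean = sum(f for f, y in bag if y == 1) / sum(1 for _, y in bag if y == 1)
    neg_mean = sum(f for f, y in bag if y == 0) / sum(1 for _, y in bag if y == 0)
    thr = (pos_mean + neg_mean) / 2
    return lambda x: 1.0 if x > thr else 0.0

pos = [(0.9, 1), (0.8, 1)]                       # scarce "Regulation" pairs
neg = [(0.1, 0), (0.2, 0), (0.3, 0), (0.15, 0)]  # abundant "No regulation"
models = asymmetric_bagging(pos, neg, train)
print(ensemble_score(models, 0.85))  # -> 1.0 (predicted regulation)
```

Averaging over bags keeps every positive example in play while no single model is swamped by negatives.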

  16. A new design approach to innovative spectrometers. Case study: TROPOLITE

    NASA Astrophysics Data System (ADS)

    Volatier, Jean-Baptiste; Baümer, Stefan; Kruizinga, Bob; Vink, Rob

    2014-05-01

    Designing a novel optical system is a nested iterative process. The inner optimization loop, from a starting point to a final system, is already mostly automated. However, this loop is part of a wider loop which is not: it starts with an optical specification and ends with a manufacturability assessment. When designing a new spectrometer with an emphasis on weight and cost, numerous iterations between the optical and mechanical designers are inevitable. The optical designer must then be able to reliably produce optical designs based on new input gained from multidisciplinary studies. This paper presents a procedure that can automatically generate new starting points based on any kind of input or new constraint that might arise. These starting points can then be handed over to a generic optimization routine, making the design tasks extremely efficient. The optical designer's job is then not to design optical systems, but to meta-design a procedure that produces optical systems, paving the way for system-level optimization. We present here this procedure and its application to the design of TROPOLITE, a lightweight push-broom imaging spectrometer.
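The meta-design idea, a generator that maps a specification to a starting point which a generic optimizer then refines, can be sketched as follows. The generator rules, parameter names, and merit function below are hypothetical placeholders, not the paper's actual procedure.

```python
def starting_point(spec):
    """Map an optical specification to an initial parameter vector.
    (Hypothetical rules; a real generator would encode first-order optics.)"""
    f = spec["focal_length_mm"]
    return {"radius1": 2.0 * f, "radius2": -2.0 * f, "thickness": 0.1 * f}

def optimize(params, merit, step=0.5, iters=200):
    """Generic coordinate-descent loop: nudge each parameter and keep
    any change that lowers the merit (error) function."""
    p = dict(params)
    for _ in range(iters):
        for k in p:
            for delta in (+step, -step):
                trial = dict(p)
                trial[k] += delta
                if merit(trial) < merit(p):
                    p = trial
    return p

# Toy merit function standing in for a ray-trace error metric.
def merit(p):
    return ((p["radius1"] - 95.0) ** 2
            + (p["radius2"] + 105.0) ** 2
            + (p["thickness"] - 6.0) ** 2)

start = starting_point({"focal_length_mm": 50.0})
final = optimize(start, merit)
```

With the toy merit above, the loop converges to the exact optimum (radius1 = 95.0, radius2 = -105.0, thickness = 6.0); the point is the separation of concerns: re-running `starting_point` with a new constraint re-seeds the same generic optimizer.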

  17. Promoting Children's Social-Emotional Skills in Preschool Can Enhance Academic and Behavioral Functioning in Kindergarten: Findings from Head Start REDI

    PubMed Central

    Nix, Robert L.; Bierman, Karen L.; Domitrovich, Celene E.; Gill, Sukhdeep

    2013-01-01

    This study examined processes of change associated with the positive preschool and kindergarten outcomes of children who received the Head Start REDI intervention, compared to “usual practice” Head Start. In a large-scale randomized-controlled trial (N = 356 children, 42% African American or Latino, all from low-income families), this study tests the logic model that improving preschool social-emotional skills (e.g., emotion understanding, social problem solving, and positive social behavior) as well as language/emergent literacy skills will promote cross-domain academic and behavioral adjustment after children transition into kindergarten. Validating this logic model, the present study finds that intervention effects on three important kindergarten outcomes (reading achievement, learning engagement, and positive social behavior) were mediated by preschool gains in the proximal social-emotional and language/emergent literacy skills targeted by the REDI intervention. Importantly, preschool gains in social-emotional skills made unique contributions to kindergarten outcomes in reading achievement and learning engagement, even after accounting for the concurrent preschool gains in vocabulary and emergent literacy skills. These findings highlight the importance of fostering at-risk children's social-emotional skills during preschool as a means of promoting school readiness. The REDI (Research-Based, Developmentally-Informed) enrichment intervention was designed to complement and strengthen the impact of existing Head Start programs in the dual domains of language/emergent literacy skills and social-emotional competencies. 
REDI was one of several projects funded by the Interagency School Readiness Consortium, a partnership of four federal agencies (the National Institute of Child Health and Human Development, the Administration for Children and Families, the Assistant Secretary for Planning and Evaluation in the Department of Health and Human Services, and the Office of Special Education and Rehabilitation Services in the Department of Education). The projects funded through this partnership were designed to assess how integrative early interventions for at-risk children could promote learning and development across multiple domains of functioning. In addition, the projects were charged with examining processes of change and identifying mechanisms of action by which the early childhood interventions fostered later school adjustment and academic achievement. This study examined such processes of change, with the goal of documenting hypothesized cross-domain influences on kindergarten outcomes. In particular, this study tested whether gains in the proximal language/emergent literacy and social-emotional competencies targeted during Head Start would mediate the REDI intervention effects on kindergarten academic and behavioral outcomes. In addition, it tested the hypothesis that gains in social-emotional competencies during preschool would make unique contributions to intervention effects on both academic and behavioral outcomes, even after accounting for the effects of preschool gains in language and emergent literacy skills. PMID:24311939

  18. Oral desensitization to milk: how to choose the starting dose!

    PubMed Central

    Mori, Francesca; Pucci, Neri; Rossi, Maria Elisabetta; de Martino, Maurizio; Azzari, Chiara; Novembre, Elio

    2010-01-01

    Mori F, Pucci N, Rossi ME, de Martino M, Azzari C, Novembre E. Oral desensitization to milk: how to choose the starting dose! Pediatr Allergy Immunol 2010: 21: e450–e453. © 2009 John Wiley & Sons A/S A renewed interest in oral desensitization as a treatment for food allergy has been observed in the last few years. We studied a novel method, based on the end-point skin prick test procedure, to establish the starting dose for oral desensitization in a group of 30 children highly allergic to milk. The results (in terms of reactions to the first dose administered) were compared with those in a control group of 20 children who were also allergic to milk. All children in the control group started with the same dose of 0.015 mg/ml of milk. None of the study group reacted to the first dose when it was administered according to the end-point skin prick test. By contrast, ten of the 20 children (50%) in the control group showed mild allergic reactions to the first dose of milk. In conclusion, the end-point skin prick test procedure proved safe and easy to perform in each individual child in order to determine the starting dose for oral desensitization to milk, while taking individual variability into account. PMID:19624618

  19. Spin torque oscillator neuroanalog of von Neumann's microwave computer.

    PubMed

    Hoppensteadt, Frank

    2015-10-01

    Frequency and phase of neural activity play important roles in the behaving brain. The emerging understanding of these roles has been informed by the design of analog devices that have been important to neuroscience, among them the neuroanalog computer developed by O. Schmitt and A. Hodgkin in the 1930s. Later J. von Neumann, in a search for high performance computing using microwaves, invented a logic machine based on crystal diodes that can perform logic functions including binary arithmetic. Described here is an embodiment of his machine using nano-magnetics. Electrical currents through point contacts on a ferromagnetic thin film can create oscillations in the magnetization of the film. Under natural conditions these properties of a ferromagnetic thin film may be described by a nonlinear Schrödinger equation for the film's magnetization. Radiating solutions of this system are referred to as spin waves, and communication within the film may be by spin waves or by directed graphs of electrical connections. It is shown here how to formulate an STO logic machine, and, by computer simulation, that this machine can perform several computations simultaneously using multiplexing of inputs, that the system can evaluate iterated logic functions, and that spin waves may communicate frequency, phase and binary information. Neural tissue and the Schmitt-Hodgkin, von Neumann and STO devices share a common bifurcation structure, although these systems operate on vastly different space and time scales; namely, all may exhibit Andronov-Hopf bifurcations. This suggests that neural circuits may be capable of the computational functionality described by von Neumann. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. 40 CFR 86.535-90 - Dynamometer procedure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... run consists of two tests, a “cold” start test and a “hot” start test following the “cold” start by 10... Administrator. (d) Practice runs over the prescribed driving schedule may be performed at test points, provided... the proper speed-time relationship, or to permit sampling system adjustments. (e) The drive wheel...

  1. [Fault, sacrifice and traumatic neurosis. Apropos of a case].

    PubMed

    Clervoy, P; Lebigot, F

    1994-04-01

    The present case is that of a patient suffering from traumatic neurosis with whom psychotherapeutic work proved possible. The traumatic experience, which dates back to the Franco-Algerian War, could gradually be given meaning in the context of its onset and of the state of the relationship between the patient and his father at the time. Something along the lines of an irrevocable fault, irrevocable because logically understood as a serious moral fault of the father towards his son, makes humanly comprehensible his holding on to this atrocity, as represented by the repetition of the reliving symptoms.

  2. Off to a good start: the influence of pre- and periconceptional exposures, parental fertility, and nutrition on children's health.

    PubMed

    Chapin, Robert E; Robbins, Wendie A; Schieve, Laura A; Sweeney, Anne M; Tabacova, Sonia A; Tomashek, Kay M

    2004-01-01

    The scientific community is developing a compelling body of evidence that shows the importance of the in utero environment (including chemical and hormonal levels) to the ultimate health of the child and even of the aging adult. This article summarizes the evidence that shows this impact begins with conception. Only a full life-cycle evaluation will help us understand these impacts, and only such an understanding will produce logically prioritized mitigation strategies to address the greatest threats first. Clearly, the time for analysis begins when the next generation is but a twinkle in the eye.

  3. Off to a good start: the influence of pre- and periconceptional exposures, parental fertility, and nutrition on children's health.

    PubMed Central

    Chapin, Robert E; Robbins, Wendie A; Schieve, Laura A; Sweeney, Anne M; Tabacova, Sonia A; Tomashek, Kay M

    2004-01-01

    The scientific community is developing a compelling body of evidence that shows the importance of the in utero environment (including chemical and hormonal levels) to the ultimate health of the child and even of the aging adult. This article summarizes the evidence that shows this impact begins with conception. Only a full life-cycle evaluation will help us understand these impacts, and only such an understanding will produce logically prioritized mitigation strategies to address the greatest threats first. Clearly, the time for analysis begins when the next generation is but a twinkle in the eye. PMID:14698934

  4. Causal cognition in human and nonhuman animals: a comparative, critical review.

    PubMed

    Penn, Derek C; Povinelli, Daniel J

    2007-01-01

    In this article, we review some of the most provocative experimental results to have emerged from comparative labs in the past few years, starting with research focusing on contingency learning and finishing with experiments exploring nonhuman animals' understanding of causal-logical relations. Although the theoretical explanation for these results is often inchoate, a clear pattern nevertheless emerges. The comparative evidence does not fit comfortably into either the traditional associationist or inferential alternatives that have dominated comparative debate for many decades now. Indeed, the similarities and differences between human and nonhuman causal cognition seem to be much more multifarious than these dichotomous alternatives allow.

  5. Mapping the transcription start points of the Staphylococcus aureus eap, emp, and vwb promoters reveals a conserved octanucleotide sequence that is essential for expression of these genes.

    PubMed

    Harraghy, Niamh; Homerova, Dagmar; Herrmann, Mathias; Kormanec, Jan

    2008-01-01

    Mapping the transcription start points of the eap, emp, and vwb promoters revealed a conserved octanucleotide sequence (COS). Deleting this sequence abolished the expression of eap, emp, and vwb. However, electrophoretic mobility shift assays gave no evidence that this sequence was a binding site for SarA or SaeR, known regulators of eap and emp.

  6. Investigations of magnesium, histamine and immunoglobulins dynamics in acute urticaria.

    PubMed

    Mureşan, D; Oană, A; Nicolae, I; Alecu, M; Moşescu, L; Benea, V; Flueraş, M

    1990-01-01

    In 42 urticaria patients, magnesium, histamine and IgE levels were measured. Variations in magnesium, IgE and histamine were followed over the course of the urticaria, during the acute phase and during clinical remission. We observed variations in magnesium, histamine and IgE values depending on the evolution of the disease and the therapeutic scheme applied. At disease onset, histamine values were 3.5 times higher than normal; they then decreased along a curve tending toward normal values during clinical remission. At disease onset, magnesium values were below the lower limit of normal, with a mean of 0.5 mmol/L, and increased toward the normal limit during clinical remission. IgE followed a curve similar to that of histamine, with values of 1,250 U/L at onset that, under the influence of medication, decreased to within normal limits (800 U/L) during clinical remission. Analyzing the variations of these biochemical parameters, the authors emphasize magnesium substitution treatment in urticaria.

  7. Performance characteristics of a nanoscale double-gate reconfigurable array

    NASA Astrophysics Data System (ADS)

    Beckett, Paul

    2008-12-01

    The double gate transistor is a promising device applicable to deep sub-micron design due to its inherent resistance to short-channel effects and superior subthreshold performance. Using both TCAD and SPICE circuit simulation, it is shown that the characteristics of fully depleted dual-gate thin-body Schottky barrier silicon transistors will not only uncouple the conflicting requirements of high performance and low standby power in digital logic, but will also allow the development of a locally-connected reconfigurable computing mesh. The magnitude of the threshold shift effect will scale with device dimensions and will remain compatible with oxide reliability constraints. A field-programmable architecture based on the double gate transistor is described in which the operating point of the circuit is biased via one gate while the other gate is used to form the logic array, such that complex heterogeneous computing functions may be developed from this homogeneous, mesh-connected organization.

  8. Can Quantum-Mechanical Description of Physical Reality Be Considered Correct?

    NASA Astrophysics Data System (ADS)

    Brassard, Gilles; Méthot, André Allan

    2010-04-01

    In an earlier paper written in loving memory of Asher Peres, we gave a critical analysis of the celebrated 1935 paper in which Einstein, Podolsky and Rosen (EPR) challenged the completeness of quantum mechanics. There, we had pointed out logical shortcomings in the EPR paper. Now, we raise additional questions concerning their suggested program to find a theory that would “provide a complete description of the physical reality”. In particular, we investigate the extent to which the EPR argumentation could have led to the more dramatic conclusion that quantum mechanics is in fact incorrect. With this in mind, we propose a speculation, made necessary by a logical shortcoming in the EPR paper caused by the lack of a necessary condition for “elements of reality”, and surmise that an eventually complete theory would either be inconsistent with quantum mechanics, or would at least violate Heisenberg’s Uncertainty Principle.

  9. Causal structure of oscillations in gene regulatory networks: Boolean analysis of ordinary differential equation attractors.

    PubMed

    Sun, Mengyang; Cheng, Xianrui; Socolar, Joshua E S

    2013-06-01

    A common approach to the modeling of gene regulatory networks is to represent activating or repressing interactions using ordinary differential equations for target gene concentrations that include Hill function dependences on regulator gene concentrations. An alternative formulation represents the same interactions using Boolean logic with time delays associated with each network link. We consider the attractors that emerge from the two types of models in the case of a simple but nontrivial network: a figure-8 network with one positive and one negative feedback loop. We show that the different modeling approaches give rise to the same qualitative set of attractors with the exception of a possible fixed point in the ordinary differential equation model in which concentrations sit at intermediate values. The properties of the attractors are most easily understood from the Boolean perspective, suggesting that time-delay Boolean modeling is a useful tool for understanding the logic of regulatory networks.
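A minimal version of the Boolean approach can be sketched directly: choose update rules for a figure-8 network (one positive and one negative feedback loop sharing the central node), apply a one-step delay per link via synchronous updates, and enumerate attractors by iterating from every initial state. The particular rules below are an illustrative choice, not the paper's model.

```python
from itertools import product

def step(state):
    """One synchronous update of a toy figure-8 network: central gene X sits
    on a positive loop (X -> Y -> X, both activating) and a negative loop
    (X -> Z -| X, Z represses X). Each link carries a one-step delay."""
    x, y, z = state
    return (y or (x and not z), x, x)

def attractor(state):
    """Iterate until the trajectory revisits a state; return the cycle."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    i = seen.index(state)
    return tuple(seen[i:])  # the repeating portion is the attractor

# Enumerate the attractors reachable from all 2^3 initial states.
attractors = {frozenset(attractor(s)) for s in product([False, True], repeat=3)}
```

With these rules the positive loop dominates and the network is bistable: the only attractors are the all-off and all-on fixed points. Other rule or delay choices for the same topology yield oscillatory attractors, which is the comparison the paper carries out against the differential-equation model.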

  10. The 'locus' of health oversight in Brazil's Unified Health System - a place between the knowledge and the practices of social mobilization.

    PubMed

    Fernandes, Valcler Rangel; Luz, Zélia Profeta da; Amorim, Annibal Coelho de; Sérgio, Juraci Vieira; Silva, José Paulo Vicente da; Castro, Marcia Correa E; Monken, Maurício; Gondim, Grácia Maria de Miranda

    2017-10-01

    Supervision of a health system presupposes keeping an attentive eye on the health situation of populations, so as to understand health, illness and healthcare as indissociable manifestations of human existence. From this point of view, this article examines health practices through some of their processes of communication. These are markedly professional-centered in their logic, emphasizing scientific, vertical and authoritarian discourse, predominantly in the spaces of the Unified Health System (SUS). In the territory, the process of communication is determinant. As a result of social interaction in daily life, the communication process reterritorializes the elements of the social totality: people, companies and institutions are re-dimensioned within this logic. It is a characteristic space for activities that aim for a more horizontal and democratic flow of communication.

  11. Overcoming thermal noise in non-volatile spin wave logic.

    PubMed

    Dutta, Sourav; Nikonov, Dmitri E; Manipatruni, Sasikanth; Young, Ian A; Naeemi, Azad

    2017-05-15

    Spin waves are propagating disturbances in magnetically ordered materials, analogous to lattice waves in solid systems, and are often described from a quasiparticle point of view as magnons. The attractive advantages of Joule-heat-free transmission of information, utilization of the phase of the wave as an additional degree of freedom, and lower footprint area compared to conventional charge-based devices have made spin waves, or magnon spintronics, a promising candidate for beyond-CMOS wave-based computation. However, any practical realization of an all-magnon based computing system must undergo the essential steps of a careful selection of materials and demonstrate robustness with respect to thermal noise or variability. Here, we aim at identifying suitable materials and theoretically demonstrate the possibility of achieving an error-free, clocked, non-volatile spin wave logic device, even in the presence of thermal noise and clock jitter or clock skew.

  12. Engineering, technology and science disciplines and gender difference: a case study among Indian students

    NASA Astrophysics Data System (ADS)

    Cheruvalath, Reena

    2018-01-01

    This article examines the argument that females cannot perform well in engineering and science fields because of poor mathematical or logical reasoning. The major reasons for the reduced number of females in these fields in India are the socio-cultural aversion towards females choosing them and parental restrictions on providing higher education for daughters. The present study shows that the females who do get the opportunity to study engineering and science perform as well as or better than their male counterparts. An analysis of the CGPA (Cumulative Grade Point Average) of 2631 students who completed their engineering or science programmes at one of the top engineering colleges in India over a five-year period shows that female academic performance is equal to or better than that of males. Mathematical, logical, verbal and mechanical reasoning are all tested in the coursework that contributes to the CGPA.

  13. Orbital Express fluid transfer demonstration system

    NASA Astrophysics Data System (ADS)

    Rotenberger, Scott; SooHoo, David; Abraham, Gabriel

    2008-04-01

    Propellant resupply of orbiting spacecraft is no longer in the realm of high risk development. The recently concluded Orbital Express (OE) mission included a fluid transfer demonstration that operated the hardware and control logic in space, bringing the Technology Readiness Level to a solid TRL 7 (demonstration of a system prototype in an operational environment). Orbital Express (funded by the Defense Advanced Research Projects Agency, DARPA) was launched aboard an Atlas-V rocket on March 9th, 2007. The mission had the objective of demonstrating technologies needed for routine servicing of spacecraft, namely autonomous rendezvous and docking, propellant resupply, and orbital replacement unit transfer. The demonstration system used two spacecraft. A servicing vehicle (ASTRO) performed multiple dockings with the client (NextSat) spacecraft, and performed a variety of propellant transfers in addition to exchanges of a battery and computer. The fluid transfer and propulsion system onboard ASTRO, in addition to providing the six degree-of-freedom (6 DOF) thruster system for rendezvous and docking, demonstrated autonomous transfer of monopropellant hydrazine to or from the NextSat spacecraft 15 times while on orbit. The fluid transfer system aboard the NextSat vehicle was designed to simulate a variety of client systems, including both blowdown pressurization and pressure regulated propulsion systems. The fluid transfer demonstrations started with a low level of autonomy, where ground controllers were allowed to review the status of the demonstration at numerous points before authorizing the next steps to be performed. The final transfers were performed at a full autonomy level where the ground authorized the start of a transfer sequence and then monitored data as the transfer proceeded. 
The major steps of a fluid transfer included the following: mate of the coupling, leak check of the coupling, venting of the coupling, priming of the coupling, fluid transfer, gauging of receiving tank, purging of coupling and de-mate of the coupling.

  14. eWaterCycle: A high resolution global hydrological model

    NASA Astrophysics Data System (ADS)

    van de Giesen, Nick; Bierkens, Marc; Drost, Niels; Hut, Rolf; Sutanudjaja, Edwin

    2014-05-01

    In 2013, the eWaterCycle project was started with the ambitious goal of running a high-resolution global hydrological model. The starting point was the PCR-GLOBWB model built by Utrecht University. The software behind this model will partially be re-engineered to enable it to run in a High Performance Computing (HPC) environment. The aim is a spatial resolution of 1 km x 1 km. The idea is also to run the model in real-time and forecasting mode, using data assimilation. An on-demand hydraulic model will be available for detailed flow and flood forecasting in support of navigation and disaster management. The project faces a set of scientific challenges. First, to enable the model to run in an HPC environment, model runs were analyzed to determine on which parts of the program most CPU time was spent. These parts were re-coded in Open MPI to allow for parallel processing. Different parallelization strategies are conceivable; in our case, watershed logic was used as a first step to distribute the analysis. There is rather limited recent experience with HPC in hydrology, and there is much to be learned and adjusted, both on the hydrological modeling side and on the computer science side. For example, an interesting early observation was that hydrological models are, due to their localized parameterization, much more memory intensive than models in sister disciplines such as meteorology and oceanography. Because it would be prohibitively slow to swap information between CPU and hard drive, memory management becomes crucial. A standard Ensemble Kalman Filter (EnKF) would, for example, have excessive memory demands. To circumvent these problems, an alternative to the EnKF was developed that produces equivalent results. This presentation shows the most recent results from the model, including a 5 km x 5 km simulation and a proof of concept for the new data assimilation approach. 
Finally, some early ideas about the financial sustainability of an operational global hydrological model are presented.
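For background, the analysis step of a standard Ensemble Kalman Filter, the scheme whose memory demands the project had to work around, can be written in a few lines for a scalar state: each ensemble member is nudged toward a perturbed observation by the gain K = P/(P + R). The discharge numbers below are invented.

```python
import random

def enkf_update(ensemble, obs, obs_var, seed=0):
    """Scalar EnKF analysis step: each member is pulled toward a perturbed
    observation by the Kalman gain K = P / (P + R), where P is the ensemble
    (forecast) variance and R the observation-error variance."""
    rng = random.Random(seed)
    n = len(ensemble)
    mean = sum(ensemble) / n
    p = sum((x - mean) ** 2 for x in ensemble) / (n - 1)   # forecast variance
    k = p / (p + obs_var)                                  # Kalman gain
    return [x + k * (obs + rng.gauss(0, obs_var ** 0.5) - x) for x in ensemble]

# Toy example: a river-discharge ensemble pulled toward an observation of 120.
forecast = [100.0, 110.0, 90.0, 105.0, 95.0]
analysis = enkf_update(forecast, obs=120.0, obs_var=4.0)
```

Note that a full-state EnKF must hold every ensemble member's entire state vector in memory at once, which is exactly why it becomes problematic at 1 km global resolution.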

  15. Synthesis of capillary pressure curves from post-stack seismic data with the use of intelligent estimators: A case study from the Iranian part of the South Pars gas field, Persian Gulf Basin

    NASA Astrophysics Data System (ADS)

    Golsanami, Naser; Kadkhodaie-Ilkhchi, Ali; Erfani, Amir

    2015-01-01

    Capillary pressure curves are important data for reservoir rock typing, analyzing pore throat distribution, determining height above the free water level, and reservoir simulation. Laboratory experiments provide accurate data; however, they are expensive, time-consuming and discontinuous through the reservoir intervals. The current study focuses on synthesizing artificial capillary pressure (Pc) curves from seismic attributes using artificial intelligence systems, including Artificial Neural Networks (ANNs), fuzzy logic (FL) and Adaptive Neuro-Fuzzy Inference Systems (ANFISs). The synthetic capillary pressure curves were obtained by estimating pressure values at six mercury saturation points, corresponding to mercury-filled pore volumes of core samples (Hg-saturation) at 5%, 20%, 35%, 65%, 80%, and 90% saturation. To predict the synthetic Pc curve at each saturation point, various FL, ANFIS and ANN models were constructed; the neural network models differed in their training algorithms. Based on the performance function, the best-performing models were selected as the final solvers for the prediction at each of the above-mentioned mercury saturation points. The constructed models were then tested at six depth points of the studied well that were unseen by the models. The results show that the fuzzy logic and neuro-fuzzy models were not capable of making reliable estimations, while the predictions from the ANN models were trustworthy, with good agreement between the laboratory-derived and synthetic capillary pressure curves. Finally, the required attributes were extracted from a 3D seismic cube and the capillary pressure cube was estimated using the developed models. The synthesized Pc cube was compared with the seismic cube and an acceptable correspondence was observed.
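The "one model per saturation point" scheme can be sketched with ordinary least squares standing in for the trained network at each point: fit six separate regressors, then assemble their six predicted pressures into a synthetic Pc curve. The attribute values and curves below are toy data, not the study's.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b (a stand-in for the ANN
    trained at one mercury-saturation point)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

SATURATIONS = [5, 20, 35, 65, 80, 90]  # Hg-saturation points (%)

def train_curve_models(attr, pc_curves):
    """One model per saturation point, each mapping a seismic attribute
    to the pressure at that point."""
    return {s: fit_linear(attr, [c[i] for c in pc_curves])
            for i, s in enumerate(SATURATIONS)}

def synthesize_curve(models, x):
    """Predict the six pressures -> one synthetic capillary pressure curve."""
    return [models[s][0] * x + models[s][1] for s in SATURATIONS]

# Toy data: two training samples with attribute values and 6-point Pc curves.
attr = [1.0, 3.0]
pc_curves = [[2, 4, 8, 20, 40, 80], [4, 8, 16, 40, 80, 160]]
models = train_curve_models(attr, pc_curves)
print(synthesize_curve(models, 2.0))  # -> [3.0, 6.0, 12.0, 30.0, 60.0, 120.0]
```

Treating each saturation point as an independent regression target is what lets a curve be synthesized at any location where the seismic attributes are available.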

  16. On Fixed Points of Strictly Causal Functions

    DTIC Science & Technology

    2013-04-08

    were defined to be the functions that are strictly contracting with respect to the Cantor metric (also called the Baire distance) on signals over non...in Computer Science, pages 447–484. Springer Berlin / Heidelberg, 1992. [36] George Markowsky. Chain-complete posets and directed sets with...Journal of Logic Programming, 42(2):59–70, 2000. [53] George M. Reed and A. William Roscoe. A timed model for communicating sequential processes. In

  17. Operational Design: The Art of Framing the Solution

    DTIC Science & Technology

    2010-04-01

    of moral or physical strength, power, and resistance — what Clausewitz called ‘the hub of all power and movement, on which everything depends…the... physical lines of operation to create decisive points. Connecting the dots already examined leads to an operational design construct that is...are identified, they should be oriented along physical or logical lines of operation. o Defining Tasks. Once lines of operation are developed

  18. Models in biology: ‘accurate descriptions of our pathetic thinking’

    PubMed Central

    2014-01-01

    In this essay I will sketch some ideas for how to think about models in biology. I will begin by trying to dispel the myth that quantitative modeling is somehow foreign to biology. I will then point out the distinction between forward and reverse modeling and focus thereafter on the former. Instead of going into mathematical technicalities about different varieties of models, I will focus on their logical structure, in terms of assumptions and conclusions. A model is a logical machine for deducing the latter from the former. If the model is correct, then, if you believe its assumptions, you must, as a matter of logic, also believe its conclusions. This leads to consideration of the assumptions underlying models. If these are based on fundamental physical laws, then it may be reasonable to treat the model as ‘predictive’, in the sense that it is not subject to falsification and we can rely on its conclusions. However, at the molecular level, models are more often derived from phenomenology and guesswork. In this case, the model is a test of its assumptions and must be falsifiable. I will discuss three models from this perspective, each of which yields biological insights, and this will lead to some guidelines for prospective model builders. PMID:24886484

  19. Critical Analysis of the Mathematical Formalism of Theoretical Physics. V. Foundations of the Theory of Negative Numbers

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2015-04-01

    An analysis of the foundations of the theory of negative numbers is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. The statement of the problem is as follows. As is known, the point O in the Cartesian coordinate system XOY determines the position of zero on the scale. The number "zero" belongs to both the scale of positive numbers and the scale of negative numbers. In this case, the following formal-logical contradiction arises: the number 0 is both a positive number and a negative number; or, equivalently, the number 0 is neither a positive number nor a negative number, i.e. the number 0 has no sign. Then the following question arises: Do negative numbers exist in science and practice? A detailed analysis of the problem shows that negative numbers do not exist, because the foundations of the theory of negative numbers are contrary to the formal-logical laws. It is proved that: (a) all numbers have no signs; (b) the concepts "negative number" and "negative sign of number" represent a formal-logical error; (c) the signs "plus" and "minus" are only symbols of mathematical operations. These logical errors determine the essence of the theory of negative numbers: the theory of negative numbers is a false theory.

  20. Data Automata in Scala

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2014-01-01

The field of runtime verification has during the last decade seen a multitude of systems for monitoring event sequences (traces) emitted by a running system. The objective is to ensure correctness of a system by checking its execution traces against formal specifications representing requirements. A special challenge is data-parameterized events, where monitors have to keep track of the combination of control states as well as data constraints, relating events and the data they carry across time points. This poses a challenge with respect to the efficiency of monitors, as well as the expressiveness of logics. Data automata are a form of automata in which states are parameterized with data, supporting monitoring of data-parameterized events. We describe the full details of a very simple API in the Scala programming language, an internal DSL (Domain-Specific Language), implementing data automata. The small implementation suggests a design pattern. Data automata allow transition conditions to refer to states other than the source state, and allow target states of transitions to be inlined, offering a temporal logic flavored notation. An embedding of a logic in a high-level language like Scala in addition allows monitors to be programmed using all of Scala's language constructs, offering the full flexibility of a programming language. The framework is demonstrated on an XML processing scenario previously addressed in related work.
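The data-automaton idea above can be sketched outside Scala as well. The following Python analogue is hypothetical (it is not the paper's DSL or API): a monitor whose state is parameterized by a data value, checking a lock-discipline property over an event trace.

```python
# Hypothetical Python analogue of a data-parameterized monitor
# (the paper's actual implementation is a Scala internal DSL).
# Property: every acquire(lock) must be matched by a later release(lock),
# and no lock may be acquired twice without an intervening release.

class LockOrderMonitor:
    def __init__(self):
        self.held = set()      # data parameters (lock ids) currently acquired
        self.errors = []

    def event(self, name, lock):
        # Transitions depend on both the control state and the data value.
        if name == "acquire":
            if lock in self.held:
                self.errors.append(f"{lock} acquired twice")
            else:
                self.held.add(lock)
        elif name == "release":
            if lock not in self.held:
                self.errors.append(f"{lock} released but not held")
            else:
                self.held.remove(lock)

    def end(self):
        # At end of trace, every acquired lock must have been released.
        for lock in sorted(self.held):
            self.errors.append(f"{lock} never released")
        return self.errors

trace = [("acquire", "A"), ("acquire", "B"), ("release", "A"), ("release", "B")]
m = LockOrderMonitor()
for name, lock in trace:
    m.event(name, lock)
print(m.end())   # []  -- this trace satisfies the property
```

The per-data-value bookkeeping in `held` is what distinguishes this from a plain finite-state monitor: in effect each lock id carries its own copy of the control state.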

  1. An autonomous molecular computer for logical control of gene expression.

    PubMed

    Benenson, Yaakov; Gil, Binyamin; Ben-Dor, Uri; Adar, Rivka; Shapiro, Ehud

    2004-05-27

    Early biomolecular computer research focused on laboratory-scale, human-operated computers for complex computational problems. Recently, simple molecular-scale autonomous programmable computers were demonstrated allowing both input and output information to be in molecular form. Such computers, using biological molecules as input data and biologically active molecules as outputs, could produce a system for 'logical' control of biological processes. Here we describe an autonomous biomolecular computer that, at least in vitro, logically analyses the levels of messenger RNA species, and in response produces a molecule capable of affecting levels of gene expression. The computer operates at a concentration of close to a trillion computers per microlitre and consists of three programmable modules: a computation module, that is, a stochastic molecular automaton; an input module, by which specific mRNA levels or point mutations regulate software molecule concentrations, and hence automaton transition probabilities; and an output module, capable of controlled release of a short single-stranded DNA molecule. This approach might be applied in vivo to biochemical sensing, genetic engineering and even medical diagnosis and treatment. As a proof of principle we programmed the computer to identify and analyse mRNA of disease-related genes associated with models of small-cell lung cancer and prostate cancer, and to produce a single-stranded DNA molecule modelled after an anticancer drug.

  2. Logic integration of mRNA signals by an RNAi-based molecular computer.

    PubMed

    Xie, Zhen; Liu, Siyuan John; Bleris, Leonidas; Benenson, Yaakov

    2010-05-01

    Synthetic in vivo molecular 'computers' could rewire biological processes by establishing programmable, non-native pathways between molecular signals and biological responses. Multiple molecular computer prototypes have been shown to work in simple buffered solutions. Many of those prototypes were made of DNA strands and performed computations using cycles of annealing-digestion or strand displacement. We have previously introduced RNA interference (RNAi)-based computing as a way of implementing complex molecular logic in vivo. Because it also relies on nucleic acids for its operation, RNAi computing could benefit from the tools developed for DNA systems. However, these tools must be harnessed to produce bioactive components and be adapted for harsh operating environments that reflect in vivo conditions. In a step toward this goal, we report the construction and implementation of biosensors that 'transduce' mRNA levels into bioactive, small interfering RNA molecules via RNA strand exchange in a cell-free Drosophila embryo lysate, a step beyond simple buffered environments. We further integrate the sensors with our RNAi 'computational' module to evaluate two-input logic functions on mRNA concentrations. Our results show how RNA strand exchange can expand the utility of RNAi computing and point toward the possibility of using strand exchange in a native biological setting.

  3. Logic integration of mRNA signals by an RNAi-based molecular computer

    PubMed Central

    Xie, Zhen; Liu, Siyuan John; Bleris, Leonidas; Benenson, Yaakov

    2010-01-01

    Synthetic in vivo molecular ‘computers’ could rewire biological processes by establishing programmable, non-native pathways between molecular signals and biological responses. Multiple molecular computer prototypes have been shown to work in simple buffered solutions. Many of those prototypes were made of DNA strands and performed computations using cycles of annealing-digestion or strand displacement. We have previously introduced RNA interference (RNAi)-based computing as a way of implementing complex molecular logic in vivo. Because it also relies on nucleic acids for its operation, RNAi computing could benefit from the tools developed for DNA systems. However, these tools must be harnessed to produce bioactive components and be adapted for harsh operating environments that reflect in vivo conditions. In a step toward this goal, we report the construction and implementation of biosensors that ‘transduce’ mRNA levels into bioactive, small interfering RNA molecules via RNA strand exchange in a cell-free Drosophila embryo lysate, a step beyond simple buffered environments. We further integrate the sensors with our RNAi ‘computational’ module to evaluate two-input logic functions on mRNA concentrations. Our results show how RNA strand exchange can expand the utility of RNAi computing and point toward the possibility of using strand exchange in a native biological setting. PMID:20194121

  4. The Military Assistance Command-Vietnam Studies and Observations Group-A Case Study in Special Operations Campaigning

    DTIC Science & Technology

    2016-06-10

    viewed as the panacea for all military problems. Politicians view SOF as a low risk minimalist investment that produces results; even for problems...of published work that has been dedicated to discussing special operations theory as an element of military strategy. A good starting point to...utility. Doctrine As a starting point for framing understanding of special operations, Joint Publication (JP) 3-05 Special Operations, provides the basis

  5. Dynamic Event Tree advancements and control logic improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for Probabilistic Risk Assessment, uncertainty quantification, data mining analysis and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid Sampler, Factorials, etc.), Adaptive samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.) and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main purpose of this document is to report the activities performed in order to: start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide all MOOSE-based applications with a control logic capability, an initial migration activity was begun this Fiscal Year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework. A brief explanation of this work is reported here. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named “Hybrid Dynamic Event Tree” (HDET) and its adaptive variant “Adaptive Hybrid Dynamic Event Tree” (AHDET). As other authors have already reported, among the different types of uncertainties it is possible to discern two principal types: aleatory and epistemic uncertainties. The classical Dynamic Event Tree treats the first class (aleatory); the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees).
The Monte Carlo employs a pre-sampling of the input space characterized by epistemic uncertainties. The consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed, not limiting the exploration of the epistemic space to a Monte Carlo method but using all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hypercube, Grid, Stratified and Monte Carlo sampling in order to explore the epistemic space, without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its aleatory space exploration. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy, in which the DET is used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only. This report documents how this approach has been extended to consider the epistemic space through the Hybrid Dynamic Event Tree methodology.
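The two-level sampling scheme described above can be sketched as a toy (the parameter and event names below are hypothetical, and this is not RAVEN code): an epistemic pre-sample drives one Dynamic Event Tree enumeration per sample, so N epistemic samples yield N trees.

```python
# Illustrative sketch of hybrid sampling: an epistemic outer sampler
# (here a simple grid) combined with an aleatory Dynamic Event Tree.
# Event names and parameter values are hypothetical.
from itertools import product

# Epistemic space: explored by a forward sampler, here a 3-point grid.
epistemic_grid = [0.8, 1.0, 1.2]

# Aleatory space: each branch point splits the run into outcomes
# with conditional probabilities (the Dynamic Event Tree).
branch_points = [
    {"valve_opens": 0.9, "valve_stuck": 0.1},
    {"pump_on": 0.7, "pump_fail": 0.3},
]

def dynamic_event_tree(branch_points):
    """Enumerate every path through the tree with its probability."""
    paths = []
    for combo in product(*(bp.items() for bp in branch_points)):
        events = tuple(name for name, _ in combo)
        prob = 1.0
        for _, p in combo:
            prob *= p
        paths.append((events, prob))
    return paths

# One full DET is grown per epistemic sample: N_epistemic trees in total.
for theta in epistemic_grid:
    for events, prob in dynamic_event_tree(branch_points):
        pass  # each (theta, events) pair would drive one simulation run

tree = dynamic_event_tree(branch_points)
print(len(tree), sum(p for _, p in tree))   # 4 paths; probabilities sum to ~1.0
```

Swapping the grid for Latin Hypercube or stratified draws changes only the outer loop, which is the generality the abstract describes.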

  6. Alternative Attitude Commanding and Control for Precise Spacecraft Landing

    NASA Technical Reports Server (NTRS)

    Singh, Gurkirpal

    2004-01-01

    A report proposes an alternative method of control for precision landing on a remote planet. In the traditional method, the attitude of a spacecraft is required to track a commanded translational acceleration vector, which is generated at each time step by solving a two-point boundary value problem. No requirement of continuity is imposed on the acceleration. The translational acceleration does not necessarily vary smoothly. Tracking of a non-smooth acceleration causes the vehicle attitude to exhibit undesirable transients and poor pointing stability behavior. In the alternative method, the two-point boundary value problem is not solved at each time step. A smooth reference position profile is computed. The profile is recomputed only when the control errors get sufficiently large. The nominal attitude is still required to track the smooth reference acceleration command. A steering logic is proposed that controls the position and velocity errors about the reference profile by perturbing the attitude slightly about the nominal attitude. The overall pointing behavior is therefore smooth, greatly reducing the degree of pointing instability.

  7. Minimum energy dissipation required for a logically irreversible operation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Naoki; Yoshikawa, Nobuyuki

    2018-01-01

    According to Landauer's principle, the minimum heat emission required for computing is linked to logical entropy, or logical reversibility. The validity of Landauer's principle has been investigated for several decades and was finally demonstrated in recent experiments by showing that the minimum heat emission is associated with the reduction in logical entropy during a logically irreversible operation. Although the relationship between minimum heat emission and logical reversibility is being revealed, it is not clear how much free energy is required to be dissipated for a logically irreversible operation. In the present study, in order to reveal the connection between logical reversibility and free energy dissipation, we numerically demonstrated logically irreversible protocols using adiabatic superconductor logic. The calculation results of work during the protocol showed that, while the minimum heat emission conforms to Landauer's principle, the free energy dissipation can be arbitrarily reduced by performing the protocol quasistatically. The above results show that logical reversibility is not associated with thermodynamic reversibility, and that heat is not only emitted from logic devices but also absorbed by logic devices. We also formulated the heat emission from adiabatic superconductor logic during a logically irreversible operation at a finite operation speed.
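The Landauer bound referred to above is k_B·T·ln 2 of heat per erased bit; a quick computation at room temperature gives its scale:

```python
# Landauer bound: minimum heat emission per erased bit is k_B * T * ln(2).
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0                   # room temperature, kelvin

E_min = k_B * T * math.log(2)
print(f"{E_min:.3e} J")     # about 2.9e-21 J per bit at 300 K
```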

  8. Investigation of the influence of sampling schemes on quantitative dynamic fluorescence imaging

    PubMed Central

    Dai, Yunpeng; Chen, Xueli; Yin, Jipeng; Wang, Guodong; Wang, Bo; Zhan, Yonghua; Nie, Yongzhan; Wu, Kaichun; Liang, Jimin

    2018-01-01

Dynamic optical data from a series of sampling intervals can be used for quantitative analysis to obtain meaningful kinetic parameters of a probe in vivo. The sampling scheme may affect the quantification results of dynamic fluorescence imaging. Here, we investigate the influence of different sampling schemes on the quantification of binding potential (BP) with theoretically simulated and experimentally measured data. Three groups of sampling schemes are investigated, including the sampling starting point, sampling sparsity, and sampling uniformity. In the investigation of the influence of the sampling starting point, we further distinguish two cases according to whether the missing timing sequence between the probe injection and the sampling starting time is retained. Results show that the mean value of BP exhibits an obvious growth trend with an increase in the delay of the sampling starting point, and has a strong correlation with the sampling sparsity. The growth trend is much more pronounced when the missing timing sequence is discarded. The standard deviation of BP is inversely related to the sampling sparsity, and independent of the sampling uniformity and the delay of the sampling starting time. Moreover, the mean value of BP obtained by uniform sampling is significantly higher than that obtained by non-uniform sampling. Our results collectively suggest that a suitable sampling scheme can help compartmental modeling of dynamic fluorescence imaging provide more accurate results with simpler operation. PMID:29675325

  9. Drafting guidelines for occupational exposure to chemicals: the Dutch experience with the assessment of reproductive risks.

    PubMed

    Stijkel, A; van Eijndhoven, J C; Bal, R

    1996-12-01

    The Dutch procedure for standard setting for occupational exposure to chemicals, just like the European Union (EU) procedure, is characterized by an organizational separation between considerations of health on the one side, and of technology, economics, and policy on the other side. Health considerations form the basis for numerical guidelines. These guidelines are next combined with technical-economical considerations. Standards are then proposed, and are finally set by the Ministry of Social Affairs and Employment. An analysis of this procedure might be of relevance to the US, where other procedures are used and criticized. In this article we focus on the first stage of the standard-setting procedure. In this stage, the Dutch Expert Committee on Occupational Standards (DECOS) drafts a criteria document in which a health-based guideline is proposed. The drafting is based on a set of starting points for assessing toxicity. We raise the questions, "Does DECOS limit itself only to health considerations? And if not, what are the consequences of such a situation?" We discuss DECOS' starting points and analyze the relationships between those starting points, and then explore eight criteria documents where DECOS was considering reproductive risks as a possible critical effect. For various reasons, it will be concluded that the starting points leave much interpretative space, and that this space is widened further by the manner in which DECOS utilizes it. This is especially true in situations involving sex-specific risks and uncertainties in knowledge. Consequently, even at the first stage, where health considerations alone are intended to play a role, there is much room for other than health-related factors to influence decision making, although it is unavoidable that some interpretative space will remain. We argue that separating the various types of consideration should not be abandoned. 
Rather, through adjustments in the starting points and aspects of the procedure, clarity should be guaranteed about the way the interpretative space is being employed.

  10. Method and Apparatus for Simultaneous Processing of Multiple Functions

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Andrei, Radu (Inventor)

    2017-01-01

    Electronic logic gates that operate using N logic state levels, where N is greater than 2, and methods of operating such gates. The electronic logic gates operate according to truth tables. At least two input signals each having a logic state that can range over more than two logic states are provided to the logic gates. The logic gates each provide an output signal that can have one of N logic states. Examples of gates described include NAND/NAND gates having two inputs A and B and NAND/NAND gates having three inputs A, B, and C, where A, B and C can take any of four logic states. Systems using such gates are described, and their operation illustrated. Optical logic gates that operate using N logic state levels are also described.
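The abstract does not reproduce the patent's truth tables, so the following is only a hedged sketch of one common multi-valued generalization of NAND (complement of the minimum), shown for N = 4 logic states; the patent's gates may be defined differently.

```python
# Hypothetical illustration of an N-level NAND as complement-of-min.
# This is one standard multi-valued generalization, not the patent's tables.
N = 4                                   # four logic states: 0, 1, 2, 3

def nand_n(a, b, n=N):
    return (n - 1) - min(a, b)

# With n = 2 this reduces to ordinary Boolean NAND:
assert nand_n(1, 1, n=2) == 0 and nand_n(0, 1, n=2) == 1

# Truth table for the four-level version (rows indexed by input a):
for a in range(N):
    print([nand_n(a, b) for b in range(N)])
```

Reading the table row by row shows the Boolean identities carry over: any 0 input forces the maximum output N-1, and the gate only outputs 0 when both inputs are at the maximum level.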

  11. Method and Apparatus for Simultaneous Processing of Multiple Functions

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Andrei, Radu (Inventor); Zhu, David (Inventor); Mojarradi, Mohammad Mehdi (Inventor); Vo, Tuan A. (Inventor)

    2015-01-01

    Electronic logic gates that operate using N logic state levels, where N is greater than 2, and methods of operating such gates. The electronic logic gates operate according to truth tables. At least two input signals each having a logic state that can range over more than two logic states are provided to the logic gates. The logic gates each provide an output signal that can have one of N logic states. Examples of gates described include NAND/NAND gates having two inputs A and B and NAND/NAND gates having three inputs A, B, and C, where A, B and C can take any of four logic states. Systems using such gates are described, and their operation illustrated. Optical logic gates that operate using N logic state levels are also described.

  12. Using Abductive Research Logic: "The Logic of Discovery", to Construct a Rigorous Explanation of Amorphous Evaluation Findings

    ERIC Educational Resources Information Center

    Levin-Rozalis, Miri

    2010-01-01

    Background: Two kinds of research logic prevail in scientific research: deductive research logic and inductive research logic. However, both fail in the field of evaluation, especially evaluation conducted in unfamiliar environments. Purpose: In this article I wish to suggest the application of a research logic--"abduction"--"the logic of…

  13. Application of linear logic to simulation

    NASA Astrophysics Data System (ADS)

    Clarke, Thomas L.

    1998-08-01

Linear logic, since its introduction by Girard in 1987, has proven expressive and powerful. It has provided natural encodings of Turing machines, Petri nets and other computational models, and is also capable of naturally modeling resource-dependent aspects of reasoning. The distinguishing characteristic of linear logic is that it accounts for resources: two instances of the same variable are treated differently from a single instance. Linear logic thus must obey a form of the linear superposition principle. A proposition can be reasoned with only once, unless a special operator is applied. Informally, linear logic distinguishes two kinds of conjunction, two kinds of disjunction, and also introduces a modal storage operator that explicitly marks propositions that can be reused. This paper discusses the application of linear logic to simulation. A wide variety of logics have been developed; in addition to classical logic, there are fuzzy logics, affine logics, quantum logics, etc. All of these have found application in simulations of one sort or another. The special characteristics of linear logic and its benefits for simulation will be discussed. Of particular interest is a connection that can be made between linear logic and simulated dynamics by using the concepts of Lie algebras and Lie groups. Lie groups provide the connection between the exponential modal storage operators of linear logic and the eigenfunctions of dynamic differential operators. Particularly suggestive are possible relations between complexity results for linear logic and non-computability results for dynamical systems.
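The resource discipline described above can be illustrated with a toy checker (not from the paper, purely illustrative): premises live in a multiset and are consumed on use, so a single-use assumption cannot justify two conclusions unless it is explicitly duplicated, which is the role of the storage modality.

```python
# Toy illustration of linear logic's "use once" resource accounting.
# Hypothetical example, not an implementation from the paper.
from collections import Counter

class LinearContext:
    def __init__(self, resources):
        self.resources = Counter(resources)   # multiset of available premises

    def use(self, prop):
        """Consume one occurrence of a proposition; fail if none remain."""
        if self.resources[prop] == 0:
            raise ValueError(f"resource {prop!r} already consumed")
        self.resources[prop] -= 1

ctx = LinearContext(["A", "A", "B"])   # two copies of A, one of B
ctx.use("A")   # ok: one A remains
ctx.use("A")   # ok: A is now exhausted
ctx.use("B")   # ok
# A further ctx.use("B") would raise: B was a single-use resource.
```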

  14. Simple Tidal Prism Models Revisited

    NASA Astrophysics Data System (ADS)

    Luketina, D.

    1998-01-01

Simple tidal prism models for well-mixed estuaries have been in use for some time and are discussed in most textbooks on estuaries. The appeal of this model is its simplicity. However, there are several flaws in the logic behind the model. These flaws are pointed out and a more theoretically correct simple tidal prism model is derived. In doing so, it is made clear which effects can, in theory, be neglected and which cannot.

  15. Crossing Guards: A Safety Patrol Program at a Residential School for Students Who Are Blind or Visually Impaired. Practice Report

    ERIC Educational Resources Information Center

    Besden, Cheryl; Crow, Nita; Delgado Greenberg, Maya; Finkelstein, Gerri; Shrieves, Gary; Vickroy, Marcia

    2005-01-01

    In 2001, the California School for the Blind (CSB) was faced with a dilemma. The dropoff point for the day buses had to be changed. The new route to the only logical location for this change sent the buses through a driveway where residential students crossed to travel between the school and the dormitories. Some staff members wanted to eliminate…

  16. The Fixed-Point Theory of Strictly Causal Functions

    DTIC Science & Technology

    2013-06-09

    functions were defined to be the functions that are strictly contracting with respect to the Cantor metric (also called the Baire distance) on signals...of Lecture Notes in Computer Science, pages 447–484. Springer Berlin / Heidelberg, 1992. [36] George Markowsky. Chain-complete posets and directed...Journal of Logic Programming, 42(2):59–70, 2000. [52] George M. Reed and A. William Roscoe. A timed model for communicating sequential processes. In Laurent

  17. An RNAi-Enhanced Logic Circuit for Cancer Specific Detection and Destruction

    DTIC Science & Technology

    2013-02-01

monomeric protein secreted by Corynebacterium diphtheriae, and pro-apoptotic members of the Bcl-2 family: mBax (Mus musculus), hBax (Homo sapiens), and its...Gata3 mStaple. Intron-feature sequences – donor site, branch point, polypyrimidine tract, and acceptor site – were selected based on previously...sequences found in literature; our intron features were chosen according to SplicePort [4], an online analyzer that detects the likelihood of splicing to

  18. The Hidden Dimension of Strategic Planning: Explorations in the Formation of Perspectives

    DTIC Science & Technology

    1991-09-01

13 2. Laws--Or Points Of Reference?.........18 B. THE HORIZONTAL LEVEL OF DECISION-MAKING . . . . 23 1. KNOWLEDGE, RATIONALITY, AND... decision-making is a horizontal level ranging from logic and rationalism to subjective emotionalism. This is the dimension of decision-making with which...the process of decision-making. The basis of game theory is the dual premises of rationality and maximization of utility.6 "It [game theory] is

  19. Timing Is Everything: One Teacher's Exploration of the Best Time to Use Visual Media in a Science Unit

    ERIC Educational Resources Information Center

    Drury, Debra

    2006-01-01

    Kids today are growing up with televisions, movies, videos and DVDs, so it's logical to assume that this type of media could be motivating and used to great effect in the classroom. But at what point should film and other visual media be used? Are there times in the inquiry process when showing a film or incorporating other visual media is more…

  20. A Three-Perspective Theory of Cyber Sovereignty

    DTIC Science & Technology

    2017-12-21

logic, one-way thinking, and viewing problems from a single perspective. When seeing things from one point of view, while ignoring the other two...by state sovereignty) and multi-party governance modes. In fact, the two modes do not conflict; they have different applicability in different...focusing only on one's own interests, each actor ignores the interests of the other two, resulting in the current situation in which each sticks to its

  1. Advanced Cooling for High Power Electric Actuators

    DTIC Science & Technology

    1993-01-01

heat and heat transfer rates. At point B, the fluid temperature reaches the melting temperature of the PCM and it starts to melt, storing energy in the...working fluid through the duty cycle represented by the square wave in the upper half of the figure. Starting at point A, the actuator goes to peak load...form of latent heat. As the solid material melts, the coolant temperature continues to rise, but at a much lower rate, as the heat conducts through the

  2. The role of the optimization process in illumination design

    NASA Astrophysics Data System (ADS)

    Gauvin, Michael A.; Jacobsen, David; Byrne, David J.

    2015-07-01

    This paper examines the role of the optimization process in illumination design. We will discuss why the starting point of the optimization process is crucial to a better design and why it is also important that the user understands the basic design problem and implements the correct merit function. Both a brute force method and the Downhill Simplex method will be used to demonstrate optimization methods with focus on using interactive design tools to create better starting points to streamline the optimization process.
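The Downhill Simplex (Nelder-Mead) method contrasted with brute force above can be sketched in a few dozen lines. The merit function below is a hypothetical quadratic stand-in for an illumination merit function, chosen only so the known minimum makes the behavior easy to check.

```python
# Minimal Downhill Simplex (Nelder-Mead) sketch with standard coefficients:
# reflection 1, expansion 2, contraction 0.5, shrink 0.5.
# The merit function is a hypothetical stand-in, not an illumination model.

def downhill_simplex(f, x0, step=0.5, tol=1e-10, max_iter=1000):
    n = len(x0)
    # Initial simplex: the starting point plus one offset point per dimension.
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst, second_worst = simplex[0], simplex[-1], simplex[-2]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        reflect = [2 * centroid[i] - worst[i] for i in range(n)]
        if f(reflect) < f(best):
            # Reflection improved on the best point: try expanding further.
            expand = [3 * centroid[i] - 2 * worst[i] for i in range(n)]
            simplex[-1] = expand if f(expand) < f(reflect) else reflect
        elif f(reflect) < f(second_worst):
            simplex[-1] = reflect
        else:
            # Contract toward the centroid; shrink the whole simplex if that fails.
            contract = [0.5 * (centroid[i] + worst[i]) for i in range(n)]
            if f(contract) < f(worst):
                simplex[-1] = contract
            else:
                simplex = [best] + [
                    [0.5 * (best[i] + p[i]) for i in range(n)] for p in simplex[1:]
                ]
    return min(simplex, key=f)

merit = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2   # minimum at (1, -2)
opt = downhill_simplex(merit, [5.0, 5.0])
print(opt)   # close to [1.0, -2.0]
```

The choice of starting point `[5.0, 5.0]` is the lever the paper's argument turns on: a better-informed starting simplex means fewer iterations and less risk of stalling in a poor local minimum of a real, non-convex merit function.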

  3. MOE vs. M&E: considering the difference between measuring strategic effectiveness and monitoring tactical evaluation.

    PubMed

    Diehl, Glen; Major, Solomon

    2015-01-01

    Measuring the effectiveness of military Global Health Engagements (GHEs) has become an area of increasing interest to the military medical field. As a result, there have been efforts to more logically and rigorously evaluate GHE projects and programs; many of these have been based on the Logic and Results Frameworks. However, while these Frameworks are apt and appropriate planning tools, they are not ideally suited to measuring programs' effectiveness. This article introduces military medicine professionals to the Measures of Effectiveness for Defense Engagement and Learning (MODEL) program, which implements a new method of assessment, one that seeks to rigorously use Measures of Effectiveness (vs. Measures of Performance) to gauge programs' and projects' success and fidelity to Theater Campaign goals. While the MODEL method draws on the Logic and Results Frameworks where appropriate, it goes beyond their planning focus by using the latest social scientific and econometric evaluation methodologies to link on-the-ground GHE "lines of effort" to the realization of national and strategic goals and end-states. It is hoped these methods will find use beyond the MODEL project itself, and will catalyze a new body of rigorous, empirically based work, which measures the effectiveness of a broad spectrum of GHE and security cooperation activities. We based our strategies on the principle that it is much more cost-effective to prevent conflicts than it is to stop one once it's started. I cannot overstate the importance of our theater security cooperation programs as the centerpiece to securing our Homeland from the irregular and catastrophic threats of the 21st Century.-GEN James L. Jones, USMC (Ret.). Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  4. Cellular Automata Generalized To An Inferential System

    NASA Astrophysics Data System (ADS)

    Blower, David J.

    2007-11-01

    Stephen Wolfram popularized elementary one-dimensional cellular automata in his book, A New Kind of Science. Among many remarkable things, he proved that one of these cellular automata was a Universal Turing Machine. Such cellular automata can be interpreted in a different way by viewing them within the context of the formal manipulation rules from probability theory. Bayes's Theorem is the most famous of such formal rules. As a prelude, we recapitulate Jaynes's presentation of how probability theory generalizes classical logic using modus ponens as the canonical example. We emphasize the important conceptual standing of Boolean Algebra for the formal rules of probability manipulation and give an alternative demonstration augmenting and complementing Jaynes's derivation. We show the complementary roles played in arguments of this kind by Bayes's Theorem and joint probability tables. A good explanation for all of this is afforded by the expansion of any particular logic function via the disjunctive normal form (DNF). The DNF expansion is a useful heuristic emphasized in this exposition because such expansions point out where relevant 0s should be placed in the joint probability tables for logic functions involving any number of variables. It then becomes a straightforward exercise to rely on Boolean Algebra, Bayes's Theorem, and joint probability tables in extrapolating to Wolfram's cellular automata. Cellular automata are seen as purely deductive systems, just like classical logic, which probability theory is then able to generalize. Thus, any uncertainties which we might like to introduce into the discussion about cellular automata are handled with ease via the familiar inferential path. Most importantly, the difficult problem of predicting what cellular automata will do in the far future is treated like any inferential prediction problem.
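An elementary cellular automaton rule is itself an 8-row truth table over the (left, center, right) neighborhood, i.e. a three-variable Boolean function of exactly the kind whose DNF expansion the abstract discusses. A minimal sketch, using Rule 110 (the rule proved computation-universal) with periodic boundaries:

```python
# One step of an elementary cellular automaton: the rule number's binary
# digits ARE the 8-row truth table over (left, center, right).

def step(cells, rule=110):
    n = len(cells)
    out = []
    for i in range(n):
        # Periodic boundary: index -1 wraps to the last cell.
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        index = 4 * left + 2 * center + right   # row of the truth table, 0..7
        out.append((rule >> index) & 1)         # look up the rule's output bit
    return out

row = [0, 0, 0, 1, 0, 0, 0]   # a single live cell
row = step(row)
print(row)   # [0, 0, 1, 1, 0, 0, 0]
```

Since `index` enumerates all eight input combinations, placing 0s and 1s in the rule number is equivalent to writing down the DNF of the update function, which is what makes the probabilistic-table generalization in the abstract straightforward.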

  5. Contribution of stimulus attributes to errors in duration and distance judgments--a developmental study.

    PubMed

    Matsuda, F; Lan, W C; Tanimura, R

    1999-02-01

    In Matsuda's 1996 study, 4- to 11-yr.-old children (N = 133) watched two cars running on two parallel tracks on a CRT display and judged whether their durations and distances were equal and, if not, which was larger. In the present paper, the relative contributions of the four critical stimulus attributes (whether temporal starting points, temporal stopping points, spatial starting points, and spatial stopping points were the same or different between two cars) to the production of errors were quantitatively estimated based on the data for rates of errors obtained by Matsuda. The present analyses made it possible not only to understand numerically the findings about qualitative characteristics of the critical attributes described by Matsuda, but also to add more detailed findings about them.

  6. Malaria and global change: Insights, uncertainties and possible surprises

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, P.H.; Steel, A.

    Malaria may change with global change. Indeed, global change may affect malaria risk and malaria epidemiology. Malaria risk may change in response to a greenhouse warming; malaria epidemiology, in response to the social, economic, and political developments which a greenhouse warming may trigger. To date, malaria receptivity and epidemiology futures have been explored within the context of equilibrium studies. Equilibrium studies of climate change postulate an equilibrium present climate (the starting point) and a doubled-carbon dioxide climate (the end point), simulate conditions in both instances, and compare the two. What happens while climate changes, i.e., between the starting point and the end point, is ignored. The present paper focuses on malaria receptivity and addresses what equilibrium studies miss, namely transient malaria dynamics.

  7. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    ERIC Educational Resources Information Center

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  8. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40-mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool combines various hydraulic output parameters with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework that performs statistical analyses to assess project prioritization. The Logic Model will analyze potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest-priority locations within the river corridor for restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land-use limiting factors may constrain restoration options (Beechie et al., 2008). Applying natural resources management actions such as restoration prioritization is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but a scientific strategy that management should embrace and apply in its decision framework.

  9. Binary full adder, made of fusion gates, in a subexcitable Belousov-Zhabotinsky system

    NASA Astrophysics Data System (ADS)

    Adamatzky, Andrew

    2015-09-01

    In an excitable thin-layer Belousov-Zhabotinsky (BZ) medium a localized perturbation leads to the formation of omnidirectional target or spiral waves of excitation. A subexcitable BZ medium responds to asymmetric local perturbation by producing traveling localized excitation wave-fragments, distant relatives of dissipative solitons. The size and life span of an excitation wave-fragment depend on the illumination level of the medium. Under the right conditions the wave-fragments conserve their shape and velocity vectors for extended time periods. I interpret the wave-fragments as values of Boolean variables. When two or more wave-fragments collide, they annihilate or merge into a new wave-fragment; the states of the logic variables represented by the wave-fragments are changed as a result of the collision, and a logical gate is thereby implemented. Several theoretical designs and experimental laboratory implementations of Boolean logic gates have been proposed in the past, but little has been done to cascade the gates into binary arithmetical circuits. I propose a unique design of a binary one-bit full adder based on a fusion gate. A fusion gate is a two-input, three-output logical device which calculates the conjunction of the input variables and the conjunction of one input variable with the negation of the other. The gate is made of three channels: two channels cross each other at an angle, and a third channel starts at the junction. The channels contain a BZ medium. When two excitation wave-fragments traveling towards each other along the input channels collide at the junction, they merge into a single wave-front traveling along the third channel. If there is just one wave-front in an input channel, the front continues its propagation undisturbed. I make a one-bit full adder by cascading two fusion gates and show how to cascade the adder blocks into a many-bit full adder. I evaluate the feasibility of my designs by simulating the evolution of excitation in the gates and adders using numerical integration of the Oregonator equations.
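    The Boolean behaviour of the fusion gate and the cascaded one-bit adder can be checked abstractly; the sketch below models only the truth-table logic, not the BZ-medium geometry or wave dynamics described in the record.

```python
def fusion_gate(x, y):
    """Two-input, three-output fusion gate: a lone front passes through
    (x AND NOT y, or NOT x AND y), while two colliding fronts merge into
    the junction channel (x AND y)."""
    return (x and not y, x and y, (not x) and y)

def full_adder(a, b, cin):
    """One-bit full adder built from two cascaded fusion gates."""
    # First gate: pass-through outputs give a XOR b, merge output gives a AND b.
    a_not_b, a_and_b, not_a_b = fusion_gate(a, b)
    a_xor_b = a_not_b or not_a_b
    # Second gate combines the partial sum with the carry-in.
    s_not_c, s_and_c, not_s_c = fusion_gate(a_xor_b, cin)
    total = s_not_c or not_s_c            # sum = a XOR b XOR cin
    carry = a_and_b or s_and_c            # carry-out = ab OR (a XOR b)cin
    return int(total), int(carry)
```

    An exhaustive check over the eight input combinations confirms the standard full-adder truth table.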

  10. Effect of starting point formation on the crystallization of amorphous silicon films by flash lamp annealing

    NASA Astrophysics Data System (ADS)

    Sato, Daiki; Ohdaira, Keisuke

    2018-04-01

    We succeed in the crystallization of hydrogenated amorphous silicon (a-Si:H) films by flash lamp annealing (FLA) at a low fluence by intentionally creating starting points that trigger explosive crystallization (EC). We confirm that a partly thick a-Si part can induce the crystallization of a-Si films. A periodic wavy structure is observed on the surface of polycrystalline silicon (poly-Si) on and near the thick parts, which is a clear indication of the emergence of EC. Creating partly thick a-Si parts can thus be effective for controlling the starting point of crystallization by FLA and can realize the crystallization of a-Si with high reproducibility. We also compare the effects of creating thick parts at the center and along the edge of the substrates, and find that a thick part along the edge leads to the initiation of crystallization at a lower fluence.

  11. Nearby Search Indekos Based Android Using A Star (A*) Algorithm

    NASA Astrophysics Data System (ADS)

    Siregar, B.; Nababan, EB; Rumahorbo, JA; Andayani, U.; Fahmi, F.

    2018-03-01

    Indekos, or rented rooms, are temporary residences occupied for months or years. Academicians who come from out of town need such temporary housing during their education, teaching, or duties, and they often have difficulty finding an Indekos because of a lack of information about them. In addition, newcomers do not know the areas around the campus and want the shortest path from an Indekos to the campus. This problem can be solved by implementing the A Star (A*) algorithm, a shortest-path algorithm used here to find the shortest route between the campus and an Indekos, with the campus faculties serving as the starting points of the search. Letting users set the starting point allows students to control where the search for an Indekos begins. The mobile-based application facilitates searching anytime and anywhere. Based on the experimental results, the A* algorithm can find the shortest path with 86.67% accuracy.
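    A minimal sketch of the A* search this record describes, on a hypothetical 4-connected grid with a Manhattan-distance heuristic; the paper's road-network data and cost model are not reproduced here.

```python
import heapq

def a_star(grid, start, goal):
    """A* over a grid of 0 (free) / 1 (blocked) cells with unit step cost.
    Returns the length of a shortest path, or None if the goal is unreachable."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible heuristic
    frontier = [(h(start), 0, start)]          # (f = g + h, g, position)
    best = {start: 0}                          # cheapest known cost to each cell
    while frontier:
        _, g, cur = heapq.heappop(frontier)
        if cur == goal:
            return g
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = cur[0] + dr, cur[1] + dc
            if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == 0:
                if g + 1 < best.get((r, c), float("inf")):
                    best[(r, c)] = g + 1
                    heapq.heappush(frontier, (g + 1 + h((r, c)), g + 1, (r, c)))
    return None
```

    Because the Manhattan heuristic never overestimates the remaining distance on a 4-connected grid, the first time the goal is popped its cost is optimal.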

  12. Pass-transistor very large scale integration

    NASA Technical Reports Server (NTRS)

    Maki, Gary K. (Inventor); Bhatia, Prakash R. (Inventor)

    2004-01-01

    Logic elements are provided that permit reductions in layout size and avoidance of hazards. Such logic elements may be included in libraries of logic cells. A logical function to be implemented by the logic element is decomposed about logical variables to identify factors corresponding to combinations of the logical variables and their complements. A pass transistor network is provided for implementing the pass network function in accordance with this decomposition. The pass transistor network includes ordered arrangements of pass transistors that correspond to the combinations of variables and complements resulting from the logical decomposition. The logic elements may act as selection circuits and be integrated with memory and buffer elements.
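    The decomposition about logical variables described in this record is essentially Shannon expansion. A small illustrative sketch, with hypothetical function names rather than the patented cell library:

```python
def cofactors(f, i):
    """Shannon cofactors of f about input i: f with x_i fixed to 1 and to 0.
    Each cofactor takes one argument fewer than f."""
    pos = lambda *xs: f(*xs[:i], 1, *xs[i:])
    neg = lambda *xs: f(*xs[:i], 0, *xs[i:])
    return pos, neg

def recompose(pos, neg, i):
    """Rebuild f from its cofactors: f = x_i*pos + (NOT x_i)*neg, mirroring
    how a pass-transistor pair passes exactly one branch per variable value."""
    return lambda *xs: pos(*xs[:i], *xs[i + 1:]) if xs[i] else neg(*xs[:i], *xs[i + 1:])

# Example: decompose a 3-input majority function about its middle variable.
maj = lambda a, b, c: int(a + b + c >= 2)
pos, neg = cofactors(maj, 1)
rebuilt = recompose(pos, neg, 1)
```

    Applying the decomposition recursively over every variable yields the ordered combinations of variables and complements that the pass-transistor network implements.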

  13. The Components of Smile Design: New York University Smile Evaluation Form Revisited, Update 2015.

    PubMed

    Calamia, John R; Wolff, Mark S

    2015-07-01

    This article updates a simple checklist of foundational knowledge in aesthetic dental concepts that allows clinicians to organize their thoughts, to record the concerns of the patient, and to map out those improvements that must be addressed. This adjunct is called a Smile Evaluation Form. Along with other adjuncts such as radiographs, study casts, and diagnostic wax-ups, the Smile Evaluation Form allows clinicians to form a conceptual visualization of the expected end point. It provides a checklist for discussions with other disciplines in the team, to provide a logical sequence of treatment with a mutually agreed-on end point.

  14. The computational core and fixed point organization in Boolean networks

    NASA Astrophysics Data System (ADS)

    Correale, L.; Leone, M.; Pagnani, A.; Weigt, M.; Zecchina, R.

    2006-03-01

    In this paper, we analyse large random Boolean networks in terms of a constraint satisfaction problem. We first develop an algorithmic scheme which allows us to prune simple logical cascades and underdetermined variables, returning thereby the computational core of the network. Second, we apply the cavity method to analyse the number and organization of fixed points. We find in particular a phase transition between an easy and a complex regulatory phase, the latter being characterized by the existence of an exponential number of macroscopically separated fixed point clusters. The different techniques developed are reinterpreted as algorithms for the analysis of single Boolean networks, and they are applied in the analysis of, and in silico experiments on, the gene regulatory networks of baker's yeast (Saccharomyces cerevisiae) and the segment-polarity genes of the fruit fly Drosophila melanogaster.
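    For very small networks, the fixed points that the paper treats with the cavity method can simply be enumerated. A toy sketch, using a hypothetical 3-node network rather than the yeast or Drosophila models:

```python
from itertools import product

def fixed_points(update_fns):
    """Exhaustively enumerate fixed points of a small Boolean network:
    states s with s[i] == f_i(s) for every node i. Exponential in the
    number of nodes, so feasible only for toy networks."""
    n = len(update_fns)
    return [s for s in product([0, 1], repeat=n)
            if all(f(s) == s[i] for i, f in enumerate(update_fns))]

# Hypothetical 3-node network: node 0 copies node 2, node 1 is the AND
# of nodes 0 and 2, node 2 copies node 0.
net = [lambda s: s[2], lambda s: s[0] & s[2], lambda s: s[0]]
```

    The cavity method replaces this brute-force count with a message-passing estimate that scales to large random networks.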

  15. Reciprocity relations in aerodynamics

    NASA Technical Reports Server (NTRS)

    Heaslet, Max A; Spreiter, John R

    1953-01-01

    Reverse flow theorems in aerodynamics are shown to be based on the same general concepts involved in many reciprocity theorems in the physical sciences. Reciprocal theorems for both steady and unsteady motion are found as a logical consequence of this approach. No restrictions on wing plan form or flight Mach number are made beyond those required in linearized compressible-flow analysis. A number of examples are listed, including general integral theorems for lifting, rolling, and pitching wings and for wings in nonuniform downwash fields. Correspondence is also established between the buildup of circulation with time of a wing starting impulsively from rest and the buildup of lift of the same wing moving in the reverse direction into a sharp-edged gust.

  16. Magnetization Ratchet in Cylindrical Nanowires.

    PubMed

    Bran, Cristina; Berganza, Eider; Fernandez-Roldan, Jose A; Palmero, Ester M; Meier, Jessica; Calle, Esther; Jaafar, Miriam; Foerster, Michael; Aballe, Lucia; Fraile Rodriguez, Arantxa; P Del Real, Rafael; Asenjo, Agustina; Chubykalo-Fesenko, Oksana; Vazquez, Manuel

    2018-05-31

    The unidirectional motion of information carriers such as domain walls in magnetic nanostrips is a key feature for many future spintronic applications based on shift registers. This magnetic ratchet effect has so far been achieved in a limited number of complex nanomagnetic structures, for example, by lithographically engineered pinning sites. Here we report on a simple remagnetization ratchet originated in the asymmetric potential from the designed increasing lengths of magnetostatically coupled ferromagnetic segments in FeCo/Cu cylindrical nanowires. The magnetization reversal in neighboring segments propagates sequentially in steps starting from the shorter segments, irrespective of the applied field direction. This natural and efficient ratchet offers alternatives for the design of three-dimensional advanced storage and logic devices.

  17. Mammalian synthetic biology: emerging medical applications

    PubMed Central

    Kis, Zoltán; Pereira, Hugo Sant'Ana; Homma, Takayuki; Pedrigi, Ryan M.; Krams, Rob

    2015-01-01

    In this review, we discuss new emerging medical applications of the rapidly evolving field of mammalian synthetic biology. We start with simple mammalian synthetic biological components and move towards more complex and therapy-oriented gene circuits. A comprehensive list of ON–OFF switches, categorized into transcriptional, post-transcriptional, translational and post-translational, is presented in the first sections. Subsequently, Boolean logic gates, synthetic mammalian oscillators and toggle switches will be described. Several synthetic gene networks are further reviewed in the medical applications section, including cancer therapy gene circuits, immuno-regulatory networks, among others. The final sections focus on the applicability of synthetic gene networks to drug discovery, drug delivery, receptor-activating gene circuits and mammalian biomanufacturing processes. PMID:25808341

  18. Implementation of an optimum profile guidance system on STOLAND

    NASA Technical Reports Server (NTRS)

    Flanagan, P. F.

    1978-01-01

    The implementation on the STOLAND airborne digital computer of an optimum profile guidance system for the augmentor wing jet STOL research aircraft is described. The major tasks were to implement the guidance and control logic in airborne computer software and to integrate the module with the existing STOLAND navigation, display, and autopilot routines. The optimum profile guidance system comprises an algorithm for synthesizing minimum-fuel trajectories from a wide range of starting positions in the terminal area and a control law for flying the aircraft automatically along the trajectory. The avionics software developed is described, along with a FORTRAN program constructed to reflect the modular nature of, and algorithms implemented in, the avionics software.

  19. Making Heredity Matter: Samuel Butler's Idea of Unconscious Memory.

    PubMed

    Turbil, Cristiano

    2018-03-01

    Butler's idea of evolution was developed over the publication of four books, several articles and essays between 1863 and 1890. These publications, although never achieving the success expected by Butler, proposed a psychological elaboration of evolution (robustly enforced by Lamarck's philosophy), called 'unconscious memory'. This was strongly in contrast with the materialistic approach suggested by Darwin's natural selection. Starting with a historical introduction, this paper aspires to ascertain the logic, meaning and significance of Butler's idea of 'unconscious memory' in the post-Darwinian physiological and psychological Pan-European discussion. Particular attention is devoted to demonstrating that Butler was not only a populariser of science but also an active protagonist in the late Victorian psychological debate.

  20. People Like Logical Truth: Testing the Intuitive Detection of Logical Value in Basic Propositions.

    PubMed

    Nakamura, Hiroko; Kawaguchi, Jun

    2016-01-01

    Recent studies on logical reasoning have suggested that people are intuitively aware of the logical validity of syllogisms or that they intuitively detect conflict between heuristic responses and logical norms via slight changes in their feelings. According to logical intuition studies, logically valid or heuristic logic no-conflict reasoning is fluently processed and induces positive feelings without conscious awareness. One criticism states that such effects of logicality disappear when confounding factors such as the content of syllogisms are controlled. The present study used abstract propositions and tested whether people intuitively detect logical value. Experiment 1 presented four logical propositions (conjunctive, biconditional, conditional, and material implications) regarding a target case and asked the participants to rate the extent to which they liked the statement. Experiment 2 tested the effects of matching bias, as well as intuitive logic, on the reasoners' feelings by manipulating whether the antecedent or consequent (or both) of the conditional was affirmed or negated. The results showed that both logicality and matching bias affected the reasoners' feelings, and people preferred logically true targets over logically false ones for all forms of propositions. These results suggest that people intuitively detect what is true from what is false during abstract reasoning. Additionally, a Bayesian mixed model meta-analysis of conditionals indicated that people's intuitive interpretation of the conditional "if p then q" fits better with the conditional probability, q given p.
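    The closing claim, that readers' interpretation of "if p then q" fits the conditional probability of q given p better than material implication, can be illustrated numerically. The joint distribution below is hypothetical, not the study's data:

```python
def p_conditional(joint):
    """P(q | p) from a joint distribution over (p, q) value pairs."""
    p_true = joint[(1, 1)] + joint[(1, 0)]
    return joint[(1, 1)] / p_true

def p_material(joint):
    """P(p -> q) under material implication: true except when p and not q."""
    return 1.0 - joint[(1, 0)]

# Hypothetical joint distribution over (p, q).
joint = {(1, 1): 0.2, (1, 0): 0.1, (0, 1): 0.3, (0, 0): 0.4}
```

    Here P(q | p) is 2/3 while P(p -> q) is 0.9: the material reading is inflated by the false-antecedent rows, which is exactly the divergence the conditional-probability interpretation avoids.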
