Sample records for distributed description logics

  1. Modular Knowledge Representation and Reasoning in the Semantic Web

    NASA Astrophysics Data System (ADS)

    Serafini, Luciano; Homola, Martin

    Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desirable to perform reasoning by combining multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics, and Integrated Distributed Description Logics. We concentrate on the expressivity and distinctive modeling features of each framework, and also discuss the reasoning capabilities of each framework.
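    The core DDL idea surveyed in the chapter, local reasoning per module plus bridge rules that carry subsumptions across modules, can be sketched in a few lines of Python. Everything below (module contents, concept names, the collapsed onto/into distinction) is an invented toy rather than the formalism's actual machinery:

```python
# Toy sketch in the spirit of Distributed Description Logics (DDL):
# each module holds its own subclass axioms, and "bridge rules"
# propagate subsumptions between modules.  Names are hypothetical.

from itertools import product

def closure(edges):
    """Transitive closure of subclass pairs (A, B), meaning A is subsumed by B."""
    closed = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, b), (c, d) in product(list(closed), repeat=2):
            if b == c and (a, d) not in closed:
                closed.add((a, d))
                changed = True
    return closed

# Local module ontologies (subclass axioms).
vehicles = {("Car", "Vehicle"), ("Vehicle", "Artifact")}
transport = {("Bus", "PublicTransport")}

# Bridge rules: source-module concept corresponds to target-module concept
# (the onto/into distinction is omitted for brevity).
bridge = {"Car": "Bus", "Artifact": "PublicTransport"}

# Propagation: if the source module entails A subsumed-by B and both ends
# are bridged, the target module may infer bridge(A) subsumed-by bridge(B).
propagated = {
    (bridge[a], bridge[b])
    for (a, b) in closure(vehicles)
    if a in bridge and b in bridge
}
combined = closure(transport | propagated)
print(("Bus", "PublicTransport") in combined)  # True
```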

  2. An interval logic for higher-level temporal reasoning

    NASA Technical Reports Server (NTRS)

    Schwartz, R. L.; Melliar-Smith, P. M.; Vogt, F. H.; Plaisted, D. A.

    1983-01-01

    Prior work explored temporal logics, based on classical modal logics, as a framework for specifying and reasoning about concurrent programs, distributed systems, and communications protocols, and reported on efforts using temporal reasoning primitives to express very high level abstract requirements that a program or system is to satisfy. Based on experience with those primitives, this report describes an Interval Logic that is more suitable for expressing such higher level temporal properties. The report provides a formal semantics for the Interval Logic, and several examples of its use. A description of decision procedures for the logic is also included.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Raedt, Hans; Katsnelson, Mikhail I.; Donker, Hylke C.

    It is shown that the Pauli equation and the concept of spin naturally emerge from logical inference applied to experiments on a charged particle under the conditions that (i) space is homogeneous, (ii) the observed events are logically independent, and (iii) the observed frequency distributions are robust with respect to small changes in the conditions under which the experiment is carried out. The derivation does not take recourse to concepts of quantum theory and is based on the same principles which have already been shown to lead to, e.g., the Schrödinger equation and the probability distributions of pairs of particles in the singlet or triplet state. Application to Stern–Gerlach experiments with chargeless, magnetic particles provides additional support for the thesis that quantum theory follows from logical inference applied to a well-defined class of experiments. Highlights: • The Pauli equation is obtained through logical inference applied to robust experiments on a charged particle. • The concept of spin appears as an inference resulting from the treatment of two-valued data. • The same reasoning yields the quantum theoretical description of neutral magnetic particles. • Logical inference provides a framework to establish a bridge between objective knowledge gathered through experiments and their description in terms of concepts.

  4. Quantum theory as plausible reasoning applied to data obtained by robust experiments.

    PubMed

    De Raedt, H; Katsnelson, M I; Michielsen, K

    2016-05-28

    We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are independent and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out yields, without introducing any concept of quantum theory, the quantum theoretical description in terms of the Schrödinger or the Pauli equation, the Stern-Gerlach or Einstein-Podolsky-Rosen-Bohm experiments. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is common sense, applied to reproducible and robust experimental data. © 2016 The Author(s).

  5. Paraconsistent Reasoning for OWL 2

    NASA Astrophysics Data System (ADS)

    Ma, Yue; Hitzler, Pascal

    A four-valued description logic has been proposed for reasoning with inconsistent description-logic knowledge bases. This approach has the distinct advantage that it can be implemented by invoking classical reasoners, keeping the same complexity as under the classical semantics. However, it has so far only been studied for the basic description logic 𝒜ℒ𝒞. In this paper, we further study how to extend the four-valued semantics to the more expressive description logic 𝒮ℛ𝒪ℐ𝒬, which underlies the forthcoming revision of the Web Ontology Language, OWL 2, and also investigate how it fares when adapted to tractable description logics including ℰℒ++, DL-Lite, and Horn-DLs. We define the four-valued semantics along the same lines as for 𝒜ℒ𝒞 and show that we can retain most of the desired properties.
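    The four-valued semantics underlying this line of work can be illustrated with a propositional toy: each statement carries separate evidence for and against, so a contradiction about one atom does not trivialize the rest of the knowledge base. This is a loose Belnap-style sketch, not the paper's DL semantics:

```python
# Toy four-valued ("Belnap") interpretation: each atom maps to a pair
# (evidence_for, evidence_against), giving values t=(1,0), f=(0,1),
# both=(1,1), and unknown=(0,0).  Illustrative only.

def neg(v):
    pro, con = v
    return (con, pro)          # negation swaps evidence for and against

def conj(v, w):
    # evidence for a conjunction needs both conjuncts;
    # evidence against needs either one
    return (v[0] and w[0], v[1] or w[1])

def holds(v):
    """An assertion holds if there is evidence for it (possibly also
    against it); this is what keeps reasoning from exploding."""
    return v[0] == 1

t, f, both, unknown = (1, 0), (0, 1), (1, 1), (0, 0)

I = {"Bird(tweety)": both, "Flies(tweety)": t}
# The KB is classically inconsistent about Bird(tweety), yet an
# unrelated fact is still derivable and its negation is not:
print(holds(I["Flies(tweety)"]))        # True
print(holds(neg(I["Flies(tweety)"])))   # False
```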

  6. Scalable and expressive medical terminologies.

    PubMed

    Mays, E; Weida, R; Dionne, R; Laker, M; White, B; Liang, C; Oles, F J

    1996-01-01

    The K-Rep system, based on description logic, is used to represent and reason with large and expressive controlled medical terminologies. Expressive concept descriptions incorporate semantically precise definitions composed using logical operators, together with important non-semantic information such as synonyms and codes. Examples are drawn from our experience with K-Rep in modeling the InterMed laboratory terminology and also developing a large clinical terminology now in production use at Kaiser-Permanente. System-level scalability of performance is achieved through an object-oriented database system which efficiently maps persistent memory to virtual memory. Equally important is conceptual scalability: the ability to support collaborative development, organization, and visualization of a substantial terminology as it evolves over time. K-Rep addresses this need by logically completing concept definitions and automatically classifying concepts in a taxonomy via subsumption inferences. The K-Rep system includes a general-purpose GUI environment for terminology development and browsing, a custom interface for formulary term maintenance, a C++ application program interface, and a distributed client-server mode which provides lightweight clients with efficient run-time access to K-Rep by means of a scripting language.
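    The classification step described above (logically completing definitions, then placing concepts by subsumption) can be miniaturized as follows; the concepts and properties are invented, and real K-Rep definitions are far richer than property sets:

```python
# Minimal sketch of taxonomy classification by subsumption: concepts
# are defined by property sets, C subsumes D whenever C's defining
# properties are a subset of D's, and each concept is placed under
# its most specific subsumers.  The terminology is invented.

defs = {
    "Drug":       {"substance"},
    "Antibiotic": {"substance", "antibacterial"},
    "Penicillin": {"substance", "antibacterial", "beta-lactam"},
}

def subsumes(c, d):
    """C subsumes D iff C's defining properties are a subset of D's."""
    return defs[c] <= defs[d] and c != d

def parents(d):
    """Most specific subsumers of d (its direct parents in the taxonomy)."""
    subs = [c for c in defs if subsumes(c, d)]
    return {c for c in subs
            if not any(subsumes(c, s) for s in subs)}

print(parents("Penicillin"))  # {'Antibiotic'}
```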

  7. HWDA: A coherence recognition and resolution algorithm for hybrid web data aggregation

    NASA Astrophysics Data System (ADS)

    Guo, Shuhang; Wang, Jian; Wang, Tong

    2017-09-01

    Aiming at the object conflict recognition and resolution problem for hybrid distributed data stream aggregation, a distributed data stream object coherence solution technology is proposed. First, a framework for object coherence conflict recognition and resolution, named HWDA, is defined. Second, an object coherence recognition technique is proposed based on a formal-language description logic and the hierarchical dependency relationships between logic rules. Third, a conflict traversal recognition algorithm is proposed based on the defined dependency graph. Next, a conflict resolution technique based on resolution pattern matching is presented, including the definitions of the three types of conflict, the conflict resolution matching patterns, and the arbitration resolution method. Finally, experiments on two kinds of web test data sets validate the effectiveness of the conflict recognition and resolution technology of HWDA.
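    The recognise-then-resolve flow can be pictured with a toy aggregation example; HWDA's actual recognition uses description-logic rules over a dependency graph, so the majority-vote pattern below is only a stand-in for one resolution pattern:

```python
# Toy conflict recognition and resolution for aggregated web records:
# detect attributes on which sources disagree about the same object,
# then resolve each by a simple majority-vote pattern.  Hypothetical
# records; the real algorithm is rule- and graph-based.

from collections import Counter

sources = [  # records from different sources for the same object
    {"id": "p1", "price": 10, "stock": "yes"},
    {"id": "p1", "price": 12, "stock": "yes"},
    {"id": "p1", "price": 10, "stock": "no"},
]

def conflicts(records):
    """Attributes on which the sources disagree."""
    keys = set().union(*records) - {"id"}
    return {k for k in keys if len({r[k] for r in records}) > 1}

def resolve(records, attr):
    """Majority-vote resolution for one conflicting attribute."""
    return Counter(r[attr] for r in records).most_common(1)[0][0]

found = conflicts(sources)
print(sorted(found))  # ['price', 'stock']
print({a: resolve(sources, a) for a in sorted(found)})
```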

  8. Users manual for Streamtube Curvature Analysis: Analytical method for predicting the pressure distribution about a nacelle at transonic speeds, volume 1

    NASA Technical Reports Server (NTRS)

    Keith, J. S.; Ferguson, D. R.; Heck, P. H.

    1972-01-01

    The computer program, Streamtube Curvature Analysis, is described for the engineering user and for the programmer. The user-oriented documentation includes a description of the governing mathematical equations, their use in the solution, and the method of solution. The general logical flow of the program is outlined, and detailed instructions for program usage and operation are explained. General procedures for program use and the program's capabilities and limitations are described. For the programmer, the overlay structure of the program is described, the various storage tables are defined, and their uses are explained. The input and output are discussed in detail. The program listing includes numerous comments so that the logical flow within the program is easily followed. A test case showing input data and output format is included, as well as an error printout description.

  9. Digital Troposcatter Performance Model

    DTIC Science & Technology

    1983-12-01

    Distribution Statement A: Approved for public release; distribution unlimited. Communications, Control and Information Systems ... A model for digital troposcatter communication system design is described. Propagation and modem performance are modeled, including path loss and RSL ... for designing digital troposcatter systems. A User's Manual Report discusses the use of the computer program TROPO. The description of the structure and logical ...

  10. A retarding ion mass spectrometer for the Dynamics Explorer-1

    NASA Technical Reports Server (NTRS)

    Wright, W.

    1985-01-01

    The Retarding Ion Mass Spectrometer (RIMS) for Dynamics Explorer-1 is an instrument designed to measure the details of the thermal plasma distribution. It combines the ion temperature determining capability of the retarding potential analyzer with the compositional capabilities of the mass spectrometer and adds multiple sensor heads to sample all directions relative to the spacecraft ram direction. This manual provides a functional description of the RIMS, the instrument calibration, and a description of the commands which can be stored in the instrument logic to control its operation.

  11. Alternating-Offers Protocol for Multi-issue Bilateral Negotiation in Semantic-Enabled Marketplaces

    NASA Astrophysics Data System (ADS)

    Ragone, Azzurra; di Noia, Tommaso; di Sciascio, Eugenio; Donini, Francesco M.

    We present a semantic-based approach to multi-issue bilateral negotiation for e-commerce. We use Description Logics to model advertisements, and relations among issues as axioms in a TBox. We then introduce a logic-based alternating-offers protocol, able to handle conflicting information, that merges non-standard reasoning services in Description Logics with utility theory to find the most suitable agreements. We illustrate and motivate the theoretical framework, the logical language, and the negotiation protocol.
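    The shape of an alternating-offers protocol (without the DL layer) can be sketched as a simple loop in which agents take turns proposing and an agent accepts once the standing offer is at least as good for it as its own next proposal; the agreements and utilities below are invented numbers:

```python
# Minimal alternating-offers loop.  Each turn, the proposer either
# accepts the standing offer (if it is at least as good as its own
# best remaining proposal) or puts its best remaining agreement on
# the table.  Agreements and utilities are illustrative only.

agreements = ["A", "B", "C", "D"]
util = {
    "buyer":  {"A": 4, "B": 3, "C": 2, "D": 1},
    "seller": {"A": 1, "B": 2, "C": 3, "D": 4},
}

def negotiate():
    turn, other = "buyer", "seller"
    remaining = list(agreements)
    offer = None
    while remaining:
        # best agreement still on the table for the current proposer
        proposal = max(remaining, key=lambda a: util[turn][a])
        if offer is not None and util[turn][offer] >= util[turn][proposal]:
            return offer                 # accept the standing offer
        remaining.remove(proposal)
        offer = proposal                 # counter-offer
        turn, other = other, turn
    return offer

print(negotiate())  # 'C'
```

    With these opposed preference orders the loop converges on a middle agreement, as one would expect from alternating concessions.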

  12. Formally specifying the logic of an automatic guidance controller

    NASA Technical Reports Server (NTRS)

    Guaspari, David

    1990-01-01

    The following topics are covered in viewgraph form: (1) the Penelope Project; (2) the logic of an experimental automatic guidance control system for a 737; (3) Larch/Ada specification; (4) some failures of informal description; (5) description of mode changes caused by switches; (6) intuitive description of window status (chosen vs. current); (7) design of the code; (8) and specifying the code.

  13. Interpretation of IEEE-854 floating-point standard and definition in the HOL system

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.

    1995-01-01

    The ANSI/IEEE Standard 854-1987 for floating-point arithmetic is interpreted by converting the lexical descriptions in the standard into mathematical conditional descriptions organized in tables. The standard is represented in higher-order logic within the framework of the HOL (Higher Order Logic) system. The paper is divided into two parts: the first part presents the interpretation, and the second part the description in HOL.

  14. Eco-logical successes : January 2011

    DOT National Transportation Integrated Search

    2011-01-01

    This document identifies and explains each Eco-Logical signatory agency's strategic environmental programs, projects, and efforts that are either directly related to or share the vision set forth in Eco-Logical. A brief description of an agency's key...

  15. Minimally inconsistent reasoning in Semantic Web.

    PubMed

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. To this end, different paraconsistent approaches, with their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed for reasoning with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. This paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, in which the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Several desirable properties are studied, showing that the new semantics inherits the advantages of both non-monotonic reasoning and paraconsistent reasoning. A sound and complete tableau-based algorithm, called multi-valued tableaux, is developed to capture minimally inconsistent reasoning. The tableaux algorithm is designed as a framework for multi-valued DL, allowing for different underlying paraconsistent semantics that differ only in their clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning.
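    The idea of minimizing tolerated inconsistency can be demonstrated by brute force on a tiny propositional knowledge base: interpretations may mark an atom B ("both true and false"), and only interpretations with a minimal set of B-atoms are used for entailment. This propositional sketch stands in for the DL setting; all names are illustrative:

```python
# Brute-force sketch of "minimally inconsistent" entailment.  A clause
# is satisfied if some literal has evidence for it (value t or B for a
# positive literal, f or B for a negative one); preferred models are
# those with the fewest B-valued atoms.

from itertools import product

atoms = ["p", "q"]
kb = [[("p", True)], [("p", False)], [("q", True)]]  # p, not-p, q

def ev_for(value, positive):
    return value in ({"t", "B"} if positive else {"f", "B"})

def models(kb):
    for vals in product("tfB", repeat=len(atoms)):
        i = dict(zip(atoms, vals))
        if all(any(ev_for(i[a], pos) for a, pos in cl) for cl in kb):
            yield i

ms = list(models(kb))
min_b = min(sum(v == "B" for v in m.values()) for m in ms)
preferred = [m for m in ms if sum(v == "B" for v in m.values()) == min_b]

# q follows (every preferred model supports it); not-q does not:
print(all(ev_for(m["q"], True) for m in preferred))    # True
print(all(ev_for(m["q"], False) for m in preferred))   # False
```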

  16. Minimally inconsistent reasoning in Semantic Web

    PubMed Central

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. To this end, different paraconsistent approaches, with their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed for reasoning with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. This paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, in which the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Several desirable properties are studied, showing that the new semantics inherits the advantages of both non-monotonic reasoning and paraconsistent reasoning. A sound and complete tableau-based algorithm, called multi-valued tableaux, is developed to capture minimally inconsistent reasoning. The tableaux algorithm is designed as a framework for multi-valued DL, allowing for different underlying paraconsistent semantics that differ only in their clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning. PMID:28750030

  17. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    NASA Technical Reports Server (NTRS)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  18. Weighted Description Logics Preference Formulas for Multiattribute Negotiation

    NASA Astrophysics Data System (ADS)

    Ragone, Azzurra; di Noia, Tommaso; Donini, Francesco M.; di Sciascio, Eugenio; Wellman, Michael P.

    We propose a framework to compute the utility of an agreement w.r.t. a preference set in a negotiation process. In particular, we refer to preferences expressed as weighted formulas in a decidable fragment of First-order Logic and to agreements expressed as a formula. We ground our framework in Description Logics (DL) endowed with disjunction, to be compliant with Semantic Web technologies. A logic-based approach to preference representation makes it possible, when a background knowledge base is exploited, to relax the often unrealistic assumption of additive independence among attributes. We provide suitable definitions of the problem and present algorithms to compute utility in our setting. We also validate our approach through an experimental evaluation.
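    The weighted-formula utility computation reduces, in the propositional limit, to summing the weights of satisfied preferences and normalising; the predicates below replace the paper's DL formulas and reasoner, and all attribute names and weights are invented:

```python
# Sketch of weighted-preference utility: preferences are (formula,
# weight) pairs, an agreement is a set of facts, and utility is the
# normalised sum of the weights of the satisfied preferences.
# Python predicates stand in for DL formulas checked by a reasoner.

preferences = [
    (lambda a: "leather_seats" in a, 0.5),
    (lambda a: "warranty" in a and a["warranty"] >= 2, 0.3),
    (lambda a: a.get("color") == "red", 0.2),
]

def utility(agreement):
    total = sum(w for _, w in preferences)
    satisfied = sum(w for phi, w in preferences if phi(agreement))
    return satisfied / total

offer = {"leather_seats": True, "warranty": 3, "color": "black"}
print(utility(offer))  # two of three preferences satisfied -> 0.8
```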

  19. Semantic message oriented middleware for publish/subscribe networks

    NASA Astrophysics Data System (ADS)

    Li, Han; Jiang, Guofei

    2004-09-01

    The publish/subscribe paradigm of Message Oriented Middleware provides a loosely coupled communication model between distributed applications. Traditional publish/subscribe middleware uses keywords to match advertisements and subscriptions and does not support deep semantic matching. To this end, we designed and implemented a Semantic Message Oriented Middleware system to provide such capabilities for semantic description and matching. We adopted the DARPA Agent Markup Language and Ontology Inference Layer (DAML+OIL), a formal knowledge representation language for expressing sophisticated classifications and enabling automated inference, as the topic description language in our middleware system. A simple description logic inference system was implemented to handle the matching process between the subscriptions of subscribers and the advertisements of publishers. Moreover, our middleware system also has a security architecture to support secure communication and user privilege control.

  20. HYTESS 2: A Hypothetical Turbofan Engine Simplified Simulation with multivariable control and sensor analytical redundancy

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.

    1986-01-01

    A hypothetical turbofan engine simplified simulation with multivariable control and sensor failure detection, isolation, and accommodation logic (HYTESS II) is presented. The digital program, written in FORTRAN, is self-contained, efficient, realistic, and easily used. Simulated engine dynamics were developed from linearized operating-point models; however, essential nonlinear effects are retained. The simulation is representative of a hypothetical, low-bypass-ratio turbofan engine with advanced control and failure detection logic. Included is a description of the engine dynamics, the control algorithm, and the sensor failure detection logic. Details of the simulation, including block diagrams, variable descriptions, common block definitions, subroutine descriptions, and input requirements, are given. Example simulation results are also presented.

  1. Describing and recognizing patterns of events in smart environments with description logic.

    PubMed

    Scalmato, Antonello; Sgorbissa, Antonio; Zaccaria, Renato

    2013-12-01

    This paper describes a system for context awareness in smart environments, based on an ontology expressed in description logic and implemented in OWL 2 EL, a subset of the Web Ontology Language that allows reasoning in polynomial time. The approach differs from other works in the literature in that the proposed system requires only the basic reasoning mechanisms of description logic, i.e., subsumption and instance checking, without any additional external reasoning engine. Experiments performed with data collected in three different scenarios are described, i.e., the CASAS Project at Washington State University, the assisted living facility Villa Basilea in Genoa, and the Merry Porter mobile robot at the Polyclinic of Modena.
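    The two inferences the system relies on, subsumption and instance checking, can be mimicked over hand-written feature sets (the real system encodes concepts in OWL 2 EL; the concepts and sensor features below are invented):

```python
# Toy subsumption and instance checking for smart-environment events:
# a concept is the set of sensor features it requires, C subsumes D
# when C's requirements are a subset of D's, and an event instance
# belongs to a concept when it exhibits all required features.

concepts = {  # concept name -> required sensor features (invented)
    "Event":      set(),
    "NightEvent": {"night"},
    "NightEntry": {"night", "door_open", "motion_inside"},
}

def subsumes(c, d):
    """C subsumes D iff C's requirements are a subset of D's."""
    return concepts[c] <= concepts[d]

def instance_of(event, c):
    """Instance checking: the event exhibits all of c's features."""
    return concepts[c] <= event["features"]

e = {"id": "e42", "features": {"night", "door_open", "motion_inside"}}
print(subsumes("NightEvent", "NightEntry"))  # True
print(instance_of(e, "NightEntry"))          # True
```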

  2. Moving code - Sharing geoprocessing logic on the Web

    NASA Astrophysics Data System (ADS)

    Müller, Matthias; Bernard, Lars; Kadner, Daniel

    2013-09-01

    Efficient data processing is a long-standing challenge in remote sensing. Effective and efficient algorithms are required for product generation in ground processing systems, event-based or on-demand analysis, environmental monitoring, and data mining. Furthermore, the increasing number of survey missions and the exponentially growing data volume in recent years have created demand for better software reuse as well as an efficient use of scalable processing infrastructures. Solutions that address both demands simultaneously have begun to slowly appear, but they seldom consider the possibility to coordinate development and maintenance efforts across different institutions, community projects, and software vendors. This paper presents a new approach to share, reuse, and possibly standardise geoprocessing logic in the field of remote sensing. Drawing from the principles of service-oriented design and distributed processing, this paper introduces moving-code packages as self-describing software components that contain algorithmic code and machine-readable descriptions of the provided functionality, platform, and infrastructure, as well as basic information about exploitation rights. Furthermore, the paper presents a lean publishing mechanism by which to distribute these packages on the Web and to integrate them in different processing environments ranging from monolithic workstations to elastic computational environments or "clouds". The paper concludes with an outlook toward community repositories for reusable geoprocessing logic and their possible impact on data-driven science in general.
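    A moving-code package's self-description can be pictured as a small descriptor carried with the code; the field names below are invented for illustration and are not the paper's actual schema:

```python
# Sketch of a self-describing "moving-code package": the descriptor
# carries the functional contract, platform requirements, and basic
# exploitation rights next to the code itself, so a processing
# environment can decide whether it may host the package.

package = {
    "id": "ndvi-calculator-0.1",                      # hypothetical
    "functionality": {"input": "raster", "output": "raster",
                      "operation": "ndvi"},
    "platform": {"runtime": "python", "min_version": (3, 9)},
    "rights": {"license": "Apache-2.0", "redistribution": True},
    "entrypoint": "run.py",
}

def deployable(pkg, env):
    """Can this environment host the package?"""
    p = pkg["platform"]
    return (env["runtime"] == p["runtime"]
            and env["version"] >= p["min_version"]
            and pkg["rights"]["redistribution"])

cloud = {"runtime": "python", "version": (3, 11)}
print(deployable(package, cloud))  # True
```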

  3. A Fuzzy Description Logic with Automatic Object Membership Measurement

    NASA Astrophysics Data System (ADS)

    Cai, Yi; Leung, Ho-Fung

    In this paper, we propose a fuzzy description logic named f_om-DL by combining the classical view in cognitive psychology with fuzzy set theory. A formal mechanism to determine object memberships in concepts automatically is also proposed, which is lacking in previous fuzzy description logics. In this mechanism, object membership is based on the defining properties of the concept definition and the properties in the object description. Moreover, while previous works cannot express qualitative measurements of an object possessing a property, we introduce two kinds of properties, named N-properties and L-properties, which are quantitative and qualitative measurements, respectively, of an object possessing a property. The subsumption and implication of concepts and properties are also explored in our work. We believe this is useful to the Semantic Web community for reasoning about the fuzzy membership of objects in concepts in fuzzy ontologies.
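    The automatic membership measurement can be caricatured numerically: a concept's defining properties carry weights, an object carries degrees, and membership aggregates the matches. The weighted mean below is a simplification of the mechanism, and all names and numbers are invented:

```python
# Numeric sketch of automatic object-membership measurement: a concept
# is defined by weighted properties, an object carries degrees for its
# properties, and membership is the weighted mean of matched degrees.
# The aggregation is a simplification, not the paper's definition.

concept_bird = {"has_feathers": 0.6, "can_fly": 0.4}  # property weights

def membership(obj, concept):
    num = sum(w * obj.get(p, 0.0) for p, w in concept.items())
    den = sum(concept.values())
    return num / den

penguin = {"has_feathers": 1.0, "can_fly": 0.1}
print(round(membership(penguin, concept_bird), 2))  # 0.64
```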

  4. Disgust and biological descriptions bias logical reasoning during legal decision-making.

    PubMed

    Capestany, Beatrice H; Harris, Lasana T

    2014-01-01

    Legal decisions often require logical reasoning about the mental states of people who perform gruesome behaviors. We use functional magnetic resonance imaging (fMRI) to examine how brain regions implicated in logical reasoning are modulated by emotion and social cognition during legal decision-making. Participants read vignettes describing crimes eliciting strong or weak disgust, matched on punishment severity using the US Federal Sentencing Guidelines. An extraneous sentence at the end of each vignette described the perpetrator's personality using trait or biological language, mimicking the increased use of scientific evidence presented in courts. Behavioral results indicate that crimes weak in disgust receive significantly less punishment than the guidelines recommend. Neuroimaging results indicate that brain regions active during logical reasoning respond less to crimes weak in disgust and to biological descriptions of personality, demonstrating the impact of emotion and social cognition on the logical reasoning mechanisms necessary for legal decision-making.

  5. Description of the control system design for the SSF PMAD DC testbed

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Kimnach, Greg L.

    1991-01-01

    The Power Management and Distribution (PMAD) DC Testbed Control System for Space Station Freedom was developed using a top down approach based on classical control system and conventional terrestrial power utilities design techniques. The design methodology includes the development of a testbed operating concept. This operating concept describes the operation of the testbed under all possible scenarios. A unique set of operating states was identified and a description of each state, along with state transitions, was generated. Each state is represented by a unique set of attributes and constraints, and its description reflects the degree of system security within which the power system is operating. Using the testbed operating states description, a functional design for the control system was developed. This functional design consists of a functional outline, a text description, and a logical flowchart for all the major control system functions. Described here are the control system design techniques, various control system functions, and the status of the design and implementation.
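    The operating-state concept maps naturally onto a small state machine in which each state constrains its legal successors; the state names below are invented placeholders, not the testbed's actual states:

```python
# Toy operating-state machine: each state has a set of legal successor
# states, and a proposed sequence of state transitions is checked step
# by step.  State names are invented placeholders.

transitions = {
    "startup":  {"nominal"},
    "nominal":  {"degraded", "shutdown"},
    "degraded": {"nominal", "shutdown"},
    "shutdown": set(),
}

def legal(path):
    """Is a sequence of operating states reachable step by step?"""
    return all(b in transitions[a] for a, b in zip(path, path[1:]))

print(legal(["startup", "nominal", "degraded", "nominal"]))  # True
print(legal(["startup", "shutdown"]))                        # False
```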

  6. The CSM testbed matrix processors internal logic and dataflow descriptions

    NASA Technical Reports Server (NTRS)

    Regelbrugge, Marc E.; Wright, Mary A.

    1988-01-01

    This report constitutes the final report for subtask 1 of Task 5 of NASA Contract NAS1-18444, Computational Structural Mechanics (CSM) Research. This report contains a detailed description of the coded workings of selected CSM Testbed matrix processors (i.e., TOPO, K, INV, SSOL) and of the arithmetic utility processor AUS. These processors and the current sparse matrix data structures are studied and documented. Items examined include: details of the data structures, interdependence of data structures, data-blocking logic in the data structures, processor data flow and architecture, and processor algorithmic logic flow.

  7. Applied Digital Logic Exercises Using FPGAs

    NASA Astrophysics Data System (ADS)

    Wick, Kurt

    2017-09-01

    Applied Digital Logic Exercises Using FPGAs is appropriate for anyone interested in digital logic who needs to learn how to implement it through detailed exercises with state-of-the-art digital design tools and components. The book exposes readers to combinational and sequential digital logic concepts and implements them with hands-on exercises using the Verilog Hardware Description Language (HDL) and a Field Programmable Gate Array (FPGA) teaching board.
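    The book's exercises are written in Verilog; the combinational/sequential distinction it teaches can nevertheless be previewed in plain Python, where a half adder is a pure function of its inputs while a D flip-flop carries state across clock edges:

```python
# Combinational vs. sequential logic, sketched in Python rather than
# the book's Verilog.

def half_adder(a, b):
    """Combinational: outputs depend only on the current inputs."""
    return a ^ b, a & b          # (sum, carry)

class DFlipFlop:
    """Sequential: output q takes the value of input d on a clock edge."""
    def __init__(self):
        self.q = 0
    def clock(self, d):
        self.q = d
        return self.q

print(half_adder(1, 1))   # (0, 1)
ff = DFlipFlop()
ff.clock(1)
print(ff.q)               # 1
```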

  8. A virtual data language and system for scientific workflow management in data grid environments

    NASA Astrophysics Data System (ADS)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.
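    The separation the virtual data language enforces, logical typing distinct from physical representation, can be sketched as follows; all names and the type-checking rule are invented illustrations:

```python
# Sketch of the logical/physical separation in a typed workflow
# notation: datasets have logical types, physical details (format,
# location) are described separately, and workflow steps are checked
# against the logical layer only.  Everything here is invented.

logical = {"image_set": {"element": "Image"}}                 # logical types
physical = {"image_set": {"format": "FITS", "path": "/data/run7/"}}

def step(name, in_type, out_type):
    return {"name": name, "in": in_type, "out": out_type}

workflow = [step("align", "image_set", "image_set"),
            step("stack", "image_set", "Image")]

def well_typed(wf):
    """Adjacent steps must agree on logical types, never on formats."""
    return all(a["out"] == b["in"] for a, b in zip(wf, wf[1:]))

print(well_typed(workflow))  # True
```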

  9. Documentation for the machine-readable version of a supplement to the Bright Star catalogue (Hoffleit, Saladyga and Wlasuk 1983)

    NASA Technical Reports Server (NTRS)

    Warren, W. H., Jr.

    1984-01-01

    Detailed descriptions of the three files of the machine-readable catalog are given. The files of the original tape have been restructured and the data records reformatted to produce a uniform data file having a single logical record per star and homogeneous data fields. The characteristics of the tape version as it is presently being distributed from the Astronomical Data Center are given and the changes to the original tape supplied are described.

  10. CADAT network translator

    NASA Technical Reports Server (NTRS)

    Pitts, E. R.

    1981-01-01

    The program converts cell-net data into logic-gate models for use in test and simulation programs. Input consists of either a Place, Route, and Fold (PRF) or a Place-and-Route-in-Two-Dimensions (PR2D) layout data deck. Output consists of either a Test Pattern Generator (TPG) or a Logic-Simulation (LOGSIM) logic circuitry data deck. The designer needs to build only the logic-gate-model circuit description, since the program acts as a translator. The language is FORTRAN IV.

  11. FUZZY LOGIC CONTROL OF ELECTRIC MOTORS AND MOTOR DRIVES: FEASIBILITY STUDY

    EPA Science Inventory

    The report gives results of a study (part 1) of fuzzy logic motor control (FLMC). The study included: 1) reviews of existing applications of fuzzy logic, of motor operation, and of motor control; 2) a description of motor control schemes that can utilize FLMC; 3) selection of a m...

  12. Tactical Planning Workstation Software Description

    DTIC Science & Technology

    1990-09-01

    menus an application wishes to use. It is only concerned with the location of the physically displayed objects within a form. The valid form fields ...

    No.  Field        Type       Width  Indexed
    ...  WAR COLLGE   Numeric    4      N
    16   ASG_FULDA    Logical    1      N
    17   XR_FULDA     Logical    1      N
    18   CMP COURSE   Logical    1      N
    19   MINIFREQ     Character  1      N
    20   WORKFREQ     Character  1      N

  13. Using the Abstraction Network in Complement to Description Logics for Quality Assurance in Biomedical Terminologies - A Case Study in SNOMED CT

    PubMed Central

    Wei, Duo; Bodenreider, Olivier

    2015-01-01

    Objectives: To investigate errors identified in SNOMED CT by human reviewers with help from the Abstraction Network methodology and examine why they had escaped detection by the Description Logic (DL) classifier. Methods: Case study. Two examples of errors are presented in detail (one missing IS-A relation and one duplicate concept). After correction, SNOMED CT is reclassified to ensure that no new inconsistency was introduced. Conclusions: DL-based auditing techniques built into terminology development environments ensure the logical consistency of the terminology. However, complementary approaches are needed for identifying and addressing other types of errors. PMID:20841848

  14. Using the abstraction network in complement to description logics for quality assurance in biomedical terminologies - a case study in SNOMED CT.

    PubMed

    Wei, Duo; Bodenreider, Olivier

    2010-01-01

    To investigate errors identified in SNOMED CT by human reviewers with help from the Abstraction Network methodology and examine why they had escaped detection by the Description Logic (DL) classifier. Case study: Two examples of errors are presented in detail (one missing IS-A relation and one duplicate concept). After correction, SNOMED CT is reclassified to ensure that no new inconsistency was introduced. DL-based auditing techniques built into terminology development environments ensure the logical consistency of the terminology. However, complementary approaches are needed for identifying and addressing other types of errors.
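    The classifier's blind spot can be illustrated with a toy structural classifier (illustrative Python only; the concept names and the simplified EL-style definitions are hypothetical, not SNOMED CT content): fully defined concepts with identical definitions are inferred equivalent, but primitive (partially defined) concepts are not, which is one way a duplicate concept escapes a DL classifier.

```python
def equivalent(ontology, a, b):
    """Two *fully defined* concepts are equivalent iff their definitions match.

    Primitive concepts carry only necessary conditions, so no equivalence
    is ever inferred for them -- duplicates among primitives go undetected.
    """
    da, db = ontology[a], ontology[b]
    return (da["defined"] and db["defined"]
            and da["parents"] == db["parents"]
            and da["roles"] == db["roles"])

ontology = {
    # fully defined: necessary-and-sufficient conditions
    "HandBurn1": {"defined": True,  "parents": {"Burn"}, "roles": {("site", "Hand")}},
    "HandBurn2": {"defined": True,  "parents": {"Burn"}, "roles": {("site", "Hand")}},
    # primitive: only necessary conditions
    "FootBurn1": {"defined": False, "parents": {"Burn"}, "roles": {("site", "Foot")}},
    "FootBurn2": {"defined": False, "parents": {"Burn"}, "roles": {("site", "Foot")}},
}

print(equivalent(ontology, "HandBurn1", "HandBurn2"))  # True: flagged as duplicates
print(equivalent(ontology, "FootBurn1", "FootBurn2"))  # False: duplicate escapes detection
```

This is why complementary, non-logical auditing methods such as the Abstraction Network remain useful.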

  15. Documentation for the machine-readable version of the Stellar Spectrophotometric Atlas, 3130 Å < λ < 10800 Å, of Gunn and Stryker (1983)

    NASA Technical Reports Server (NTRS)

    Warren, W. H., Jr.

    1984-01-01

    The machine-readable version of the Atlas as it is currently being distributed from the Astronomical Data Center is described. The data were obtained with the Oke multichannel scanner on the 5-meter Hale reflector for purposes of synthesizing galaxy spectra, and the digitized Atlas contains normalized spectral energy distributions, computed colors, scan line and continuum indices for 175 selected stars covering the complete ranges of spectral type and luminosity class. The documentation includes a byte-by-byte format description, a table of the indigenous characteristics of the magnetic tape file, and a sample listing of logical records exactly as they are recorded on the tape.

  16. The evolutionary diversity of insect retinal mosaics: Common design principles and emerging molecular logic

    PubMed Central

    Wernet, Mathias F.; Perry, Michael W.; Desplan, Claude

    2015-01-01

    Independent evolution has resulted in a vast diversity of eyes. Despite the lack of a common Bauplan or ancestral structure, similar developmental strategies are used. For instance, different classes of photoreceptor cells (PRs) are distributed stochastically and/or localized in different regions of the retina. Here we focus on recent progress made towards understanding the molecular principles behind patterning retinal mosaics of insects, one of the most diverse groups of animals adapted to life on land, in the air, under water, or on the water surface. Morphological, physiological, and behavioral studies from many species provide detailed descriptions of the vast variation in retinal design and function. By integrating this knowledge with recent progress in the characterization of insect Rhodopsins as well as insight from the model organism Drosophila melanogaster, we seek to identify the molecular logic behind the adaptation of retinal mosaics to an animal’s habitat and way of life. PMID:26025917

  17. Formal logic rewrite system bachelor in teaching mathematical informatics

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). Bachelor enables a constructivist approach to teaching and may therefore enhance the learning process in essential disciplines of theoretical informatics. It not only provides a detailed description of the formal rewrite process but can also demonstrate algorithmic principles for the manipulation of logic formulae.

  18. LACIE performance predictor final operational capability program description, volume 3

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The requirements and processing logic for the LACIE Error Model program (LEM) are described. This program is an integral part of the Large Area Crop Inventory Experiment (LACIE) system. LEM is that portion of the LPP (LACIE Performance Predictor) which simulates the sample segment classification, strata yield estimation, and production aggregation. LEM controls repetitive Monte Carlo trials based on input error distributions to obtain statistical estimates of the wheat area, yield, and production at different levels of aggregation. LEM interfaces with the rest of the LPP through a set of data files.
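    The Monte Carlo scheme described above can be sketched as follows (a minimal illustration in Python; the segment parameters, error distributions, and units are hypothetical, not LACIE data or the actual LEM code): each trial draws segment-level area and yield from input error distributions, aggregates production, and statistics are estimated from the trial ensemble.

```python
import random
import statistics

random.seed(42)

def one_trial(segments):
    # each segment: (mean_area, sd_area, mean_yield, sd_yield), hypothetical units
    production = 0.0
    for ma, sa, my, sy in segments:
        area = random.gauss(ma, sa)   # simulated classification error
        yld = random.gauss(my, sy)    # simulated yield-estimation error
        production += area * yld
    return production

segments = [(100.0, 5.0, 2.0, 0.1), (80.0, 4.0, 2.2, 0.15)]
trials = [one_trial(segments) for _ in range(10_000)]
print(round(statistics.mean(trials), 1))   # close to 100*2.0 + 80*2.2 = 376.0
print(round(statistics.stdev(trials), 1))  # spread induced by the input errors
```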

  19. Community Digital Library Requirements for the Southern California Earthquake Center Community Modeling Environment (SCEC/CME)

    NASA Astrophysics Data System (ADS)

    Moore, R.; Faerman, M.; Minster, J.; Day, S. M.; Ely, G.

    2003-12-01

    A community digital library provides support for the ingestion, organization, description, preservation, and access of digital entities. The technologies that traditionally provide these capabilities are digital libraries (ingestion, organization, description), persistent archives (preservation), and data grids (access). We present a design for the SCEC community digital library that incorporates aspects of all three systems. Multiple groups have created integrated environments that sustain large-scale scientific data collections. By examining these projects, the following stages of implementation can be identified: definition of semantic terms to associate with relevant information, including uniform content descriptors that describe the physical quantities relevant to the scientific discipline and concept spaces that define how the uniform content descriptors are logically related; organization of digital entities into logical collections that make it simple to browse and manage related material; definition of services used to access and manipulate material in the collection; and creation of a preservation environment for the long-term management of the collection. Each community is faced with heterogeneity that is introduced when data is distributed across multiple sites, when multiple sets of collection semantics are used, or when multiple scientific sub-disciplines are federated. We will present the relevant standards that simplify the implementation of the SCEC community library, the resource requirements for different types of data sets that drive the implementation, and the digital library processes that the SCEC community library will support.
The SCEC community library can be viewed as the set of processing steps that are required to build the appropriate SCEC reference data sets (SCEC approved encoding format, SCEC approved descriptive metadata, SCEC approved collection organization, and SCEC managed storage location). Each digital entity that is ingested into the SCEC community library is processed and validated for conformance to SCEC standards. These steps generate provenance, descriptive, administrative, structural, and behavioral metadata. Using data grid technology, the descriptive metadata can be registered onto a logical name space that is controlled and managed by the SCEC digital library. A version of the SCEC community digital library is being implemented in the Storage Resource Broker. The SRB system provides almost all the features enumerated above. The peer-to-peer federation of metadata catalogs is planned for release in September, 2003. The SRB system is in production use in multiple projects, from high-energy physics, to astronomy, to earth systems science, to bio-informatics. The SCEC community library will be based on the definition of standard metadata attributes, the creation of logical collections within the SRB, the creation of access services, and the demonstration of a preservation environment. The use of the SRB for the SCEC digital library will sustain the expected collection size and collection capabilities.

  20. An adaptive maneuvering logic computer program for the simulation of one-to-one air-to-air combat. Volume 2: Program description

    NASA Technical Reports Server (NTRS)

    Burgin, G. H.; Owens, A. J.

    1975-01-01

    A detailed description of the computer programs is presented in order to provide an understanding of the mathematical and geometrical relationships as implemented in the programs. The individual subroutines and their underlying mathematical relationships are described, and the required input data and the output provided by the program are explained. The relationship of the adaptive maneuvering logic program with the program to drive the differential maneuvering simulator is discussed.

  1. Reasoning by Augmenting a Description Logic Reasoner (Phase 1)

    DTIC Science & Technology

    2006-04-28

    procedures for the guarded fragment, a fragment of FOL that includes many description logics [11]. The most widely known work in this area was by Hustadt ...transformed into a FO task that uses the theory. Unlike methods such as Hustadt and Schmidt’s functional translation, this does not result in a decision...Reasoning (KR 2000), pages 285–296, 2000. [26] U. Hustadt and R. A. Schmidt. MSPASS: Modal reasoning by translation and first-order resolution. In R

  2. Logic-Based Models for the Analysis of Cell Signaling Networks†

    PubMed Central

    2010-01-01

    Computational models are increasingly used to analyze the operation of complex biochemical networks, including those involved in cell signaling networks. Here we review recent advances in applying logic-based modeling to mammalian cell biology. Logic-based models represent biomolecular networks in a simple and intuitive manner without describing the detailed biochemistry of each interaction. A brief description of several logic-based modeling methods is followed by six case studies that demonstrate biological questions recently addressed using logic-based models and point to potential advances in model formalisms and training procedures that promise to enhance the utility of logic-based methods for studying the relationship between environmental inputs and phenotypic or signaling state outputs of complex signaling networks. PMID:20225868
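    The logic-based modeling style reviewed above can be illustrated with a minimal Boolean network (a hypothetical three-step cascade, not a model from the review): node states are Booleans updated synchronously by logic rules, mapping an environmental input to a signaling-state output without any detailed biochemistry.

```python
def step(s):
    """One synchronous update of a toy ligand -> receptor -> kinase cascade."""
    return {
        "ligand":      s["ligand"],                         # input, held fixed
        "receptor":    s["ligand"],                         # activated by ligand
        "kinase":      s["receptor"] and not s["phosphatase"],
        "phosphatase": s["phosphatase"],
        "output":      s["kinase"],                         # phenotypic readout
    }

state = {"ligand": True, "receptor": False, "kinase": False,
         "phosphatase": False, "output": False}
for _ in range(4):            # iterate to a fixed point
    state = step(state)
print(state["output"])        # True: the signal propagates when the phosphatase is off
```

Flipping the `phosphatase` input to `True` blocks the kinase and hence the output, the kind of input/output question such models are trained to answer.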

  3. A methodology to migrate the gene ontology to a description logic environment using DAML+OIL.

    PubMed

    Wroe, C J; Stevens, R; Goble, C A; Ashburner, M

    2003-01-01

    The Gene Ontology Next Generation Project (GONG) is developing a staged methodology to evolve the current representation of the Gene Ontology into DAML+OIL in order to take advantage of the richer formal expressiveness and the reasoning capabilities of the underlying description logic. Each stage provides a step level increase in formal explicit semantic content with a view to supporting validation, extension and multiple classification of the Gene Ontology. The paper introduces DAML+OIL and demonstrates the activity within each stage of the methodology and the functionality gained.

  4. Logic Dynamics for Deductive Inference -- Its Stability and Neural Basis

    NASA Astrophysics Data System (ADS)

    Tsuda, Ichiro

    2014-12-01

    We propose a dynamical model that represents a process of deductive inference. We discuss the stability of logic dynamics and a neural basis for the dynamics. We propose a new concept of descriptive stability, thereby enabling a structure of stable descriptions of mathematical models concerning dynamic phenomena to be clarified. The present theory is based on the wider and deeper thoughts of John S. Nicolis. In particular, it is based on our joint paper on the chaos theory of human short-term memories with a magic number of seven plus or minus two.

  5. Specifying structural constraints of architectural patterns in the ARCHERY language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, Alejandro; HASLab INESC TEC and Universidade do Minho, Campus de Gualtar, 4710-057 Braga; Barbosa, Luis S.

    ARCHERY is an architectural description language for modelling and reasoning about distributed, heterogeneous and dynamically reconfigurable systems in terms of architectural patterns. The language supports the specification of architectures and their reconfiguration. This paper introduces a language extension for precisely describing the structural design decisions that pattern instances must respect in their (re)configurations. The extension is a propositional modal logic with recursion and nominals referencing components, i.e., a hybrid µ-calculus. Its expressiveness allows specifying safety and liveness constraints, as well as paths and cycles over structures. Refinements of classic architectural patterns are specified.

  6. An Adaptive Fuzzy-Logic Traffic Control System in Conditions of Saturated Transport Stream

    PubMed Central

    Marakhimov, A. R.; Igamberdiev, H. Z.; Umarov, Sh. X.

    2016-01-01

    This paper considers the problem of building adaptive fuzzy-logic traffic control systems (AFLTCS) to deal with information fuzziness and uncertainty in case of heavy traffic streams. Methods of formal description of traffic control on the crossroads based on fuzzy sets and fuzzy logic are proposed. This paper also provides efficient algorithms for implementing AFLTCS and develops the appropriate simulation models to test the efficiency of suggested approach. PMID:27517081
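    The fuzzify-infer-defuzzify cycle underlying such controllers can be sketched as follows (a minimal illustration; the membership functions, queue variable, and rule outputs are hypothetical, not the paper's AFLTCS design):

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def green_extension(queue):
    # fuzzify a hypothetical 'queue length' input (vehicles)
    short = tri(queue, -10, 0, 10)
    medium = tri(queue, 5, 15, 25)
    long_ = tri(queue, 20, 30, 40)
    # rules: short -> 0 s, medium -> 10 s, long -> 20 s green-time extension
    rules = [(short, 0.0), (medium, 10.0), (long_, 20.0)]
    # defuzzify by the weighted average of rule outputs
    num = sum(w * v for w, v in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(green_extension(2))   # 0.0: queue is 'short', no extension
print(green_extension(30))  # 20.0: queue is fully 'long'
```

Intermediate queue lengths blend adjacent rules smoothly, which is what makes fuzzy control attractive under the uncertainty of saturated streams.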

  7. A Process Algebraic Approach to Software Architecture Design

    NASA Astrophysics Data System (ADS)

    Aldini, Alessandro; Bernardo, Marco; Corradini, Flavio

    Process algebra is a formal tool for the specification and the verification of concurrent and distributed systems. It supports compositional modeling through a set of operators able to express concepts like sequential composition, alternative composition, and parallel composition of action-based descriptions. It also supports mathematical reasoning via a two-level semantics, which formalizes the behavior of a description by means of an abstract machine obtained from the application of structural operational rules and then introduces behavioral equivalences able to relate descriptions that are syntactically different. In this chapter, we present the typical behavioral operators and operational semantic rules for a process calculus in which no notion of time, probability, or priority is associated with actions. Then, we discuss the three most studied approaches to the definition of behavioral equivalences - bisimulation, testing, and trace - and we illustrate their congruence properties, sound and complete axiomatizations, modal logic characterizations, and verification algorithms. Finally, we show how these behavioral equivalences and some of their variants are related to each other on the basis of their discriminating power.
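    The bisimulation equivalence mentioned above can be checked by partition refinement, sketched here on a toy labelled transition system (plain Python; the states and labels are hypothetical, not from the chapter): states are repeatedly split until any two states in the same block have matching moves into the same blocks.

```python
def bisimulation_classes(states, trans):
    """trans: set of (source, label, target). Returns the coarsest stable partition."""
    partition = [set(states)]          # start with one block of all states
    changed = True
    while changed:
        changed = False
        def block_of(s):
            return next(i for i, b in enumerate(partition) if s in b)
        def signature(s):
            # which blocks s can move into, per label
            return frozenset((l, block_of(t)) for (u, l, t) in trans if u == s)
        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                groups.setdefault(signature(s), set()).add(s)
            new_partition.extend(groups.values())
        if len(new_partition) != len(partition):
            changed = True
        partition = new_partition
    return partition

# p and q both offer 'a' then 'b'; r offers only 'a'
states = {"p", "q", "p1", "q1", "r", "dead"}
trans = {("p", "a", "p1"), ("p1", "b", "dead"),
         ("q", "a", "q1"), ("q1", "b", "dead"),
         ("r", "a", "dead")}
classes = bisimulation_classes(states, trans)
print(any({"p", "q"} <= b for b in classes))  # True: p and q are strongly bisimilar
```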

  8. Introducing Programmable Logic to Undergraduate Engineering Students in a Digital Electronics Course

    ERIC Educational Resources Information Center

    Todorovich, E.; Marone, J. A.; Vazquez, M.

    2012-01-01

    Due to significant technological advances and industry requirements, many universities have introduced programmable logic and hardware description languages into undergraduate engineering curricula. This has led to a number of logistical and didactical challenges, in particular for computer science students. In this paper, the integration of some…

  9. Programming Programmable Logic Controller. High-Technology Training Module.

    ERIC Educational Resources Information Center

    Lipsky, Kevin

    This training module on programming programmable logic controllers (PLC) is part of the memory structure and programming unit used in a packaging systems equipment control course. In the course, students assemble, install, maintain, and repair industrial machinery used in industry. The module contains description, objectives, content outline,…

  10. The evolutionary diversity of insect retinal mosaics: common design principles and emerging molecular logic.

    PubMed

    Wernet, Mathias F; Perry, Michael W; Desplan, Claude

    2015-06-01

    Independent evolution has resulted in a vast diversity of eyes. Despite the lack of a common Bauplan or ancestral structure, similar developmental strategies are used. For instance, different classes of photoreceptor cells (PRs) are distributed stochastically and/or localized in different regions of the retina. Here, we focus on recent progress made towards understanding the molecular principles behind patterning retinal mosaics of insects, one of the most diverse groups of animals adapted to life on land, in the air, under water, or on the water surface. Morphological, physiological, and behavioral studies from many species provide detailed descriptions of the vast variation in retinal design and function. By integrating this knowledge with recent progress in the characterization of insect Rhodopsins as well as insight from the model organism Drosophila melanogaster, we seek to identify the molecular logic behind the adaptation of retinal mosaics to the habitat and way of life of an animal. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. A behavioral-level HDL description of SFQ logic circuits for quantitative performance analysis of large-scale SFQ digital systems

    NASA Astrophysics Data System (ADS)

    Matsuzaki, F.; Yoshikawa, N.; Tanaka, M.; Fujimaki, A.; Takai, Y.

    2003-10-01

    Recently, many single flux quantum (SFQ) logic circuits containing several thousand Josephson junctions have been designed successfully using digital-domain simulation based on a hardware description language (HDL). In present HDL-based design of SFQ circuits, a structure-level HDL description has been used, in which circuits are built up from basic gate cells. However, in order to analyze large-scale SFQ digital systems, such as a microprocessor, a higher level of circuit abstraction is necessary to reduce the circuit simulation time. In this paper we investigate a way to describe the functionality of large-scale SFQ digital circuits with a behavior-level HDL description, in which the functionality and timing of a circuit block are defined directly by describing its behavior in the HDL. Using this method, we can dramatically reduce the simulation time of large-scale SFQ digital circuits.
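    The structural-versus-behavioral distinction can be sketched in ordinary Python (illustrative only; real designs use an HDL such as Verilog or VHDL, nothing here is SFQ-specific, and the delay figure is an arbitrary placeholder): the structural model composes gate primitives, while the behavioral model states the block's function and a lumped latency directly, which is much cheaper to simulate at system scale.

```python
GATE_DELAY = 5  # hypothetical per-gate delay, arbitrary time units

def half_adder_structural(a, b):
    # gate-level netlist: an XOR and an AND switching in parallel (depth 1)
    return a ^ b, a & b, GATE_DELAY

def half_adder_behavioral(a, b):
    # function and latency stated directly, with no internal netlist
    return (a + b) % 2, (a + b) // 2, GATE_DELAY

# the abstraction is sound only if both descriptions agree on every input
for a in (0, 1):
    for b in (0, 1):
        assert half_adder_structural(a, b) == half_adder_behavioral(a, b)
print("behavioral model matches structural model")
```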

  12. Peirce and the Art of Reasoning

    ERIC Educational Resources Information Center

    Anderson, Doug

    2005-01-01

    Drawing on Charles Peirce's descriptions of his correspondence course on the "Art of Reasoning," I argue that Peirce believed that the study of logic stands at the center of a liberal arts education. However, Peirce's notion of logic included much more than the traditional accounts of deduction and syllogistic reasoning. He believed that the art…

  13. Radically Democratic Learning in the Grounded In-Between

    ERIC Educational Resources Information Center

    Hart, Mechthild

    2010-01-01

    The author describes how the political struggles of immigrant domestic workers challenge the destructive logic of a global capitalist patriarchy. She uses her involvement in the political struggles of immigrant domestic workers as the foundation of both a critique of the destructive logic of global capitalist relations as well as a description of…

  14. Shuttle program. MCC Level C formulation requirements: Entry guidance and entry autopilot

    NASA Technical Reports Server (NTRS)

    Harpold, J. C.; Hill, O.

    1980-01-01

    A set of preliminary entry guidance and autopilot software formulations is presented for use in the Mission Control Center (MCC) entry processor. These software formulations meet all level B requirements. Revision 2 incorporates the modifications required to functionally simulate optimal TAEM targeting capability (OTT). Implementation of this logic in the MCC must be coordinated with flight software OTT implementation and MCC TAEM guidance OTT. The entry guidance logic is based on the Orbiter avionics entry guidance software. This MCC requirements document contains a definition of coordinate systems, a list of parameter definitions for the software formulations, a description of the entry guidance detailed formulation requirements, a description of the detailed autopilot formulation requirements, a description of the targeting routine, and a set of formulation flow charts.

  15. Regular paths in SparQL: querying the NCI Thesaurus.

    PubMed

    Detwiler, Landon T; Suciu, Dan; Brinkley, James F

    2008-11-06

    OWL, the Web Ontology Language, provides syntax and semantics for representing knowledge for the semantic web. Many of the constructs of OWL have a basis in the field of description logics. While the formal underpinnings of description logics have led to a highly computable language, it has come at a cognitive cost. OWL ontologies are often unintuitive to readers lacking a strong logic background. In this work we describe GLEEN, a regular path expression library, which extends the RDF query language SparQL to support complex path expressions over OWL and other RDF-based ontologies. We illustrate the utility of GLEEN by showing how it can be used in a query-based approach to defining simpler, more intuitive views of OWL ontologies. In particular we show how relatively simple GLEEN-enhanced SparQL queries can create views of the OWL version of the NCI Thesaurus that match the views generated by the web-based NCI browser.
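    The regular-path idea can be illustrated with a self-contained evaluator over RDF-style triples (plain Python; GLEEN's actual expression syntax is richer, and the terms below are hypothetical, not from the NCI Thesaurus): a `subClassOf+`-style transitive closure reduces to graph search over one property.

```python
def closure(triples, prop, start):
    """All nodes reachable from `start` via one or more `prop` edges."""
    adj = {}
    for s, p, o in triples:
        if p == prop:
            adj.setdefault(s, set()).add(o)
    seen, stack = set(), [start]
    while stack:
        for nxt in adj.get(stack.pop(), ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# toy subClassOf hierarchy (hypothetical terms)
triples = [("Melanoma", "subClassOf", "SkinNeoplasm"),
           ("SkinNeoplasm", "subClassOf", "Neoplasm"),
           ("Neoplasm", "subClassOf", "Disease")]
print(sorted(closure(triples, "subClassOf", "Melanoma")))
# ['Disease', 'Neoplasm', 'SkinNeoplasm']
```

SPARQL 1.1 later standardized similar property-path syntax (e.g. `rdfs:subClassOf+`) for exactly this kind of query.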

  16. SAT Encoding of Unification in EL

    NASA Astrophysics Data System (ADS)

    Baader, Franz; Morawska, Barbara

    Unification in Description Logics has been proposed as a novel inference service that can, for example, be used to detect redundancies in ontologies. In a recent paper, we have shown that unification in EL is NP-complete, and thus of a complexity that is considerably lower than in other Description Logics of comparably restricted expressive power. In this paper, we introduce a new NP-algorithm for solving unification problems in EL, which is based on a reduction to satisfiability in propositional logic (SAT). The advantage of this new algorithm is, on the one hand, that it allows us to employ highly optimized state-of-the-art SAT solvers when implementing an EL-unification algorithm. On the other hand, this reduction provides us with a proof of the fact that EL-unification is in NP that is much simpler than the one given in our previous paper on EL-unification.
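    The reduction-to-SAT strategy can be shown in miniature (a brute-force solver and a toy XOR encoding; the paper's actual EL-unification encoding is far more involved): the problem is translated into CNF clauses, and any satisfying assignment of the clauses decodes to a solution.

```python
from itertools import product

def sat(variables, clauses):
    """Brute-force SAT. A literal (v, neg) is satisfied when asg[v] != neg.
    Returns a satisfying assignment, or None if the CNF is unsatisfiable."""
    for bits in product([False, True], repeat=len(variables)):
        asg = dict(zip(variables, bits))
        if all(any(asg[v] != neg for v, neg in clause) for clause in clauses):
            return asg
    return None

# encode "x1 XOR x2" as CNF: (x1 or x2) and (not x1 or not x2)
variables = ["x1", "x2"]
clauses = [[("x1", False), ("x2", False)],   # x1 or x2
           [("x1", True), ("x2", True)]]     # not x1 or not x2
model = sat(variables, clauses)
print(model["x1"] != model["x2"])  # True: any model satisfies the XOR
```

The practical payoff claimed in the paper is that, once the reduction exists, highly optimized off-the-shelf SAT solvers do the search.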

  17. Challenges to understanding spatial patterns of disease: philosophical alternatives to logical positivism.

    PubMed

    Mayer, J D

    1992-08-01

    Most studies of disease distribution, in medical geography and other related disciplines, have been empirical in nature and rooted in the assumptions of logical positivism. However, some of the more newly articulated philosophies of the social sciences, and of social theory, have much to add in the understanding of the processes and mechanisms underlying disease distribution. This paper represents a plea for creative synthesis between logical positivism and realism or structuration, and uses specific examples to suggest how disease distribution, as a surface phenomenon, can be explained using deeper analysis.

  18. Don't Fear Optimality: Sampling for Probabilistic-Logic Sequence Models

    NASA Astrophysics Data System (ADS)

    Thon, Ingo

    One of the current challenges in artificial intelligence is modeling dynamic environments that change due to the actions or activities undertaken by people or agents. The task of inferring hidden states, e.g. the activities or intentions of people, based on observations is called filtering. Standard probabilistic models such as Dynamic Bayesian Networks are able to solve this task efficiently using approximative methods such as particle filters. However, these models do not support logical or relational representations. The key contribution of this paper is the upgrade of a particle filter algorithm for use with a probabilistic logical representation through the definition of a proposal distribution. The performance of the algorithm depends largely on how well this distribution fits the target distribution. We adopt the idea of logical compilation into Binary Decision Diagrams for sampling. This allows us to use the optimal proposal distribution which is normally prohibitively slow.
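    The particle-filter-with-proposal idea can be sketched on a toy one-dimensional state-estimation problem (plain Python with Gaussian noise assumptions; the paper's relational/logical setting is not modelled here): particles are propagated by a proposal distribution, reweighted by the observation likelihood, and resampled.

```python
import math
import random
import statistics

random.seed(0)

def particle_filter(observations, n=2000):
    particles = [random.gauss(0.0, 1.0) for _ in range(n)]
    for z in observations:
        # propagate through the (assumed) transition model, used as the proposal
        particles = [x + random.gauss(0.0, 0.5) for x in particles]
        # weight by the observation likelihood (Gaussian noise, sd = 1)
        weights = [math.exp(-0.5 * (z - x) ** 2) for x in particles]
        # resample in proportion to the weights
        particles = random.choices(particles, weights=weights, k=n)
    return statistics.mean(particles)

est = particle_filter([1.0, 1.2, 0.9, 1.1])
print(round(est, 1))  # roughly tracks the observed state near 1.0
```

The closer the proposal is to the true posterior, the more even the weights and the less the resampling degenerates, which is the motivation for the paper's compiled, near-optimal proposal.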

  19. An iLab for Teaching Advanced Logic Concepts with Hardware Descriptive Languages

    ERIC Educational Resources Information Center

    Ayodele, Kayode P.; Inyang, Isaac A.; Kehinde, Lawrence O.

    2015-01-01

    One of the more interesting approaches to teaching advanced logic concepts is the use of online laboratory frameworks to provide student access to remote field-programmable devices. There is as yet, however, no conclusive evidence of the effectiveness of such an approach. This paper presents the Advanced Digital Lab, a remote laboratory based on…

  20. R-189 (C-620) air compressor control logic software documentation. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, K.E.

    1995-06-08

    This document relates to the FFTF plant air compressors. Its purpose is to provide an updated Computer Software Description for the software to be used on the R-189 (C-620-C) air compressor programmable controllers. Logic software design changes were required to allow automatic starting of a compressor that had not previously been started.

  1. MSEA--The Department's Microcomputer. The Illinois Series on Educational Applications of Computer. Report No. 27.

    ERIC Educational Resources Information Center

    Muiznieks, Viktors

    This report provides a technical description and operating guidelines for the IMSAI 8080 microcomputer in the Department of Secondary Education at the University of Illinois. An overview of the microcomputer highlights the register array, address logic, arithmetic and logical unit, instruction register and control section, and the data bus buffer.…

  2. Text Cohesion and Comprehension: A Comparison of Prose Analysis Systems.

    ERIC Educational Resources Information Center

    Varnhagen, Connie K.; Goldman, Susan R.

    To test three specific hypotheses about recall as a function of four categories of logical relations, a study was done to determine whether logical relations systems of prose analysis can be used to predict recall. Two descriptive passages of naturally occurring expository prose were used. Each text was parsed into 45 statements, consisting of…

  3. Cooperative Learning Model toward a Reading Comprehensions on the Elementary School

    ERIC Educational Resources Information Center

    Murtono

    2015-01-01

    The purposes of this research are: (1) to describe the reading skills of students who learn with the CIRC, Jigsaw, and STAD learning models; (2) to determine the effectiveness of cooperative learning models on reading comprehension for students with high versus low language logic; and (3) to determine…

  4. PC-403: Pioneer Venus multiprobe spacecraft mission operational characteristics document, volume 1

    NASA Technical Reports Server (NTRS)

    Barker, F. C.

    1978-01-01

    The operational characteristics of the multiprobe system and its subsystems are described. A system-level description of the nominal phases, system interfaces, and the capabilities and limitations of system-level performance is presented. Bus spacecraft functional and operational descriptions at the subsystem and unit level are presented. The subtleties of nominal operation as well as detailed capabilities and limitations beyond nominal performance are discussed. A command and telemetry logic flow diagram for each subsystem is included. Each diagram identifies, in symbolic logic, all signal conditioning encountered along each command signal path into, and each telemetry signal path out of, the subsystem.

  5. The next generation of similarity measures that fully explore the semantics in biomedical ontologies.

    PubMed

    Couto, Francisco M; Pinto, H Sofia

    2013-10-01

    There is a prominent trend to augment and improve the formality of biomedical ontologies, shown for example by the current effort on adding description logic axioms such as disjointness. One of the key ontology applications that can take advantage of this effort is conceptual (functional) similarity measurement. The presence of description logic axioms in biomedical ontologies makes the current structural or extensional approaches weaker and further away from providing sound semantics-based similarity measures. Although beneficial in small ontologies, the exploration of description logic axioms by semantics-based similarity measures is computationally expensive. This limitation is critical for biomedical ontologies, which normally contain thousands of concepts. Thus, in the process of gaining their rightful place, biomedical functional similarity measures have to take the journey of finding how this rich and powerful knowledge can be fully explored while keeping computational costs feasible. This manuscript aims at promoting and guiding the development of compelling tools that deliver what the biomedical community will require in the near future: a next generation of biomedical similarity measures that efficiently and fully explore the semantics present in biomedical ontologies.
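    The kind of structural measure the authors argue is no longer sufficient can be sketched as a shared-ancestor (Jaccard) similarity over a toy ontology DAG (hypothetical terms, not a real biomedical ontology); such a measure sees only the subsumption graph and is blind to axioms like disjointness.

```python
def ancestors(parents, term):
    """The term itself plus everything reachable upward in the DAG."""
    out = {term}
    for p in parents.get(term, ()):
        out |= ancestors(parents, p)
    return out

def jaccard_sim(parents, a, b):
    """Shared-ancestor (Jaccard) similarity between two terms."""
    aa, ab = ancestors(parents, a), ancestors(parents, b)
    return len(aa & ab) / len(aa | ab)

# toy hierarchy (hypothetical terms)
parents = {"catalysis": ["function"], "kinase_activity": ["catalysis"],
           "phosphatase_activity": ["catalysis"], "binding": ["function"]}
print(jaccard_sim(parents, "kinase_activity", "phosphatase_activity"))  # 0.5
print(jaccard_sim(parents, "kinase_activity", "binding"))               # 0.25
```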

  6. Semiosis stems from logical incompatibility in organic nature: Why biophysics does not see meaning, while biosemiotics does.

    PubMed

    Kull, Kalevi

    2015-12-01

    We suggest here a model of the origin of the phenomenal world via the naturalization of logical conflict or incompatibility (which is broader than, but includes logical contradiction). Physics rules out the reality of meaning because of the method of formalization, which requires that logical conflicts cannot be part of the model. We argue that (a) meaning-making requires a logical conflict; (b) logical conflict assumes a phenomenal present; (c) phenomenological specious present occurs in living systems as widely as meaning-making; (d) it is possible to provide a physiological description of a system in which the phenomenal present appears and choices are made; (e) logical conflict, or incompatibility itself, is the mechanism of intentionality; (f) meaning-making is assured by scaffolding, which is a product of earlier choices, or decision-making, or interpretation. This model can be seen as a model of semiosis. It also allows putting physiology and phenomenology (or physics and semiotics) into a natural connection. Copyright © 2015. Published by Elsevier Ltd.

  7. Design and verification of distributed logic controllers with application of Petri nets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. It is then decomposed into separate sequential automata that work concurrently. Each of them is re-verified and, if the validation is successful, the system can finally be implemented.

  8. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. It is then decomposed into separate sequential automata that work concurrently. Each of them is re-verified and, if the validation is successful, the system can finally be implemented.
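    The Petri-net modelling and verification steps can be sketched in miniature (a hypothetical two-process mutual-exclusion net with an exhaustive reachability check; this is not the paper's controller, and its temporal-logic machinery is reduced here to a single safety property):

```python
def enabled(marking, transition):
    pre, _ = transition
    return all(marking.get(p, 0) >= 1 for p in pre)

def fire(marking, transition):
    pre, post = transition
    m = dict(marking)
    for p in pre:
        m[p] -= 1
    for p in post:
        m[p] = m.get(p, 0) + 1
    return m

# mutual exclusion: both processes need the shared 'lock' place
transitions = {
    "enter1": (("idle1", "lock"), ("crit1",)),
    "exit1":  (("crit1",), ("idle1", "lock")),
    "enter2": (("idle2", "lock"), ("crit2",)),
    "exit2":  (("crit2",), ("idle2", "lock")),
}
initial = {"idle1": 1, "idle2": 1, "lock": 1}

# exhaustive reachability: no marking may hold tokens in both crit1 and crit2
seen, frontier, safe = set(), [initial], True
while frontier:
    m = frontier.pop()
    key = tuple(sorted(m.items()))
    if key in seen:
        continue
    seen.add(key)
    if m.get("crit1", 0) and m.get("crit2", 0):
        safe = False
    for t in transitions.values():
        if enabled(m, t):
            frontier.append(fire(m, t))
print(safe)  # True: the lock place enforces mutual exclusion
```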

  9. A Formalized Design Process for Bacterial Consortia That Perform Logic Computing

    PubMed Central

    Sun, Rui; Xi, Jingyi; Wen, Dingqiao; Feng, Jingchen; Chen, Yiwei; Qin, Xiao; Ma, Yanrong; Luo, Wenhan; Deng, Linna; Lin, Hanchi; Yu, Ruofan; Ouyang, Qi

    2013-01-01

The concept of microbial consortia is of great attractiveness in synthetic biology. Despite all its benefits, however, problems remain for large-scale multicellular gene circuits, for example, how to reliably design and distribute the circuits in microbial consortia with a limited number of well-behaved genetic modules and wiring quorum-sensing molecules. To address this problem, we here propose a formalized design process: (i) determine the basic logic units (AND, OR and NOT gates) based on mathematical and biological considerations; (ii) establish rules to search for and distribute the simplest logic design; (iii) assemble the assigned basic logic units in each logic operating cell; and (iv) fine-tune the circuiting interface between logic operators. We analyzed in silico gene circuits with inputs ranging from two to four, comparing our method with pre-existing ones. Results showed that this formalized design process is more feasible in terms of the number of cells required. Furthermore, as a proof of principle, an Escherichia coli consortium that performs the XOR function, a typical complex computing operation, was designed. The construction and characterization of logic operators is independent of "wiring" and provides predictive information for fine-tuning. This formalized design process provides guidance for the design of microbial consortia that perform distributed biological computation. PMID:23468999
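Step (ii), distributing a function over cells that each host one basic gate, can be illustrated with the XOR example used as a proof of principle. The decomposition below is a standard one, and the assignment of gates to "cells" is hypothetical:

```python
# XOR built only from AND, OR, NOT gates, one gate per (hypothetical)
# operating cell, with intermediate values standing in for the
# quorum-sensing "wiring" molecules.
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def consortium_xor(a, b):
    s1 = NOT(b)               # cell 1
    s2 = NOT(a)               # cell 2
    t1 = AND(a, s1)           # cell 3
    t2 = AND(b, s2)           # cell 4
    return OR(t1, t2)         # cell 5 aggregates the outputs

truth_table = {(a, b): consortium_xor(a, b) for a in (0, 1) for b in (0, 1)}
```

The design question in the paper is precisely how to find such decompositions that minimize the number of cells and distinct signalling molecules required.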

  10. Business logic for geoprocessing of distributed geodata

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian

    2006-12-01

    This paper describes the development of a business-logic component for the geoprocessing of distributed geodata. The business logic acts as a mediator between the data and the user, therefore playing a central role in any spatial information system. The component is used in service-oriented architectures to foster the reuse of existing geodata inventories. Based on a geoscientific case study of groundwater vulnerability assessment and mapping, the demands for such architectures are identified with special regard to software engineering tasks. Methods are derived from the field of applied Geosciences (Hydrogeology), Geoinformatics, and Software Engineering. In addition to the development of a business logic component, a forthcoming Open Geospatial Consortium (OGC) specification is introduced: the OGC Web Processing Service (WPS) specification. A sample application is introduced to demonstrate the potential of WPS for future information systems. The sample application Geoservice Groundwater Vulnerability is described in detail to provide insight into the business logic component, and demonstrate how information can be generated out of distributed geodata. This has the potential to significantly accelerate the assessment and mapping of groundwater vulnerability. The presented concept is easily transferable to other geoscientific use cases dealing with distributed data inventories. Potential application fields include web-based geoinformation systems operating on distributed data (e.g. environmental planning systems, cadastral information systems, and others).

  11. Composition for Critical Thinking: A Course Description.

    ERIC Educational Resources Information Center

    Lazere, Donald

    Intended for college or secondary school teachers of courses beyond the basic level in freshman English and composition, this course description treats components of composition for critical thinking, including semantics, tone, logic, and argumentation, and their application to writing critical, argumentative, and research papers. The introduction…

  12. Building a Library for Microelectronics Verification with Topological Constraints

    DTIC Science & Technology

    2017-03-01

Fragmented abstract (extraction residue); the recoverable content concerns frequency distributions for the genus of logically equivalent circuit topologies of a 1-bit full adder cell (Fig. 1; Tables 1d, 1e, 3b, 5). Figure 1 shows that switching signal pairs produces logically equivalent topologies of the 1-bit full adder cell (2048 of them) with genus values g = 3, 4, 5, 6.

  13. A distributed control system for the lower-hybrid current drive system on the Tokamak de Varennes

    NASA Astrophysics Data System (ADS)

    Bagdoo, J.; Guay, J. M.; Chaudron, G.-A.; Decoste, R.; Demers, Y.; Hubbard, A.

    1990-08-01

An rf current drive system with an output power of 1 MW at 3.7 GHz is under development for the Tokamak de Varennes. The control system is based on an Ethernet local-area network of programmable logic controllers as the front end, personal computers as consoles, and CAMAC-based DSP processors. The DSP processors provide PID control of the phase and rf power of each klystron, and fast protection of the high-power rf hardware, all within a 40 μs loop. Slower control and protection, event sequencing, and the run-time database are provided by the programmable logic controllers, which communicate with the consoles via the LAN. The latter run commercial process-control console software. The LAN protocol respects the first four layers of the ISO/OSI 802.3 standard. Synchronization with the tokamak control system is provided by commercially available CAMAC timing modules which trigger shot-related events and reference waveform generators. A detailed description of each subsystem and a performance evaluation of the system are presented.
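A minimal sketch of the kind of discrete PID update the DSP processors perform on each 40 μs control tick; the gains, setpoint, and toy first-order plant below are invented for illustration, not taken from the system description:

```python
# Discrete PID controller closed over its own state; one call = one tick.
def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_err": 0.0}
    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

pid = make_pid(kp=0.8, ki=5.0, kd=0.0, dt=40e-6)   # hypothetical gains
power = 0.0                       # toy klystron output, arbitrary units
for _ in range(2000):             # 2000 ticks of the 40 microsecond loop
    drive = pid(setpoint=100.0, measured=power)
    power += 0.01 * drive         # crude first-order plant response
```

In the real system this loop runs alongside the fast protection logic, with the slower PLC layer handling sequencing and the run-time database.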

  14. Advanced Platform Systems Technology study. Volume 4: Technology advancement program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

An overview of the major technology definition tasks and subtasks, along with their interfaces and interrelationships, is presented. Although not specifically indicated in the diagram, iterations were required at many steps to finalize the results. The development of the integrated technology advancement plan was initiated using the results of the previous two tasks, i.e., the trade studies and the preliminary cost and schedule estimates for the selected technologies. Descriptions of the development of each viable technology advancement were drawn from the trade studies. Additionally, a logic flow diagram depicting the steps in developing each technology element was developed, along with descriptions of each of the major elements. Next, the major elements of the logic flow diagrams were time-phased, which allowed the definition of a technology development schedule consistent with the space station program schedule where possible. The schedules show the major milestones, including the tests required as described in the logic flow diagrams.

  15. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    PubMed

    Schulz, S; Romacker, M; Hahn, U

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics.

  16. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suited to the continental law system, is proposed to describe criminal cases. The model's logic adopts credibility-based logical reasoning and performs quantitative, evidence-based inference. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a diagram, and the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006

  17. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    PubMed Central

    Schulz, S.; Romacker, M.; Hahn, U.

    1998-01-01

The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics. PMID:9929335

  18. Documentation for the machine-readable version of A Library of Stellar Spectra (Jacoby, Hunter and Christian 1984)

    NASA Technical Reports Server (NTRS)

    Warren, W. H., Jr.

    1984-01-01

The machine-readable library as currently distributed from the Astronomical Data Center is described. The library contains digital spectra for 161 stars of spectral classes O through M and luminosity classes 1, 3 and 5 in the wavelength range 3510 A to 7427 A. The resolution is approximately 4.5 A, while the typical photometric uncertainty of each resolution element is approximately 1 percent and broadband variations are 3 percent. The documentation includes a format description, a table of the characteristics of the magnetic tape file, and a sample listing of logical records exactly as they are recorded on the tape.

  19. The Use of Computer Simulation Techniques in Educational Planning.

    ERIC Educational Resources Information Center

    Wilson, Charles Z.

    Computer simulations provide powerful models for establishing goals, guidelines, and constraints in educational planning. They are dynamic models that allow planners to examine logical descriptions of organizational behavior over time as well as permitting consideration of the large and complex systems required to provide realistic descriptions of…

  20. Blade loss transient dynamics analysis. Volume 3: User's manual for TETRA program

    NASA Technical Reports Server (NTRS)

    Black, G. R.; Gallardo, V. C.; Storace, A. S.; Sagendorph, F.

    1981-01-01

The user's manual for TETRA contains program logic, flow charts, error messages, input sheets, modeling instructions, option descriptions, input variable descriptions, and demonstration problems. The process of obtaining a NASTRAN 17.5-generated modal input file for TETRA is also described with a worked sample.

  1. Method for spatially distributing a population

    DOEpatents

    Bright, Edward A [Knoxville, TN; Bhaduri, Budhendra L [Knoxville, TN; Coleman, Phillip R [Knoxville, TN; Dobson, Jerome E [Lawrence, KS

    2007-07-24

A process for spatially distributing a population count within a geographically defined area can include the steps of logically correlating land usages apparent from a geographically defined area to geospatial features in the geographically defined area and allocating portions of the population count to regions of the geographically defined area having the land usages, according to the logical correlation. The process can also include weighting the logical correlation for determining the allocation of portions of the population count and storing the allocated portions within a searchable data store. The logically correlating step can include the step of logically correlating time-based land usages to geospatial features of the geographically defined area. The process can also include obtaining a population count for the geographically defined area, organizing the geographically defined area into a plurality of sectors, and verifying the allocated portions according to direct observation.
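The allocation step in the claim reduces to a proportional split of the census count over sectors according to land-use-derived weights. The sector names and weights below are invented for illustration:

```python
# Distribute a total population count across sectors in proportion to
# weights derived from the land-use / geospatial-feature correlation.
def allocate(total, weights):
    s = sum(weights.values())
    return {sector: total * w / s for sector, w in weights.items()}

# Hypothetical weighting: residential land carries most of the population.
weights = {"residential": 0.7, "commercial": 0.2, "parkland": 0.1}
parts = allocate(10_000, weights)
```

Time-based land usages (e.g. daytime versus night-time occupancy) would simply supply different weight sets for the same sectors.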

  2. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity

    PubMed Central

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-01-01

Recent experimental breakthroughs have finally allowed the implementation of in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose. Here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capability to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences from classical cooperativity (and anti-cooperativity). PMID:25976626
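The central point, that a noisy two-input gate can still separate its logic-0 and logic-1 output states for realistic input noise, can be probed numerically. The cooperative sigmoidal response and all parameters below are invented stand-ins, not the paper's Monod-Wyman-Changeux model:

```python
# Stochastic AND gate: draw inputs from Gaussians around their logic values,
# push them through a Hill-like cooperative response, and compare the mean
# output of the (1,1) input pair against the other three pairs.
import random

random.seed(0)

def response(x, y, n=4, k=0.5):
    v = max(x, 0.0) * max(y, 0.0)          # cooperative activation of x*y
    return v**n / (k**n + v**n)

def mean_output(a, b, sigma=0.15, trials=1000):
    outs = [response(random.gauss(a, sigma), random.gauss(b, sigma))
            for _ in range(trials)]
    return sum(outs) / trials

on = mean_output(1, 1)
off = max(mean_output(0, 0), mean_output(0, 1), mean_output(1, 0))
separable = on - off > 0.5                 # output states distinguishable
```

Widening `sigma` eventually collapses the gap, which is the noise-amplification concern the statistical mechanical treatment is designed to quantify.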

  3. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity.

    PubMed

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-05-15

Recent experimental breakthroughs have finally allowed the implementation of in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose. Here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capability to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences from classical cooperativity (and anti-cooperativity).

  4. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-05-01

Recent experimental breakthroughs have finally allowed the implementation of in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose. Here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capability to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences from classical cooperativity (and anti-cooperativity).

  5. Visual unit analysis: a descriptive approach to landscape assessment

    Treesearch

    R. J. Tetlow; S. R. J. Sheppard

    1979-01-01

    Analysis of the visible attributes of landscapes is an important component of the planning process. When landscapes are at regional scale, economical and effective methodologies are critical. The Visual Unit concept appears to offer a logical and useful framework for description and evaluation. The concept subdivides landscape into coherent, spatially-defined units....

  6. Solar heating and hot water system installed at Charlotte Memorial Hospital, Charlotte, North Carolina

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Detailed information regarding the design and installation of a heating and hot water system in a commercial application is given. This information includes descriptions of system and building, design philosophy, control logic operation modes, design and installation drawing and a brief description of problems encountered and their solutions.

  7. Integrated payload and mission planning, phase 3. Volume 2: Logic/Methodology for preliminary grouping of spacelab and mixed cargo payloads

    NASA Technical Reports Server (NTRS)

    Rodgers, T. E.; Johnson, J. F.

    1977-01-01

    The logic and methodology for a preliminary grouping of Spacelab and mixed-cargo payloads is proposed in a form that can be readily coded into a computer program by NASA. The logic developed for this preliminary cargo grouping analysis is summarized. Principal input data include the NASA Payload Model, payload descriptive data, Orbiter and Spacelab capabilities, and NASA guidelines and constraints. The first step in the process is a launch interval selection in which the time interval for payload grouping is identified. Logic flow steps are then taken to group payloads and define flight configurations based on criteria that includes dedication, volume, area, orbital parameters, pointing, g-level, mass, center of gravity, energy, power, and crew time.

  8. Logical inference approach to relativistic quantum mechanics: Derivation of the Klein–Gordon equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donker, H.C., E-mail: h.donker@science.ru.nl; Katsnelson, M.I.; De Raedt, H.

    2016-09-15

The logical inference approach to quantum theory, proposed earlier by De Raedt et al. (2014), is considered in a relativistic setting. It is shown that the Klein–Gordon equation for a massive, charged, and spinless particle derives from the combination of the requirements that the space–time data collected by probing the particle is obtained from the most robust experiment and that, on average, the classical relativistic equation of motion of a particle holds. Highlights: • Logical inference applied to relativistic, massive, charged, and spinless particle experiments leads to the Klein–Gordon equation. • The relativistic Hamilton–Jacobi equation is scrutinized by employing a field description for the four-velocity. • Logical inference allows analysis of experiments with uncertainty in detection events and experimental conditions.
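For reference, the equation being derived is the Klein–Gordon equation for a massive, charged, spinless particle; in the standard minimal-coupling form (not the paper's derivation itself), with electromagnetic four-potential $A_\mu$, charge $q$, and mass $m$:

```latex
\left[\left(i\hbar\,\partial_\mu - \frac{q}{c}A_\mu\right)
      \left(i\hbar\,\partial^\mu - \frac{q}{c}A^\mu\right)
      - m^2 c^2\right]\psi(x) = 0
```

Setting $A_\mu = 0$ recovers the free equation $(\hbar^2\,\partial_\mu\partial^\mu + m^2c^2)\,\psi = 0$.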

  9. F-OWL: An Inference Engine for Semantic Web

    NASA Technical Reports Server (NTRS)

    Zou, Youyong; Finin, Tim; Chen, Harry

    2004-01-01

Understanding and using the data and knowledge encoded in semantic web documents requires an inference engine. F-OWL is an inference engine for the semantic web language OWL, based on F-logic, an approach to defining frame-based systems in logic. F-OWL is implemented using XSB and Flora-2 and takes full advantage of their features. We describe how F-OWL computes ontology entailment and compare it with other description logic based approaches. We also describe TAGA, a trading agent environment that we have used as a test bed for F-OWL and to explore how multiagent systems can use semantic web concepts and technology.

  10. Empirical ethics, context-sensitivity, and contextualism.

    PubMed

    Musschenga, Albert W

    2005-10-01

    In medical ethics, business ethics, and some branches of political philosophy (multi-culturalism, issues of just allocation, and equitable distribution) the literature increasingly combines insights from ethics and the social sciences. Some authors in medical ethics even speak of a new phase in the history of ethics, hailing "empirical ethics" as a logical next step in the development of practical ethics after the turn to "applied ethics." The name empirical ethics is ill-chosen because of its associations with "descriptive ethics." Unlike descriptive ethics, however, empirical ethics aims to be both descriptive and normative. The first question on which I focus is what kind of empirical research is used by empirical ethics and for which purposes. I argue that the ultimate aim of all empirical ethics is to improve the context-sensitivity of ethics. The second question is whether empirical ethics is essentially connected with specific positions in meta-ethics. I show that in some kinds of meta-ethical theories, which I categorize as broad contextualist theories, there is an intrinsic need for connecting normative ethics with empirical social research. But context-sensitivity is a goal that can be aimed for from any meta-ethical position.

  11. On a concept of computer game implementation based on a temporal logic

    NASA Astrophysics Data System (ADS)

    Szymańska, Emilia; Adamek, Marek J.; Mulawka, Jan J.

    2017-08-01

Time is a concept that underlies all of contemporary civilization. It was therefore necessary to create mathematical tools that allow complex time dependencies to be described precisely. One such tool is temporal logic. Its definition, description, and characteristics are presented in this publication. The authors then discuss the usefulness of this tool in the context of creating storylines in computer games, such as those of the RPG genre.

  12. Design of automata theory of cubical complexes with applications to diagnosis and algorithmic description

    NASA Technical Reports Server (NTRS)

    Roth, J. P.

    1972-01-01

Methods for the development of logic design together with algorithms for failure testing, a method for the design of logic for ultra-large-scale integration, the extension of quantum calculus to describe the functional behavior of a mechanism component-by-component and to compute tests for failures in the mechanism using the diagnosis algorithm, and the development of an algorithm for the multi-output 2-level minimization problem are discussed.

  13. A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models

    DTIC Science & Technology

    2007-11-01

Extraction residue (an acronym glossary interleaved with abstract fragments); the recoverable content: DL (Description Logic), SOA (Service Oriented Architecture), SPARQL (Simple Protocol And RDF Query Language), SQL (Standard Query Language), SROM (Stability and… [truncated]). One fragment notes that a model can be made interoperable with another by providing a more expressive ontological structure for one of the models, e.g., semantic networks can be mapped to first-order logical… [truncated]. Another notes that Pellet is an open-source reasoner that works with OWL-DL; it accepts the SPARQL protocol and RDF query language (SPARQL) and provides a Java API.

  14. A Logical Framework for Distributed Data

    DTIC Science & Technology

    1990-11-01

Report documentation page residue; the recoverable fields: authors Paul Broome and Barbara Broome; performing organization U.S. Army Ballistic Research Laboratory.

  15. Enzymatic AND logic gates operated under conditions characteristic of biomedical applications.

    PubMed

    Melnikov, Dmitriy; Strack, Guinevere; Zhou, Jian; Windmiller, Joshua Ray; Halámek, Jan; Bocharova, Vera; Chuang, Min-Chieh; Santhosh, Padmanabhan; Privman, Vladimir; Wang, Joseph; Katz, Evgeny

    2010-09-23

    Experimental and theoretical analyses of the lactate dehydrogenase and glutathione reductase based enzymatic AND logic gates in which the enzymes and their substrates serve as logic inputs are performed. These two systems are examples of the novel, previously unexplored class of biochemical logic gates that illustrate potential biomedical applications of biochemical logic. They are characterized by input concentrations at logic 0 and 1 states corresponding to normal and pathophysiological conditions. Our analysis shows that the logic gates under investigation have similar noise characteristics. Both significantly amplify random noise present in inputs; however, we establish that for realistic widths of the input noise distributions, it is still possible to differentiate between the logic 0 and 1 states of the output. This indicates that reliable detection of pathophysiological conditions is indeed possible with such enzyme logic systems.

  16. Programmable Potentials: Approximate N-body potentials from coarse-level logic.

    PubMed

    Thakur, Gunjan S; Mohr, Ryan; Mezić, Igor

    2016-09-27

This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as a linear combination of the pairwise potentials, where the "coefficients" of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for the design of molecular processes by specifying properties of molecules that can carry them out.
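The construction, pairwise potentials gated by smoothed logic "coefficients", can be sketched as follows. The Lennard-Jones term, the gating rule, and all parameters are invented for illustration, not taken from the paper:

```python
# A pairwise potential whose contribution to the N-body sum is switched
# on or off by a smoothed logic function of the coarse-level state.
import math

def lj(r, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones term."""
    sr6 = (sigma / r) ** 6
    return 4 * eps * (sr6 * sr6 - sr6)

def smooth_step(x, x0, width=0.1):
    """Smoothed version of the logic condition 'x > x0' (a soft gate)."""
    return 1.0 / (1.0 + math.exp(-(x - x0) / width))

def gated_potential(r_ab, r_ac):
    # invented rule: the A-B attraction switches OFF once C approaches A,
    # i.e. a smoothed NOT gate on the condition "r_ac < 1.5"
    gate = smooth_step(r_ac, 1.5)      # ~0 when C is close, ~1 otherwise
    return gate * lj(r_ab)

far = gated_potential(1.12, r_ac=3.0)    # gate open: near the LJ minimum
near = gated_potential(1.12, r_ac=0.5)   # gate closed: interaction ~ off
```

The full N-body potential is then the sum of many such gated pairwise terms, each with its own logic function built from AND, OR, and NOT gates over coarse-level conditions.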

  17. Programmable Potentials: Approximate N-body potentials from coarse-level logic

    NASA Astrophysics Data System (ADS)

    Thakur, Gunjan S.; Mohr, Ryan; Mezić, Igor

    2016-09-01

This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as a linear combination of the pairwise potentials, where the “coefficients” of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for the design of molecular processes by specifying properties of molecules that can carry them out.

  18. Programmable Potentials: Approximate N-body potentials from coarse-level logic

    PubMed Central

    Thakur, Gunjan S.; Mohr, Ryan; Mezić, Igor

    2016-01-01

This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as a linear combination of the pairwise potentials, where the “coefficients” of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for the design of molecular processes by specifying properties of molecules that can carry them out. PMID:27671683

  19. Mathematical description and program documentation for CLASSY, an adaptive maximum likelihood clustering method

    NASA Technical Reports Server (NTRS)

    Lennington, R. K.; Rassbach, M. E.

    1979-01-01

    Discussed in this report is the clustering algorithm CLASSY, including detailed descriptions of its general structure and mathematical background and of the various major subroutines. The report provides a development of the logic and equations used with specific reference to program variables. Some comments on timing and proposed optimization techniques are included.

  20. Incremental checking of Master Data Management model based on contextual graphs

    NASA Astrophysics Data System (ADS)

    Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan

    2015-10-01

The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed within the scope of a Model Driven Engineering (MDE) approach, applied to the Master Data Management (MDM) field, where models are represented by XML Schemas. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise model errors and to optimise the model-checking process. The notion of validation contexts is therefore introduced, allowing the verification of data-model views. Description logics specify the constraints that the models have to check. An experiment with the approach is presented through an application developed in the ArgoUML IDE.

  1. The development of a simulation model of primary prevention strategies for coronary heart disease.

    PubMed

    Babad, Hannah; Sanderson, Colin; Naidoo, Bhash; White, Ian; Wang, Duolao

    2002-11-01

    This paper describes the present state of development of a discrete-event micro-simulation model for coronary heart disease prevention. The model is intended to support health policy makers in assessing the impacts on health care resources of different primary prevention strategies. For each person, a set of times to disease events, conditional on the individual's risk factor profile, is sampled from a set of probability distributions that are derived from a new analysis of the Framingham cohort study on coronary heart disease. Methods used to model changes in behavioural and physiological risk factors are discussed and a description of the simulation logic is given. The model incorporates POST (Patient Oriented Simulation Technique) simulation routines.
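The sampling step can be sketched as follows; the baseline hazard and relative risks below are hypothetical placeholders, not the Framingham-derived quantities the model actually uses, and a simple exponential time-to-event is assumed.

```python
import random

# Hypothetical sketch: times to disease events are drawn from exponential
# distributions whose rates depend on the individual's risk factor profile.
# All numbers here are invented for illustration.

BASELINE_RATE = 0.002            # events per person-year (illustrative)
RELATIVE_RISK = {"smoker": 2.0, "hypertension": 1.6, "high_cholesterol": 1.4}

def time_to_event(profile, rng=random):
    """Sample a time to first coronary event for one simulated person,
    conditional on which risk factors are present."""
    rate = BASELINE_RATE
    for factor, present in profile.items():
        if present:
            rate *= RELATIVE_RISK[factor]
    return rng.expovariate(rate)

random.seed(42)
person = {"smoker": True, "hypertension": False, "high_cholesterol": True}
print(f"years to event: {time_to_event(person):.1f}")
```

In the full micro-simulation the sampled times for each person are placed on a discrete-event queue and the earliest event fires first.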

  2. Experiments on neural network architectures for fuzzy logic

    NASA Technical Reports Server (NTRS)

    Keller, James M.

    1991-01-01

    The use of fuzzy logic to model and manage uncertainty in a rule-based system places high computational demands on an inference engine. In an earlier paper, the authors introduced a trainable neural network structure for fuzzy logic. These networks can learn and extrapolate complex relationships between possibility distributions for the antecedents and consequents in the rules. Here, the power of these networks is further explored. The insensitivity of the output to noisy input distributions (which are likely if the clauses are generated from real data) is demonstrated as well as the ability of the networks to internalize multiple conjunctive clause and disjunctive clause rules. Since different rules with the same variables can be encoded in a single network, this approach to fuzzy logic inference provides a natural mechanism for rule conflict resolution.

  3. Implementing a Microcontroller Watchdog with a Field-Programmable Gate Array (FPGA)

    NASA Technical Reports Server (NTRS)

    Straka, Bartholomew

    2013-01-01

    Reliability is crucial to safety. Redundancy of important system components greatly enhances reliability and hence safety. Field-Programmable Gate Arrays (FPGAs) are useful for monitoring systems and handling the logic necessary to keep them running with minimal interruption when individual components fail. A complete microcontroller watchdog with logic for failure handling can be implemented in a hardware description language (HDL). HDL-based designs are vendor-independent and can be used on many FPGAs with low overhead.

  4. Dynamic partial reconfiguration of logic controllers implemented in FPGAs

    NASA Astrophysics Data System (ADS)

    Bazydło, Grzegorz; Wiśniewski, Remigiusz

    2016-09-01

    Technological progress in recent years has produced digital circuits containing millions of logic gates with the capability for reprogramming and reconfiguration. On the one hand this provides unprecedented computational power, but on the other hand the modelled systems are becoming increasingly complex, hierarchical and concurrent. Abstract modelling supported by Computer Aided Design tools therefore becomes a very important task. Even higher consumption of the basic electronic components seems acceptable, because chip manufacturing costs tend to fall over time. The paper presents a modelling approach for logic controllers using the Unified Modelling Language (UML). Following a Model Driven Development approach, starting with a UML state machine model and passing through an intermediate Hierarchical Concurrent Finite State Machine model, a collection of Verilog files is created. The system description generated in a hardware description language can be synthesized and implemented in reconfigurable devices, such as FPGAs. Modular specification of the prototyped controller permits further dynamic partial reconfiguration of the system: the functionality of the already implemented controller is exchanged without stopping the FPGA device, meaning that a part of the logic controller (for example a single module) is replaced by another version (called a context) while the rest of the system keeps running. The method is illustrated by a practical example based on a Home Area Network system.

  5. 46 CFR 62.20-1 - Plans for approval.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... console, panel, and enclosure layouts. (3) Schematic or logic diagrams including functional relationships... features. (6) A description of built-in test features and diagnostics. (7) Design Verification and Periodic...

  6. 46 CFR 62.20-1 - Plans for approval.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... console, panel, and enclosure layouts. (3) Schematic or logic diagrams including functional relationships... features. (6) A description of built-in test features and diagnostics. (7) Design Verification and Periodic...

  7. 46 CFR 62.20-1 - Plans for approval.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... console, panel, and enclosure layouts. (3) Schematic or logic diagrams including functional relationships... features. (6) A description of built-in test features and diagnostics. (7) Design Verification and Periodic...

  8. 46 CFR 62.20-1 - Plans for approval.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... console, panel, and enclosure layouts. (3) Schematic or logic diagrams including functional relationships... features. (6) A description of built-in test features and diagnostics. (7) Design Verification and Periodic...

  9. 46 CFR 62.20-1 - Plans for approval.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... console, panel, and enclosure layouts. (3) Schematic or logic diagrams including functional relationships... programmable features. (6) A description of built-in test features and diagnostics. (7) Design Verification and...

  10. A computer program for the generation of logic networks from task chart data

    NASA Technical Reports Server (NTRS)

    Herbert, H. E.

    1980-01-01

    The Network Generation Program (NETGEN), which creates logic networks from task chart data is presented. NETGEN is written in CDC FORTRAN IV (Extended) and runs in a batch mode on the CDC 6000 and CYBER 170 series computers. Data is input via a two-card format and contains information regarding the specific tasks in a project. From this data, NETGEN constructs a logic network of related activities with each activity having unique predecessor and successor nodes, activity duration, descriptions, etc. NETGEN then prepares this data on two files that can be used in the Project Planning Analysis and Reporting System Batch Network Scheduling program and the EZPERT graphics program.

  11. A computer method of finding valuations forcing validity of LC formulae

    NASA Astrophysics Data System (ADS)

    Godlewski, Łukasz; Świetorzecka, Kordula; Mulawka, Jan

    2014-11-01

    The purpose of this paper is to present the computer implementation of a system known as LC temporal logic [1]. First, to become familiar with some theoretical issues, a short introduction to this logic is given. Algorithms allowing a deep analysis of the formulae of LC logic are considered. In particular, we discuss how to determine whether a formula is a tautology, a counter-tautology, or satisfiable. Next, we show how to find all valuations satisfying the formula. Finally, we consider finding the histories generated by the formula and transforming these histories into a state machine. Moreover, a description of the experiments that verify the implementation is briefly presented.
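The valuation search can be sketched for plain propositional formulae (LC's temporal operators are omitted here); the classification below is a toy enumeration over truth tables, not the paper's implementation.

```python
from itertools import product

def classify(formula, variables):
    """Classify a propositional formula and collect its satisfying valuations.
    `formula` is a function from a valuation dict to bool."""
    sat, unsat = [], []
    for values in product([False, True], repeat=len(variables)):
        v = dict(zip(variables, values))
        (sat if formula(v) else unsat).append(v)
    if not unsat:
        return "tautology", sat          # true under every valuation
    if not sat:
        return "counter-tautology", sat  # false under every valuation
    return "satisfiable", sat            # true under some valuation

# p -> (q -> p), written with material implication, is a classical tautology.
kind, models = classify(lambda v: (not v["p"]) or ((not v["q"]) or v["p"]),
                        ["p", "q"])
print(kind, len(models))   # tautology 4
```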

  12. Banning standard cell engineering notebook

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A family of standardized thick-oxide P-MOS building blocks (standard cells) is described. The information is presented in a form useful for system design, logic design, and the preparation of inputs to both sets of Design Automation programs for array design and analysis. A data sheet is provided for each cell, giving the cell name, the cell number, its logic symbol, Boolean equation, truth table, circuit schematic, circuit composite, input-output capacitances, and revision date. The circuit type file, also given for each cell, together with the logic drawing contained on the data sheet, provides all the information required to prepare input data files for the Design Automation Systems. A detailed description of the electrical design procedure is included.

  13. Groundwater Circulating Well Assessment and Guidance

    DTIC Science & Technology

    1998-04-03

    3.1 Decision Tree and Process Description ... two GCW systems placed close enough to affect each other significantly (Herding et al., 1994). This type of well spacing may be required to ... 3.1 Decision Tree and Process Description: The process for screening the GCW technology is a logical sequence of steps during which site-specific

  14. 15 CFR 971.203 - Commercial recovery plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Applications Contents... requirements for resource assessment and logical mining unit (§ 971.501); (6) A description of the methods and...

  15. 15 CFR 971.203 - Commercial recovery plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS Applications Contents... requirements for resource assessment and logical mining unit (§ 971.501); (6) A description of the methods and...

  16. Development of a program logic model and evaluation plan for a participatory ergonomics intervention in construction.

    PubMed

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2014-03-01

    Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. © 2013 Wiley Periodicals, Inc.

  17. Development of a Program Logic Model and Evaluation Plan for a Participatory Ergonomics Intervention in Construction

    PubMed Central

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2013-01-01

    Background Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. Methods In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. Results The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Conclusions Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. PMID:24006097

  18. Mamdani Fuzzy System for Indoor Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.

    2011-06-01

    Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the employment of non-analytical methods of computing such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, a Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with a discussion of the conventional controller, followed by a detailed description of the fuzzy logic controller.
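A minimal Mamdani-style sketch, assuming invented membership functions and rules (not those of the cited controller): one input (obstacle distance), one output (speed), min-inference over two rules, and centroid defuzzification of the aggregated output.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Input fuzzy sets over distance [0, 2] m; output sets over speed [0, 1] m/s.
# All parameters are illustrative.
near = lambda d: tri(d, -0.5, 0.0, 1.0)
far  = lambda d: tri(d, 0.5, 2.0, 3.5)
slow = lambda v: tri(v, -0.4, 0.0, 0.6)
fast = lambda v: tri(v, 0.4, 1.0, 1.6)

def mamdani_speed(distance, steps=100):
    """Rules: IF near THEN slow; IF far THEN fast. The rule strengths clip the
    output sets; the centroid of their union gives the crisp speed command."""
    w_slow, w_fast = near(distance), far(distance)
    num = den = 0.0
    for i in range(steps + 1):
        v = i / steps                       # sample the speed universe [0, 1]
        mu = max(min(w_slow, slow(v)), min(w_fast, fast(v)))
        num += v * mu
        den += mu
    return num / den if den else 0.0

print(mamdani_speed(0.2))   # close obstacle -> low speed
print(mamdani_speed(1.8))   # clear path    -> high speed
```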

  19. Stinger Post Hybrid Simulation: Design Description and Users’ Manual

    DTIC Science & Technology

    1983-04-01

    function of three variables. Table 4, MVFG Modes: Mode, Function Number, Argument Number, Trunking Station Address, Output Hole Argument ... between JA and J7. This permits the signal to control a normally-open, single-throw miniature relay. It closes the remote operate line when a logic ... logic functions are also performed on the card. A normally-open, single-throw miniature relay (U5) and an AND gate (1/4 of U1) are used to perform the

  20. Adaption of a corrector module to the IMP dynamics program

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The corrector module of the RAEIOS program and the IMP dynamics computer program were combined to achieve a data-fitting capability with the more general spacecraft dynamics models of the IMP program. The IMP dynamics program presents models of spacecraft dynamics for satellites with long, flexible booms. The properties of the corrector are discussed and a description is presented of the performance criteria and search logic for parameter estimation. A description is also given of the modifications made to add the corrector to the IMP program. This includes subroutine descriptions, common definitions, definition of input, and a description of output.

  1. Mathematical modelling of Bit-Level Architecture using Reciprocal Quantum Logic

    NASA Astrophysics Data System (ADS)

    Narendran, S.; Selvakumar, J.

    2018-04-01

    High-performance computing is in high demand for both speed and energy efficiency. Reciprocal Quantum Logic (RQL) is a technology that offers high speed and zero static power dissipation. RQL uses an AC power supply as input rather than a DC input, and has three sets of basic gates. Series of reciprocal transmission lines are placed between the gates to avoid loss of power and to achieve high speed. An analytical model of a Bit-Level Architecture is developed using RQL. A major drawback of Reciprocal Quantum Logic is area: to achieve a proper power supply, splitters are needed, and these occupy a large area. Distributed arithmetic performs vector-vector multiplication in which one vector is constant and the other is a signed variable; each word is treated as a binary number, and the words are rearranged and mixed to form a distributed system. Distributed arithmetic is widely used in convolution and in high-performance computational devices.
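The distributed-arithmetic idea mentioned at the end can be sketched in software: a constant-coefficient dot product computed bit-serially from a precomputed lookup table instead of with multipliers, with the two's-complement sign bit weighted negatively. Coefficients and word length below are illustrative.

```python
COEFFS = [3, -1, 4, 2]           # constant vector A (illustrative)
BITS = 8                         # two's-complement word length of the inputs

# Precompute LUT[b] = sum of the coefficients selected by bit pattern b.
LUT = [sum(c for k, c in enumerate(COEFFS) if (b >> k) & 1)
       for b in range(1 << len(COEFFS))]

def da_dot(xs):
    """Bit-serial dot product A . xs: at each bit position j, look up the sum
    of coefficients whose input has bit j set, then accumulate shifted by 2^j.
    The MSB of two's complement contributes with negative weight."""
    acc = 0
    for j in range(BITS):
        pattern = 0
        for k, x in enumerate(xs):
            bit = (x & (1 << BITS) - 1) >> j & 1   # j-th bit of x's two's complement
            pattern |= bit << k
        weight = -(1 << j) if j == BITS - 1 else (1 << j)
        acc += weight * LUT[pattern]
    return acc

xs = [5, -3, 2, -7]
print(da_dot(xs), sum(c * x for c, x in zip(COEFFS, xs)))  # both print 12
```

In hardware the LUT is a small ROM and the accumulate-and-shift is one adder, which is why DA suits multiplier-poor fabrics.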

  2. Strategic Control Algorithm Development : Volume 4A. Computer Program Report.

    DOT National Transportation Integrated Search

    1974-08-01

    A description of the strategic algorithm evaluation model is presented, both at the user and programmer levels. The model representation of an airport configuration, environmental considerations, the strategic control algorithm logic, and the airplan...

  3. Freight data architecture business process, logical data model, and physical data model.

    DOT National Transportation Integrated Search

    2014-09-01

    This document summarizes the study team's efforts to establish data-sharing partnerships and relay the lessons learned. In addition, it provides information on a prototype freight data architecture and supporting description and specifications ...

  4. Strategic Control Algorithm Development : Volume 4B. Computer Program Report (Concluded)

    DOT National Transportation Integrated Search

    1974-08-01

    A description of the strategic algorithm evaluation model is presented, both at the user and programmer levels. The model representation of an airport configuration, environmental considerations, the strategic control algorithm logic, and the airplan...

  5. Can Quantum-Mechanical Description of Physical Reality Be Considered Correct?

    NASA Astrophysics Data System (ADS)

    Brassard, Gilles; Méthot, André Allan

    2010-04-01

    In an earlier paper written in loving memory of Asher Peres, we gave a critical analysis of the celebrated 1935 paper in which Einstein, Podolsky and Rosen (EPR) challenged the completeness of quantum mechanics. There, we had pointed out logical shortcomings in the EPR paper. Now, we raise additional questions concerning their suggested program to find a theory that would “provide a complete description of the physical reality”. In particular, we investigate the extent to which the EPR argumentation could have led to the more dramatic conclusion that quantum mechanics is in fact incorrect. With this in mind, we propose a speculation, made necessary by a logical shortcoming in the EPR paper caused by the lack of a necessary condition for “elements of reality”, and surmise that an eventually complete theory would either be inconsistent with quantum mechanics, or would at least violate Heisenberg’s Uncertainty Principle.

  6. Modeling a description logic vocabulary for cancer research.

    PubMed

    Hartel, Frank W; de Coronado, Sherri; Dionne, Robert; Fragoso, Gilberto; Golbeck, Jennifer

    2005-04-01

    The National Cancer Institute has developed the NCI Thesaurus, a biomedical vocabulary for cancer research, covering terminology across a wide range of cancer research domains. A major design goal of the NCI Thesaurus is to facilitate translational research. We describe: the features of Ontylog, a description logic used to build NCI Thesaurus; our methodology for enhancing the terminology through collaboration between ontologists and domain experts, and for addressing certain real world challenges arising in modeling the Thesaurus; and finally, we describe the conversion of NCI Thesaurus from Ontylog into Web Ontology Language Lite. Ontylog has proven well suited for constructing big biomedical vocabularies. We have capitalized on the Ontylog constructs Kind and Role in the collaboration process described in this paper to facilitate communication between ontologists and domain experts. The artifacts and processes developed by NCI for collaboration may be useful in other biomedical terminology development efforts.

  7. 'Swapna' in the Indian classics: Mythology or science?

    PubMed

    Tendulkar, Sonali S; Dwivedi, R R

    2010-04-01

    There are many concepts in Ayurveda, as well as in the other ancient sciences, that remain untouched or unexplored. One such concept is that of the Swapna (dreams). Being an abstract phenomenon, it is difficult to explain and understand; probably because of this, the descriptions related to Swapna in the Indian classics are supported by mythology, to make them acceptable. Variations in these explanations appear according to the objective of the school of thought: in the ancient texts, dreams are used to delve into knowledge of the Atman and are related to spirituality, whereas their description in the Ayurvedic texts revolves around the Sharira and Manas. Although all these explanations seem shrouded in uncertainty and mythology, there definitely seems to be a logical and rational science behind these quotations. They only need research, investigation, and explanation on the basis of logic, and a laboratory.

  8. ‘Swapna’ in the Indian classics: Mythology or science?

    PubMed Central

    Tendulkar, Sonali S.; Dwivedi, R. R.

    2010-01-01

    There are many concepts in Ayurveda, as well as in the other ancient sciences, that remain untouched or unexplored. One such concept is that of the Swapna (dreams). Being an abstract phenomenon, it is difficult to explain and understand; probably because of this, the descriptions related to Swapna in the Indian classics are supported by mythology, to make them acceptable. Variations in these explanations appear according to the objective of the school of thought: in the ancient texts, dreams are used to delve into knowledge of the Atman and are related to spirituality, whereas their description in the Ayurvedic texts revolves around the Sharira and Manas. Although all these explanations seem shrouded in uncertainty and mythology, there definitely seems to be a logical and rational science behind these quotations. They only need research, investigation, and explanation on the basis of logic, and a laboratory. PMID:22131706

  9. Designing a Virtual-Memory Implementation Using the Motorola MC68010 16- Bit Microprocessor with Multi-Processor Capability Interfaced to the VMEbus

    DTIC Science & Technology

    1990-06-01

    RAM and ROM output enable signals. Figure C.7 shows the logic for the interrupt priority level (IPL0* through IPL2*) and the interrupt acknowledge... IACK681* signal is sent to the DUART when a level one interrupt acknowledge is output by the CPU. The logic for the IACK681* and the IPL0* through IPL2... signals are actually implemented with an EPLD. Listing D.4 in Appendix D presents the ABEL description of the IACK681* and IPL0* through IPL2

  10. F-15 digital electronic engine control system description

    NASA Technical Reports Server (NTRS)

    Myers, L. P.

    1984-01-01

    A digital electronic engine control (DEEC) was developed for use on the F100-PW-100 turbofan engine. This control system has full-authority control, capable of moving all the controlled variables over their full ranges. The digital computational electronics and the fault detection and accommodation logic maintain safe engine operation. A hydromechanical backup control (BUC) is an integral part of the fuel metering unit and provides gas generator control at a reduced performance level in the event of an electronics failure. The DEEC's features, hardware, and major logic diagrams are described.

  11. Proof Search in an Authorization Logic

    DTIC Science & Technology

    2009-04-14

    and Itay Neeman. DKAL: Distributed-knowledge authorization language. In Proceedings of the 21st IEEE Symposium on Computer Security Foundations (CSF...21), 2008. [33] Yuri Gurevich and Itay Neeman. The logic of infons. Technical report, Microsoft Research, 2009. [34] Joshua S. Hodas and Dale Miller

  12. A logical foundation for representation of clinical data.

    PubMed Central

    Campbell, K E; Das, A K; Musen, M A

    1994-01-01

    OBJECTIVE: A general framework for representation of clinical data that provides a declarative semantics of terms and that allows developers to define explicitly the relationships among both terms and combinations of terms. DESIGN: Use of conceptual graphs as a standard representation of logic and of an existing standardized vocabulary, the Systematized Nomenclature of Medicine (SNOMED International), for lexical elements. Concepts such as time, anatomy, and uncertainty must be modeled explicitly in a way that allows relation of these foundational concepts to surface-level clinical descriptions in a uniform manner. RESULTS: The proposed framework was used to model a simple radiology report, which included temporal references. CONCLUSION: Formal logic provides a framework for formalizing the representation of medical concepts. Actual implementations will be required to evaluate the practicality of this approach. PMID:7719805

  13. High bit rate convolutional channel encoder/decoder

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A detailed description of the design approach and tradeoffs encountered during the development of the 50 MBPS decoder system is presented. A functional analysis of each of the major logical functions is given, and the system's major components are listed.

  14. Is Science Logical?

    ERIC Educational Resources Information Center

    Pease, Craig M.; Bull, J. J.

    1992-01-01

    Offers a concise, abstract description of the scientific method different from the historical, philosophical, and case-study approaches, which lead to comprehension of this method. Discusses features of scientific models, dynamic interactions underlying scientific progress, ways that scientists successfully understand nature, mechanisms for…

  15. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen

    1988-01-01

    Presents reviews of six computer software programs for teaching science. Provides the publisher, grade level, cost, and descriptions of software, including: (1) "Recycling Logic"; (2) "Introduction to Biochemistry"; (3) "Food for Thought"; (4) "Watts in a Home"; (5) "Geology in Action"; and (6)…

  16. Automating generation of textual class definitions from OWL to English.

    PubMed

    Stevens, Robert; Malone, James; Williams, Sandra; Power, Richard; Third, Allan

    2011-05-17

    Text definitions for entities within bio-ontologies are a cornerstone of the effort to gain a consensus in understanding and usage of those ontologies. Writing these definitions is, however, a considerable effort and there is often a lag between specification of the main part of an ontology (logical descriptions and definitions of entities) and the development of the text-based definitions. The goal of natural language generation (NLG) from ontologies is to take the logical description of entities and generate fluent natural language. The application described here uses NLG to automatically provide text-based definitions from an ontology that has logical descriptions of its entities, so avoiding the bottleneck of authoring these definitions by hand. To produce the descriptions, the program collects all the axioms relating to a given entity, groups them according to common structure, realises each group through an English sentence, and assembles the resulting sentences into a paragraph, to form as 'coherent' a text as possible without human intervention. Sentence generation is accomplished using a generic grammar based on logical patterns in OWL, together with a lexicon for realising atomic entities. We have tested our output for the Experimental Factor Ontology (EFO) using a simple survey strategy to explore the fluency of the generated text and how well it conveys the underlying axiomatisation. Two rounds of survey and improvement show that overall the generated English definitions are found to convey the intended meaning of the axiomatisation in a satisfactory manner. The surveys also suggested that one form of generated English will not be universally liked; that intrusion of too much 'formal ontology' was not liked; and that too much explicit exposure of OWL semantics was also not liked. Our prototype tools can generate reasonable paragraphs of English text that can act as definitions. 
The definitions were found acceptable by our survey and, as a result, the developers of EFO are sufficiently satisfied with the output that the generated definitions have been incorporated into EFO. Whilst not a substitute for hand-written textual definitions, our generated definitions are a useful starting point. An on-line version of the NLG text definition tool can be found at http://swat.open.ac.uk/tools/. The questionnaire and sample generated text definitions may be found at http://mcs.open.ac.uk/nlg/SWAT/bio-ontologies.html.
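The generation pipeline described above can be sketched with toy axioms and templates (these are assumptions, not the EFO grammar or lexicon): collect the axioms about an entity, group them by common structure, aggregate shared objects, and realise each group as an English sentence.

```python
# Toy axioms as (relation, subject, object) triples; invented for illustration.
axioms = [
    ("subClassOf", "fibroblast", "cell"),
    ("part_of", "fibroblast", "connective tissue"),
    ("part_of", "fibroblast", "stroma"),
]

# One sentence template per axiom structure (a stand-in for a real grammar).
TEMPLATES = {
    "subClassOf": "A {s} is a kind of {o}.",
    "part_of":    "A {s} is part of {o}.",
}

def realise(entity, axioms):
    """Group the entity's axioms by relation, aggregate the objects of each
    group, realise one sentence per group, and join into a paragraph."""
    groups = {}
    for rel, s, o in axioms:
        if s == entity:
            groups.setdefault(rel, []).append(o)
    sentences = []
    for rel, objs in groups.items():
        joined = " and ".join(objs)     # aggregate axioms with shared structure
        sentences.append(TEMPLATES[rel].format(s=entity, o=joined))
    return " ".join(sentences)

print(realise("fibroblast", axioms))
# A fibroblast is a kind of cell. A fibroblast is part of connective tissue and stroma.
```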

  17. Automating generation of textual class definitions from OWL to English

    PubMed Central

    2011-01-01

    Background Text definitions for entities within bio-ontologies are a cornerstone of the effort to gain a consensus in understanding and usage of those ontologies. Writing these definitions is, however, a considerable effort and there is often a lag between specification of the main part of an ontology (logical descriptions and definitions of entities) and the development of the text-based definitions. The goal of natural language generation (NLG) from ontologies is to take the logical description of entities and generate fluent natural language. The application described here uses NLG to automatically provide text-based definitions from an ontology that has logical descriptions of its entities, so avoiding the bottleneck of authoring these definitions by hand. Results To produce the descriptions, the program collects all the axioms relating to a given entity, groups them according to common structure, realises each group through an English sentence, and assembles the resulting sentences into a paragraph, to form as ‘coherent’ a text as possible without human intervention. Sentence generation is accomplished using a generic grammar based on logical patterns in OWL, together with a lexicon for realising atomic entities. We have tested our output for the Experimental Factor Ontology (EFO) using a simple survey strategy to explore the fluency of the generated text and how well it conveys the underlying axiomatisation. Two rounds of survey and improvement show that overall the generated English definitions are found to convey the intended meaning of the axiomatisation in a satisfactory manner. The surveys also suggested that one form of generated English will not be universally liked; that intrusion of too much ‘formal ontology’ was not liked; and that too much explicit exposure of OWL semantics was also not liked. Conclusions Our prototype tools can generate reasonable paragraphs of English text that can act as definitions. 
The definitions were found acceptable by our survey and, as a result, the developers of EFO are sufficiently satisfied with the output that the generated definitions have been incorporated into EFO. Whilst not a substitute for hand-written textual definitions, our generated definitions are a useful starting point. Availability An on-line version of the NLG text definition tool can be found at http://swat.open.ac.uk/tools/. The questionnaire and sample generated text definitions may be found at http://mcs.open.ac.uk/nlg/SWAT/bio-ontologies.html. PMID:21624160

  18. Design and performance evaluation of a distributed OFDMA-based MAC protocol for MANETs.

    PubMed

    Park, Jaesung; Chung, Jiyoung; Lee, Hyungyu; Lee, Jung-Ryun

    2014-01-01

    In this paper, we propose a distributed MAC protocol for OFDMA-based wireless mobile ad hoc multihop networks, in which the resource reservation and data transmission procedures are operated in a distributed manner. A frame format is designed considering the characteristic of OFDMA that each node can transmit or receive data to or from multiple nodes simultaneously. Under this frame structure, we propose a distributed resource management method including network state estimation and resource reservation processes. We categorize five types of logical errors according to their root causes and show that two of the logical errors are inevitable while three of them are avoided under the proposed distributed MAC protocol. In addition, we provide a systematic method to determine the advertisement period of each node by presenting a clear relation between the accuracy of estimated network states and the signaling overhead. We evaluate the performance of the proposed protocol with respect to the reservation success rate and the success rate of data transmission. Since our method focuses on avoiding logical errors, it can easily be placed on top of other resource allocation methods that focus on the physical layer issues of the resource management problem and interworked with them.

  19. Elements of decisional dynamics: An agent-based approach applied to artificial financial market

    NASA Astrophysics Data System (ADS)

    Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille

    2018-02-01

    This paper introduces an original mathematical framework for describing agents' decision-making processes in problems affected by both individual and collective behaviors, in systems characterized by nonlinear, path-dependent, and self-organizing interactions. An application to artificial financial markets is proposed by designing a multi-agent system based on the proposed formalization. In this application, agents' decision-making is based on fuzzy logic rules, and the price dynamics is purely deterministic according to the basic matching rules of a central order book. Finally, while putting most parameters under evolutionary control, the computational agent-based system is able to replicate several stylized facts of financial time series (distributions of stock returns showing a heavy tail with positive excess kurtosis, absence of autocorrelations in stock returns, and the volatility clustering phenomenon).
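
    The deterministic price dynamics rests on standard central order book matching. A minimal price-time-priority order book might look like the following sketch (class and method names are ours, not from the paper; the trade-at-ask convention is a simplification):

```python
import heapq

class OrderBook:
    """Minimal central limit order book with price-time priority:
    prices emerge deterministically from matching, as in the paper's
    purely deterministic price dynamics (illustrative sketch only)."""
    def __init__(self):
        self._bids = []   # max-heap via negated price
        self._asks = []   # min-heap
        self._seq = 0     # monotonically increasing time-priority tie-breaker
        self.trades = []  # list of (price, quantity)

    def submit(self, side, price, qty):
        self._seq += 1
        if side == "buy":
            heapq.heappush(self._bids, (-price, self._seq, [qty]))
        else:
            heapq.heappush(self._asks, (price, self._seq, [qty]))
        self._match()

    def _match(self):
        # Cross the book while the best bid meets or exceeds the best ask.
        while self._bids and self._asks and -self._bids[0][0] >= self._asks[0][0]:
            bid, ask = self._bids[0], self._asks[0]
            traded = min(bid[2][0], ask[2][0])
            self.trades.append((ask[0], traded))  # trade at the ask price (simplified)
            bid[2][0] -= traded
            ask[2][0] -= traded
            if bid[2][0] == 0:
                heapq.heappop(self._bids)
            if ask[2][0] == 0:
                heapq.heappop(self._asks)

book = OrderBook()
book.submit("sell", 101.0, 5)
book.submit("sell", 100.0, 5)
book.submit("buy", 100.5, 7)
print(book.trades)  # → [(100.0, 5)]
```

    The buy order crosses only the 100.0 ask; its 2-unit remainder rests in the book below the 101.0 ask, so the outcome is fully determined by the matching rules.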

  20. Elements of decisional dynamics: An agent-based approach applied to artificial financial market.

    PubMed

    Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille

    2018-02-01

    This paper introduces an original mathematical framework for describing agents' decision-making processes in problems affected by both individual and collective behaviors, in systems characterized by nonlinear, path-dependent, and self-organizing interactions. An application to artificial financial markets is proposed by designing a multi-agent system based on the proposed formalization. In this application, agents' decision-making is based on fuzzy logic rules, and the price dynamics is purely deterministic according to the basic matching rules of a central order book. Finally, while putting most parameters under evolutionary control, the computational agent-based system is able to replicate several stylized facts of financial time series (distributions of stock returns showing a heavy tail with positive excess kurtosis, absence of autocorrelations in stock returns, and the volatility clustering phenomenon).

  1. Lyceum: A Multi-Protocol Digital Library Gateway

    NASA Technical Reports Server (NTRS)

    Maa, Ming-Hokng; Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    Lyceum is a prototype scalable query gateway that provides a logically central interface to multi-protocol and physically distributed, digital libraries of scientific and technical information. Lyceum processes queries to multiple syntactically distinct search engines used by various distributed information servers from a single logically central interface without modification of the remote search engines. A working prototype (http://www.larc.nasa.gov/lyceum/) demonstrates the capabilities, potentials, and advantages of this type of meta-search engine by providing access to over 50 servers covering over 20 disciplines.

  2. Reconfigurable logic via gate controlled domain wall trajectory in magnetic network structure

    PubMed Central

    Murapaka, C.; Sethi, P.; Goolaup, S.; Lew, W. S.

    2016-01-01

    An all-magnetic logic scheme has the advantages of being non-volatile and energy efficient over the conventional transistor based logic devices. In this work, we present a reconfigurable magnetic logic device which is capable of performing all basic logic operations in a single device. The device exploits the deterministic trajectory of domain wall (DW) in ferromagnetic asymmetric branch structure for obtaining different output combinations. The programmability of the device is achieved by using a current-controlled magnetic gate, which generates a local Oersted field. The field generated at the magnetic gate influences the trajectory of the DW within the structure by exploiting its inherent transverse charge distribution. DW transformation from vortex to transverse configuration close to the output branch plays a pivotal role in governing the DW chirality and hence the output. By simply switching the current direction through the magnetic gate, two universal logic gate functionalities can be obtained in this device. Using magnetic force microscopy imaging and magnetoresistance measurements, all basic logic functionalities are demonstrated. PMID:26839036

  3. Hardware synthesis from DDL description. [simulating a digital system for computerized design of large scale integrated circuits

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.; Shah, A. M.

    1980-01-01

    The details of digital systems can be conveniently input into the design automation system by means of hardware description language (HDL). The computer aided design and test (CADAT) system at NASA MSFC is used for the LSI design. The digital design language (DDL) was selected as HDL for the CADAT System. DDL translator output can be used for the hardware implementation of the digital design. Problems of selecting the standard cells from the CADAT standard cell library to realize the logic implied by the DDL description of the system are addressed.

  4. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services

    PubMed Central

    Gessler, Damian DG; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-01-01

    Background SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. Results There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remainder are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at , developer tools at , and a portal to third-party ontologies at (a "swap meet"). Conclusion SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation.
SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs. PMID:19775460

  5. An Interval Type-2 Fuzzy Multiple Echelon Supply Chain Model

    NASA Astrophysics Data System (ADS)

    Miller, Simon; John, Robert

    Planning resources for a supply chain is a major factor determining its success or failure. In this paper we build on previous work introducing an Interval Type-2 Fuzzy Logic model of a multiple echelon supply chain. It is believed that the additional degree of uncertainty provided by Interval Type-2 Fuzzy Logic will allow for better representation of the uncertainty and vagueness present in resource planning models. First, the subject of Supply Chain Management is introduced, then some background is given on related work using Type-1 Fuzzy Logic. A description of the Interval Type-2 Fuzzy model is given, and a test scenario detailed. A Genetic Algorithm uses the model to search for a near-optimal plan for the scenario. A discussion of the results follows, along with conclusions and details of intended further work.
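
    An interval type-2 membership function can be pictured as a band between a lower and an upper type-1 membership function, so each input maps to an interval of membership degrees rather than a single number. A minimal sketch with purely illustrative triangular parameters (not taken from the paper's supply chain model):

```python
def it2_triangular(x, lower_params, upper_params):
    """Interval type-2 membership: returns (lower, upper) membership bounds.
    The footprint of uncertainty is the band between an inner and an outer
    type-1 triangular membership function sharing the same peak."""
    def tri(x, a, b, c):
        # Standard type-1 triangular membership with feet a, c and peak b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    lo = tri(x, *lower_params)
    hi = tri(x, *upper_params)
    return min(lo, hi), max(lo, hi)

# Hypothetical "demand around 50 units": the outer triangle is wider,
# so intermediate demands carry an interval of membership degrees.
lo, hi = it2_triangular(40.0, (30, 50, 70), (20, 50, 80))
print(lo, hi)
```

    The extra width of the interval at each point is the "additional degree of uncertainty" that motivates using interval type-2 sets for vague resource-planning quantities.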

  6. The FTA Method And A Possibility Of Its Application In The Area Of Road Freight Transport

    NASA Astrophysics Data System (ADS)

    Poliaková, Adela

    2015-06-01

    The Fault Tree process utilizes logic diagrams to portray and analyse potentially hazardous events. Three basic symbols (logic gates) are adequate for diagramming any fault tree. However, additional recently developed symbols can be used to reduce the time and effort required for analysis. A fault tree is a graphical representation of the relationship between certain specific events and the ultimate undesired event (2). This paper provides a basic description of the Fault Tree Analysis method and a practical view of its possible application to quality improvement in a road freight transport company.
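
    Under the usual assumption of independent basic events, the two fundamental gates reduce to simple probability formulas. A minimal sketch (the failure events and probabilities below are hypothetical, not from the paper):

```python
def and_gate(*probs):
    """AND gate: the output event occurs only if all inputs occur
    (independent basic events assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """OR gate: the output event occurs if any input occurs
    (independent basic events assumed)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical road-freight tree: a late delivery (top event) requires
# (vehicle breakdown OR driver unavailable) AND the dispatch backup failing.
p_top = and_gate(or_gate(0.05, 0.02), 0.10)
print(f"P(late delivery) = {p_top:.4f}")
```

    Evaluating the tree bottom-up this way is the quantitative step that follows the qualitative diagramming the paper describes.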

  7. Research in mathematical theory of computation. [computer programming applications

    NASA Technical Reports Server (NTRS)

    Mccarthy, J.

    1973-01-01

    Research progress in the following areas is reviewed: (1) a new version of the computer program LCF (logic for computable functions), including a facility to search for proofs automatically; (2) the description of the language PASCAL in terms of both LCF and first order logic; (3) discussion of LISP semantics in LCF and an attempt to prove the correctness of the London compilers in a formal way; (4) design of both special purpose and domain independent proving procedures with program correctness specifically in mind; (5) design of languages for describing such proof procedures; and (6) the embedding of these ideas in the first order checker.

  8. Machine Learning-based Intelligent Formal Reasoning and Proving System

    NASA Astrophysics Data System (ADS)

    Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia

    2018-03-01

    Reasoning systems can be used in many fields, and improving reasoning efficiency is at the core of their design. Through a formal description of proofs and a rule-matching algorithm, combined with a machine learning algorithm, the proposed system for intelligent formal reasoning and verification achieves high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional reasoning results, so as to obtain the implicit knowledge in the knowledge base and provide a basic reasoning model for the construction of intelligent systems.

  9. Index to Computer Assisted Instruction.

    ERIC Educational Resources Information Center

    Lekan, Helen A., Ed.

    The computer assisted instruction (CAI) programs and projects described in this index are listed by subject matter. The index gives the program name, author, source, description, prerequisites, level of instruction, type of student, average completion time, logic and program, purpose for which program was designed, supplementary…

  10. Logic Gates Made of N-Channel JFETs and Epitaxial Resistors

    NASA Technical Reports Server (NTRS)

    Krasowski, Michael J.

    2008-01-01

    Prototype logic gates made of n-channel junction field-effect transistors (JFETs) and epitaxial resistors have been demonstrated, with a view toward eventual implementation of digital logic devices and systems in silicon carbide (SiC) integrated circuits (ICs). This development is intended to exploit the inherent ability of SiC electronic devices to function at temperatures from 300 to somewhat above 500 C and to withstand large doses of ionizing radiation. SiC-based digital logic devices and systems could enable operation of sensors and robots in nuclear reactors, in jet engines, near hydrothermal vents, and in other environments that are so hot or radioactive as to cause conventional silicon electronic devices to fail. At present, needs for digital processing at high temperatures exceed SiC integrated circuit production capabilities, which do not allow for highly integrated circuits: only single components, or small numbers of components (depletion-mode n-channel JFETs and epitaxial resistors), can be produced on a single substrate. As a consequence, fine matching of components is impossible, resulting in rather large direct-current parameter distributions within a group of transistors, typically spanning multiples of 5 to 10. Added to this are the lack of p-channel devices to complement the n-channel FETs, the lack of precise dropping diodes, and the lack of enhancement-mode devices at these elevated temperatures, which together make conventional direct-coupled and buffered direct-coupled logic gate design techniques unusable. The presented logic gate design is tolerant of device parameter distributions and is not hampered by the lack of complementary devices or dropping diodes. In addition to n-channel JFETs, these gates include level-shifting and load resistors (see figure).
Instead of relying on precise matching of parameters among individual JFETS, these designs rely on choosing the values of these resistors and of supply potentials so as to make the circuits perform the desired functions throughout the ranges over which the parameters of the JFETs are distributed. The supply rails V(sub dd) and V(sub ss) and the resistors R are chosen as functions of the distribution of direct-current operating parameters of the group of transistors used.
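
    The design philosophy of choosing supply rails and resistor values to tolerate a wide JFET parameter spread can be illustrated with a first-order check. All component values and the switching threshold below are invented for illustration; real SiC JFET behavior is considerably more involved.

```python
def inverter_levels(idss, vdd, vss, r_load):
    """Crude level check for a resistor-loaded n-JFET inverter stage.
    ON: the device sinks roughly its saturation current idss, pulling the
    output toward vss; OFF: no channel current, so the output rests near
    vdd. (First-order sketch only.)"""
    v_on = max(vss, vdd - idss * r_load)   # output when the JFET conducts
    v_off = vdd                            # output when the JFET is cut off
    return v_on, v_off

# Check that logic levels survive a 10x spread in Idss (hypothetical values).
vdd, vss, r_load = 15.0, -15.0, 10e3
threshold = 5.0  # assumed switching threshold of the next stage
for idss in (1.5e-3, 5e-3, 15e-3):   # amperes, spanning a 10x spread
    v_on, v_off = inverter_levels(idss, vdd, vss, r_load)
    assert v_on < threshold < v_off, f"levels fail at Idss={idss}"
print("logic levels hold across the Idss spread")
```

    The point is that the margin comes from the resistor values and rail voltages, not from matched transistors, which mirrors the tolerance argument in the record above.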

  11. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples combining statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.
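
    The classical PRISM-style distribution semantics that the paper extends can be sketched in a few lines: random switches induce a distribution over interpretations, and a formula's probability is the total mass of the interpretations that satisfy it. The switch names and probabilities below are hypothetical.

```python
import itertools

# Random switches with their probability of coming up "heads".
switches = {"coin1": 0.6, "coin2": 0.7}

def prob(formula):
    """Sum P(world) over all truth assignments to the switches that
    satisfy `formula` (brute-force enumeration of the sample space)."""
    total = 0.0
    names = list(switches)
    for values in itertools.product([True, False], repeat=len(names)):
        world = dict(zip(names, values))
        p = 1.0
        for name, heads in world.items():
            p *= switches[name] if heads else 1.0 - switches[name]
        if formula(world):
            total += p
    return total

# Horn clause: win :- heads(coin1), heads(coin2).
win = lambda w: w["coin1"] and w["coin2"]
print(round(prob(win), 6))  # → 0.42
```

    The quantum extension replaces these probability measures with amplitudes on H-interpretations, but the underlying "probability of a formula = mass of satisfying worlds" reading is the same.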

  12. Program logic: a framework for health program design and evaluation - the Pap nurse in general practice program.

    PubMed

    Hallinan, Christine M

    2010-01-01

    In this paper, program logic will be used to 'map out' the planning, development and evaluation of the general practice Pap nurse program in the Australian general practice arena. The incorporation of program logic into the evaluative process supports a greater appreciation of the theoretical assumptions and external influences that underpin general practice Pap nurse activity. The creation of a program logic model is a conscious strategy that results in an explicit understanding of the challenges ahead, the resources available and the time frames for outcomes. Program logic also enables a recognition that all players in the general practice arena need to be acknowledged by policy makers, bureaucrats and program designers when addressing, through policy, issues relating to equity and accessibility of health initiatives. Logic modelling allows decision makers to consider the complexities of causal associations when developing health care proposals and programs. It enables the Pap nurse in general practice program to be represented diagrammatically by linking outcomes (short, medium and long term) with both the program activities and program assumptions. The research methodology used in the evaluation of the Pap nurse in general practice program includes a descriptive study design and the incorporation of program logic, with a retrospective analysis of Australian data from 2001 to 2009. For the purposes of gaining both empirical and contextual data for this paper, a data set analysis and literature review were performed. The application of program logic as an evaluative tool for analysis of the Pap PN incentive program facilitates a greater understanding of complex general practice activity triggers, and allows this greater understanding to be incorporated into policy to facilitate Pap PN activity, increase general practice cervical smear rates and ultimately decrease the burden of disease.

  13. Army Battlefield Distribution Through the Lens of OIF: Logical Failures and the Way Ahead

    DTIC Science & Technology

    2005-02-02

    Historical Context of Logistics and Distribution Management Transformation ... Theater Distribution Units ... Figure 1. Distribution Management Center ... consumer and a potential provider of logistics ... The critical role of

  14. CATS Household Travel Survey, Volume One: Documentation for the Chicago Central Business District

    DOT National Transportation Integrated Search

    1989-09-01

    This report contains descriptions of the surveying concepts, the editing and coding logic, the data base structure, several summary tables and the data base for the Chicago Central Business District. Also, because the data at this time are unfa...

  15. An Exchange on "Truth and Methods."

    ERIC Educational Resources Information Center

    Caughie, Pamela L.; Dasenbrock, Reed Way

    1996-01-01

    Takes issue with Reed Way Dasenbrock's criticism of literary theory and the terms under which literary interpretation and discussion take place. Presents Dasenbrock's reply, which discusses his understanding of certain terms (evidence, truth, debate), his description of the problem, and the logical contradictions he finds internal to…

  16. Conceptualizing Organizational Climates. Research Report No. 7.

    ERIC Educational Resources Information Center

    Schneider, Benjamin

    Part 1 of this paper presents some logical and conceptual distinctions between job satisfaction and organizational climate, the former being viewed as micro, evaluative, individual perceptions of personal events and experiences, and the latter as macro, relatively descriptive, organizational level perceptions that are abstractions of organizational…

  17. DDL: Digital systems design language

    NASA Technical Reports Server (NTRS)

    Shival, S. G.

    1980-01-01

    Hardware description languages are valuable tools in such applications as hardware design, system documentation, and logic design training. DDL is a convenient medium for inputting design details into a hardware-design automation system. It is suitable for describing digital systems at the gate, register transfer, and major combinational block levels.

  18. It Takes Two to Tango: Customization and Standardization as Colluding Logics in Healthcare Comment on "(Re) Making the Procrustean Bed Standardization and Customization as Competing Logics in Healthcare".

    PubMed

    Greenfield, David; Eljiz, Kathy; Butler-Henderson, Kerryn

    2017-06-28

    The healthcare context is characterized by new developments, technologies, ideas and expectations that are continually reshaping the frontline of care delivery. Mannion and Exworthy identify two key factors driving this complexity, 'standardization' and 'customization,' and their apparent resulting paradox to be negotiated by healthcare professionals, managers and policy makers. However, while they present a compelling argument, an alternative viewpoint exists. An analysis is presented showing that, instead of being 'competing' logics in healthcare, standardization and customization are long-standing 'colluding' logics. Mannion and Exworthy's call for further sustained work to understand this complex, contested space is endorsed, noting that it is critical to inform future debates and service decisions. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  19. Modeling of single event transients with dual double-exponential current sources: Implications for logic cell characterization

    DOE PAGES

    Black, Dolores Archuleta; Robinson, William H.; Wilcox, Ian Zachary; ...

    2015-08-07

    Single event effects (SEE) are a reliability concern for modern microelectronics. Bit corruptions can be caused by single event upsets (SEUs) in the storage cells or by sampling single event transients (SETs) from a logic path. Likewise, an accurate prediction of soft error susceptibility from SETs requires good models to convert collected charge into compact descriptions of the current injection process. This paper describes a simple, yet effective, method to model the current waveform resulting from a charge collection event for SET circuit simulations. The model uses two double-exponential current sources in parallel, and the results illustrate why a conventional model based on one double-exponential source can be incomplete. Furthermore, a small set of logic cells with varying input conditions, drive strength, and output loading are simulated to extract the parameters for the dual double-exponential current sources. As a result, the parameters are based upon both the node capacitance and the restoring current (i.e., drive strength) of the logic cell.
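
    The dual double-exponential waveform itself is straightforward to reproduce. The paper extracts its parameters from circuit simulation; the time constants and peak currents below are invented for illustration only.

```python
import math

def double_exp(t, i_peak, tau_rise, tau_fall):
    """Single double-exponential current pulse (the classic SET injection model)."""
    if t < 0:
        return 0.0
    return i_peak * (math.exp(-t / tau_fall) - math.exp(-t / tau_rise))

def dual_double_exp(t, fast, slow):
    """Two double-exponential sources in parallel, as in the paper's model:
    a fast component for prompt charge collection and a slower one for
    diffusion (parameter tuples are (i_peak, tau_rise, tau_fall))."""
    return double_exp(t, *fast) + double_exp(t, *slow)

fast = (1.2e-3, 5e-12, 50e-12)    # amps, rise tau, fall tau (illustrative)
slow = (0.2e-3, 50e-12, 500e-12)
current = [dual_double_exp(t * 1e-12, fast, slow) for t in range(0, 1000, 10)]
print(f"peak current ≈ {max(current)*1e3:.2f} mA")
```

    Summing the two components produces the sharp prompt peak followed by a long tail that a single double-exponential source cannot capture with one set of time constants.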

  20. Rocketdyne Safety Algorithm: Space Shuttle Main Engine Fault Detection

    NASA Technical Reports Server (NTRS)

    Norman, Arnold M., Jr.

    1994-01-01

    The Rocketdyne Safety Algorithm (RSA) has been developed to the point of use on the TTBE at MSFC on Task 4 of LeRC contract NAS3-25884. This document contains a description of the work performed, the results of the nominal test of the major anomaly test cases and a table of the resulting cutoff times, a plot of the RSA value vs. time for each anomaly case, a logic flow description of the algorithm, the algorithm code, and a development plan for future efforts.

  1. Toward Question-Asking Machines: The Logic of Questions and the Inquiry Calculus

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2005-01-01

    For over a century, the study of logic has focused on the algebra of logical statements. This work, first performed by George Boole, has led to the development of modern computers, and was shown by Richard T. Cox to be the foundation of Bayesian inference. Meanwhile the logic of questions has been much neglected. For our computing machines to be truly intelligent, they need to be able to ask relevant questions. In this paper I will show how the Boolean lattice of logical statements gives rise to the free distributive lattice of questions thus defining their algebra. Furthermore, there exists a quantity analogous to probability, called relevance, which quantifies the degree to which one question answers another. I will show that relevance is not only a natural generalization of information theory, but also forms its foundation.

  2. Classical Limit and Quantum Logic

    NASA Astrophysics Data System (ADS)

    Losada, Marcelo; Fortin, Sebastian; Holik, Federico

    2018-02-01

    The analysis of the classical limit of quantum mechanics usually focuses on the state of the system. The general idea is to explain the disappearance of the interference terms of quantum states appealing to the decoherence process induced by the environment. However, in these approaches it is not explained how the structure of quantum properties becomes classical. In this paper, we consider the classical limit from a different perspective. We consider the set of properties of a quantum system and we study the quantum-to-classical transition of its logical structure. The aim is to open the door to a new study based on dynamical logics, that is, logics that change over time. In particular, we appeal to the notion of hybrid logics to describe semiclassical systems. Moreover, we consider systems with many characteristic decoherence times, whose sublattices of properties become distributive at different times.

  3. Design of automata theory of cubical complexes with applications to diagnosis and algorithmic description

    NASA Technical Reports Server (NTRS)

    Roth, J. P.

    1972-01-01

    The following problems are considered: (1) methods for development of logic design together with algorithms, so that it is possible to compute a test for any failure in the logic design, if such a test exists, and developing algorithms and heuristics for the purpose of minimizing the computation for tests; and (2) a method of design of logic for ultra LSI (large scale integration). It was discovered that the so-called quantum calculus can be extended to render it possible: (1) to describe the functional behavior of a mechanism component by component, and (2) to compute tests for failures, in the mechanism, using the diagnosis algorithm. The development of an algorithm for the multioutput two-level minimization problem is presented and the program MIN 360 was written for this algorithm. The program has options of mode (exact minimum or various approximations), cost function, cost bound, etc., providing flexibility.
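
    The core of test computation, finding an input vector on which the good and faulty circuits differ, can be shown with a brute-force stand-in for the algorithmic procedures described above. The example circuit and the stuck-at fault are ours, chosen only to illustrate the idea.

```python
from itertools import product

def find_test(circuit, faulty):
    """Exhaustively search for an input vector that distinguishes the good
    circuit from a faulty copy; return None when no test exists (so the
    fault is undetectable). A brute-force stand-in for structured test
    generation algorithms."""
    for vec in product([0, 1], repeat=3):
        if circuit(*vec) != faulty(*vec):
            return vec
    return None

good = lambda a, b, c: (a & b) | c
stuck = lambda a, b, c: (0 & b) | c     # input 'a' stuck-at-0
print(find_test(good, stuck))  # → (1, 1, 0)
```

    The vector (1, 1, 0) both activates the fault (drives 'a' to 1) and propagates its effect to the output (b=1, c=0), which is exactly what structured algorithms compute without exhaustive search.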

  4. The semantics of fuzzy logic

    NASA Technical Reports Server (NTRS)

    Ruspini, Enrique H.

    1991-01-01

    Summarized here are the results of recent research on the conceptual foundations of fuzzy logic. The focus is primarily on the principal characteristics of a model that quantifies resemblance between possible worlds by means of a similarity function that assigns a number between 0 and 1 to every pair of possible worlds. Introduction of such a function permits one to interpret the major constructs and methods of fuzzy logic: conditional and unconditional possibility and necessity distributions and the generalized modus ponens of Zadeh, on the basis of related metric relationships between subsets of possible worlds.
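
    One schematic reading of this similarity semantics: a proposition B implies A to the degree that every B-world resembles some A-world. The following toy sketch uses an invented similarity kernel and world sets, and is only a loose illustration of the idea, not Ruspini's exact construction.

```python
def implication_degree(A, B, sim):
    """Degree to which B implies A under a similarity function on worlds:
    take the worst case over B-worlds of the best resemblance to an A-world."""
    return min(max(sim(w, v) for v in A) for w in B)

# Worlds are temperatures; similarity decays linearly with distance
# (an assumed kernel returning values in [0, 1]).
sim = lambda w, v: max(0.0, 1.0 - abs(w - v) / 10.0)
warm = [20, 22, 24]          # worlds where "warm" holds
mild = [18, 20]              # worlds where "mild" holds
print(implication_degree(warm, mild, sim))  # → 0.8
```

    With a crisp (0/1) similarity this degree collapses to ordinary set inclusion, which is why the framework generalizes classical implication.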

  5. Moral Education and the Perils of Developmentalism.

    ERIC Educational Resources Information Center

    Carr, David

    2002-01-01

    Discusses the conception of moral formation. Traces progress to moral maturity through well defined stages of cognitive, conative, and/or affective growth. Explains that the logical status of developmental theories is not clear. Argues that the accounts are more evaluative than descriptive. Explores the problematic moral educational implications of this…

  6. 44 CFR 360.2 - Description of program.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... application, to be accompanied by a Training and Education (T&E) plan for a total of three years, only the... three year comprehensive Training and Education Program planning can proceed in a timely and logical... HOMELAND SECURITY PREPAREDNESS STATE ASSISTANCE PROGRAMS FOR TRAINING AND EDUCATION IN COMPREHENSIVE...

  7. 44 CFR 360.2 - Description of program.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... application, to be accompanied by a Training and Education (T&E) plan for a total of three years, only the... three year comprehensive Training and Education Program planning can proceed in a timely and logical... HOMELAND SECURITY PREPAREDNESS STATE ASSISTANCE PROGRAMS FOR TRAINING AND EDUCATION IN COMPREHENSIVE...

  8. 44 CFR 360.2 - Description of program.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... application, to be accompanied by a Training and Education (T&E) plan for a total of three years, only the... three year comprehensive Training and Education Program planning can proceed in a timely and logical... HOMELAND SECURITY PREPAREDNESS STATE ASSISTANCE PROGRAMS FOR TRAINING AND EDUCATION IN COMPREHENSIVE...

  9. 44 CFR 360.2 - Description of program.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... application, to be accompanied by a Training and Education (T&E) plan for a total of three years, only the... three year comprehensive Training and Education Program planning can proceed in a timely and logical... HOMELAND SECURITY PREPAREDNESS STATE ASSISTANCE PROGRAMS FOR TRAINING AND EDUCATION IN COMPREHENSIVE...

  10. 44 CFR 360.2 - Description of program.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... application, to be accompanied by a Training and Education (T&E) plan for a total of three years, only the... three year comprehensive Training and Education Program planning can proceed in a timely and logical... HOMELAND SECURITY PREPAREDNESS STATE ASSISTANCE PROGRAMS FOR TRAINING AND EDUCATION IN COMPREHENSIVE...

  11. The Critical Thinking Worksheet.

    ERIC Educational Resources Information Center

    Agnew, Priscilla

    A description is provided of the use of a Critical Thinking Worksheet as a pedagogical tool for introducing critical thinking to students of philosophy and informal logic. First, an accompanying handout introduces essential terms such as "argument,""conclusion," and "reasons or premises." The worksheet itself constitutes the second page of the…

  12. The Student-Teacher-Computer Team: Focus on the Computer.

    ERIC Educational Resources Information Center

    Ontario Inst. for Studies in Education, Toronto.

    Descriptions of essential computer elements, logic and programing techniques, and computer applications are provided in an introductory handbook for use by educators and students. Following a brief historical perspective, the organization of a computer system is schematically illustrated, functions of components are explained in non-technical…

  13. User's manual for the BNW-II optimization code for dry/wet-cooled power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braun, D.J.; Bamberger, J.A.; Braun, D.J.

    1978-05-01

    This volume provides a listing of the BNW-II dry/wet ammonia heat rejection optimization code and is an appendix to Volume I, which gives a narrative description of the code's algorithms as well as its logic, input and output information.

  14. Ontology-Based Approach to Social Data Sentiment Analysis: Detection of Adolescent Depression Signals.

    PubMed

    Jung, Hyesil; Park, Hyeoun-Ae; Song, Tae-Min

    2017-07-24

    Social networking services (SNSs) contain abundant information about the feelings, thoughts, interests, and patterns of behavior of adolescents that can be obtained by analyzing SNS postings. An ontology that expresses the shared concepts and their relationships in a specific field could be used as a semantic framework for social media data analytics. The aim of this study was to refine an adolescent depression ontology and terminology as a framework for analyzing social media data and to evaluate description logics between classes and the applicability of this ontology to sentiment analysis. The domain and scope of the ontology were defined using competency questions. The concepts constituting the ontology and terminology were collected from clinical practice guidelines, the literature, and social media postings on adolescent depression. Class concepts, their hierarchy, and the relationships among class concepts were defined. An internal structure of the ontology was designed using the entity-attribute-value (EAV) triplet data model, and superclasses of the ontology were aligned with the upper ontology. Description logics between classes were evaluated by mapping concepts extracted from the answers to frequently asked questions (FAQs) onto the ontology concepts derived from description logic queries. The applicability of the ontology was validated by examining the representability of 1358 sentiment phrases using the ontology EAV model and conducting sentiment analyses of social media data using ontology class concepts. We developed an adolescent depression ontology that comprised 443 classes and 60 relationships among the classes; the terminology comprised 1682 synonyms of the 443 classes. In the description logics test, no error in relationships between classes was found, and about 89% (55/62) of the concepts cited in the answers to FAQs mapped onto the ontology class. 
Regarding applicability, the EAV triplet models of the ontology class represented about 91.4% of the sentiment phrases included in the sentiment dictionary. In the sentiment analyses, "academic stresses" and "suicide" contributed negatively to the sentiment of adolescent depression. The ontology and terminology developed in this study provide a semantic foundation for analyzing social media data on adolescent depression. To be useful in social media data analysis, the ontology, especially the terminology, needs to be updated constantly to reflect rapidly changing terms used by adolescents in social media postings. In addition, more attributes and value sets reflecting depression-related sentiments should be added to the ontology. ©Hyesil Jung, Hyeoun-Ae Park, Tae-Min Song. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 24.07.2017.

  15. Ontology-Based Approach to Social Data Sentiment Analysis: Detection of Adolescent Depression Signals

    PubMed Central

    Jung, Hyesil; Song, Tae-Min

    2017-01-01

    Background Social networking services (SNSs) contain abundant information about the feelings, thoughts, interests, and patterns of behavior of adolescents that can be obtained by analyzing SNS postings. An ontology that expresses the shared concepts and their relationships in a specific field could be used as a semantic framework for social media data analytics. Objective The aim of this study was to refine an adolescent depression ontology and terminology as a framework for analyzing social media data and to evaluate description logics between classes and the applicability of this ontology to sentiment analysis. Methods The domain and scope of the ontology were defined using competency questions. The concepts constituting the ontology and terminology were collected from clinical practice guidelines, the literature, and social media postings on adolescent depression. Class concepts, their hierarchy, and the relationships among class concepts were defined. An internal structure of the ontology was designed using the entity-attribute-value (EAV) triplet data model, and superclasses of the ontology were aligned with the upper ontology. Description logics between classes were evaluated by mapping concepts extracted from the answers to frequently asked questions (FAQs) onto the ontology concepts derived from description logic queries. The applicability of the ontology was validated by examining the representability of 1358 sentiment phrases using the ontology EAV model and conducting sentiment analyses of social media data using ontology class concepts. Results We developed an adolescent depression ontology that comprised 443 classes and 60 relationships among the classes; the terminology comprised 1682 synonyms of the 443 classes. In the description logics test, no error in relationships between classes was found, and about 89% (55/62) of the concepts cited in the answers to FAQs mapped onto the ontology class. 
Regarding applicability, the EAV triplet models of the ontology class represented about 91.4% of the sentiment phrases included in the sentiment dictionary. In the sentiment analyses, “academic stresses” and “suicide” contributed negatively to the sentiment of adolescent depression. Conclusions The ontology and terminology developed in this study provide a semantic foundation for analyzing social media data on adolescent depression. To be useful in social media data analysis, the ontology, especially the terminology, needs to be updated constantly to reflect rapidly changing terms used by adolescents in social media postings. In addition, more attributes and value sets reflecting depression-related sentiments should be added to the ontology. PMID:28739560

  16. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

In this paper a new approach to the formal verification of control process specifications expressed by means of UML state machines (version 2.x) is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.
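As a rough illustration of the rule-based idea (the paper itself targets nuXmv and VHDL; this plain-Python analogue and all state names in it are invented), a specification can be encoded as guarded transition rules and a user-defined requirement checked by reachability:

```python
# Hedged sketch: a rule-based logical model of a state machine, with
# rules mapping each state to its possible successor states, and a
# simple reachability check of a safety requirement. A real model
# checker such as nuXmv would verify temporal-logic properties; this
# only demonstrates the modeling style.

def reachable(initial, rules):
    """Return the set of states reachable from `initial` under `rules`."""
    seen, stack = set(), [initial]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        stack.extend(rules.get(s, []))
    return seen

# Hypothetical controller: the requirement is that "fault" is unreachable.
rules = {"idle": ["run"], "run": ["done", "idle"], "done": []}
states = reachable("idle", rules)
print("fault" not in states)   # the safety requirement holds
```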

  17. A constraint logic programming approach to associate 1D and 3D structural components for large protein complexes.

    PubMed

    Dal Palù, Alessandro; Pontelli, Enrico; He, Jing; Lu, Yonggang

    2007-01-01

    The paper describes a novel framework, constructed using Constraint Logic Programming (CLP) and parallelism, to determine the association between parts of the primary sequence of a protein and alpha-helices extracted from 3D low-resolution descriptions of large protein complexes. The association is determined by extracting constraints from the 3D information, regarding length, relative position and connectivity of helices, and solving these constraints with the guidance of a secondary structure prediction algorithm. Parallelism is employed to enhance performance on large proteins. The framework provides a fast, inexpensive alternative to determine the exact tertiary structure of unknown proteins.

  18. Programmable Logic Application Notes

    NASA Technical Reports Server (NTRS)

    Katz, Richard

    1999-01-01

This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter the focus is on some experimental data on low-dropout regulators to support mixed 5 V and 3.3 V systems. A discussion of the Small Explorer WIRE spacecraft will also be given. Lastly, we take a first look at robust state machines in hardware description languages (VHDL) and their use in critical systems. If you have information that you would like to submit, or an area you would like discussed or researched, please give me a call or e-mail.

  19. Using Pipelined XNOR Logic to Reduce SEU Risks in State Machines

    NASA Technical Reports Server (NTRS)

    Le, Martin; Zheng, Xin; Katanyoutant, Sunant

    2008-01-01

Single-event upsets (SEUs) pose great threats to avionic systems' state-machine control logic, which is frequently used to control sequences of events and to qualify protocols. The risks of SEUs manifest in two ways: (a) the state machine's state information is changed, causing the state machine to transition unexpectedly to another state; (b) due to the asynchronous nature of an SEU, the state machine's state registers become metastable, temporarily causing any combinational logic associated with the metastable registers to malfunction. Effect (a) can be mitigated with methods such as triple modular redundancy (TMR). However, effect (b) cannot be eliminated and can degrade the effectiveness of any mitigation of effect (a). Although there is no way to completely eliminate the risk of SEU-induced errors, the risk can be made very small by combining very fast state-machine logic with error-detection logic. Therefore, one of the two main elements of the present method is to design the fastest state-machine logic circuitry by basing it on the fastest generic state-machine design, that of a one-hot state machine. The other main design element is to design fast error-detection logic circuitry optimized for implementation in a field-programmable gate array (FPGA) architecture: in the resulting design, the one-hot state machine is fitted with a multiple-input XNOR gate for detection of illegal states. The XNOR gate is implemented with lookup tables and pipelined for high speed. In this method, the task of designing all the logic must be performed manually because no currently available logic-synthesis software tool can produce optimal solutions for design problems of this type.
However, some assistance is provided by a script, written for this purpose in the Python language (an object-oriented interpretive computer language), to automatically generate hardware description language (HDL) code from state-transition rules.
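The one-hot scheme and illegal-state check described above can be sketched in software (a behavioral model only; the article's actual implementation is HDL on an FPGA, and the state names and transition rules below are hypothetical):

```python
# Behavioral sketch of a one-hot state machine with illegal-state
# detection. In hardware the legality check is a wide XNOR/parity-style
# gate over the state register; here it is a simple bit count.

def onehot_is_legal(state_bits):
    """A one-hot state vector is legal iff exactly one bit is set."""
    return sum(state_bits) == 1

def step(state_bits, transitions):
    """Advance the machine; `transitions` maps the index of the
    currently hot bit to the index of the next hot bit."""
    if not onehot_is_legal(state_bits):
        raise ValueError("illegal state detected (SEU suspected)")
    nxt = transitions[state_bits.index(1)]
    return [1 if i == nxt else 0 for i in range(len(state_bits))]

# Hypothetical 3-state ring: S0 -> S1 -> S2 -> S0
ring = {0: 1, 1: 2, 2: 0}
s = step([1, 0, 0], ring)        # -> [0, 1, 0]
# An SEU flipping a second bit is caught before the next transition:
assert not onehot_is_legal([0, 1, 1])
```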

  20. Upper-Bound Estimates Of SEU in CMOS

    NASA Technical Reports Server (NTRS)

    Edmonds, Larry D.

    1990-01-01

    Theory of single-event upsets (SEU) (changes in logic state caused by energetic charged subatomic particles) in complementary metal oxide/semiconductor (CMOS) logic devices extended to provide upper-bound estimates of rates of SEU when limited experimental information available and configuration and dimensions of SEU-sensitive regions of devices unknown. Based partly on chord-length-distribution method.

  1. Comparison of depth-dose distributions of proton therapeutic beams calculated by means of logical detectors and ionization chamber modeled in Monte Carlo codes

    NASA Astrophysics Data System (ADS)

    Pietrzak, Robert; Konefał, Adam; Sokół, Maria; Orlef, Andrzej

    2016-08-01

The success of proton therapy depends strongly on the precision of treatment planning. Dose distribution in biological tissue may be obtained from Monte Carlo simulations using various scientific codes, making it possible to perform very accurate calculations. However, there are many factors affecting the accuracy of modeling. One of them is the structure of the objects, called bins, registering a dose. In this work the influence of bin structure on the dose distributions was examined. The MCNPX code calculations of the Bragg curve for the 60 MeV proton beam were done in two ways: using simple logical detectors, i.e., volumes defined in water, and using a precise model of the ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in a water phantom with a Marcus ionization chamber. The average local dose difference between the measured relative doses in the water phantom and those calculated by means of the logical detectors was 1.4% over the first 25 mm, whereas over the full depth range this difference was 1.6%, with a maximum uncertainty in the calculations of less than 2.4% and a maximum measuring error of 1%. In the case of the relative doses calculated with the ionization chamber model, this average difference was somewhat greater: 2.3% at depths up to 25 mm and 2.4% over the full range of depths, with a maximum uncertainty in the calculations of 3%. In the dose calculations the ionization chamber model does not offer any additional advantages over the logical detectors. The results provided by both models are similar and in good agreement with the measurements; however, the logical detector approach is the more time-effective method.
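The comparison metric used above, an average local (pointwise) relative difference between measured and simulated depth-dose values, can be sketched as follows; the dose samples are invented for illustration and are not the paper's data:

```python
# Minimal sketch of an average local dose difference: the mean of the
# pointwise relative differences |D_meas - D_sim| / D_meas, in percent,
# over depth positions where both curves are sampled.

def avg_local_difference_percent(measured, simulated):
    diffs = [abs(m - s) / m * 100.0 for m, s in zip(measured, simulated)]
    return sum(diffs) / len(diffs)

measured  = [0.30, 0.35, 0.45, 0.70, 1.00]   # relative dose vs depth (invented)
simulated = [0.31, 0.34, 0.46, 0.69, 0.99]
print(round(avg_local_difference_percent(measured, simulated), 2))
```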

  2. Examining the development of scientific reasoning in ninth-grade physical science students

    NASA Astrophysics Data System (ADS)

    Westbrook, Susan L.; Rogers, Laura N.

This study was designed to test the hypothesis that descriptive learning cycles are not sufficient either to stimulate students to reason at a formal operational level or to encourage facility with the processes of scientific investigation. A 6-week, three-investigation unit on simple machines drawn from a ninth-grade physical science curriculum was selected for the study. Students in the course were assigned to one of three instructional groups: the descriptive group (DE), the question design group (QD), and the hypothesis testing group (HT). Each group completed identical exploration and invention activities but participated in qualitatively distinct activities during the expansion phase. The DE students completed the activities outlined in the curriculum (a descriptive learning cycle). The QD group designed and conducted experiments to answer a question posed by the teacher. The HT group generated hypotheses concerning a problem, then designed and conducted experiments to test those hypotheses (a hypothetico-deductive expansion). The effects of the treatments were assessed in a pretest-posttest format using Lawson's Seven Logic Tasks, the Test of Integrated Process Skills, and Lawson's Revised Classroom Test of Scientific Reasoning. Analyses of the data indicated that the HT group exhibited a significant increase on the Test of Integrated Process Skills and on Task 1 of the Seven Logic Tasks during the 6-week period.

  3. An Automated Design Framework for Multicellular Recombinase Logic.

    PubMed

    Guiziou, Sarah; Ulliana, Federico; Moreau, Violaine; Leclere, Michel; Bonnet, Jerome

    2018-05-18

    Tools to systematically reprogram cellular behavior are crucial to address pressing challenges in manufacturing, environment, or healthcare. Recombinases can very efficiently encode Boolean and history-dependent logic in many species, yet current designs are performed on a case-by-case basis, limiting their scalability and requiring time-consuming optimization. Here we present an automated workflow for designing recombinase logic devices executing Boolean functions. Our theoretical framework uses a reduced library of computational devices distributed into different cellular subpopulations, which are then composed in various manners to implement all desired logic functions at the multicellular level. Our design platform called CALIN (Composable Asynchronous Logic using Integrase Networks) is broadly accessible via a web server, taking truth tables as inputs and providing corresponding DNA designs and sequences as outputs (available at http://synbio.cbs.cnrs.fr/calin ). We anticipate that this automated design workflow will streamline the implementation of Boolean functions in many organisms and for various applications.
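The workflow's interface, truth tables in and DNA designs out, can be illustrated with a minimal sketch; this is not CALIN's actual API, just a hypothetical reduction of a Boolean truth table to the minterms a downstream design step would implement:

```python
# Illustrative only: represent a Boolean function as its truth-table
# output column (ordered by input index 0..2^n - 1) and extract the
# minterms, i.e. the input combinations for which the output is 1.

def minterms(truth_table):
    """Return the input bit-strings for which the function outputs 1."""
    n = (len(truth_table) - 1).bit_length()   # number of inputs
    return [format(i, f"0{n}b") for i, out in enumerate(truth_table) if out]

# XOR of two inputs: outputs for inputs 00, 01, 10, 11
print(minterms([0, 1, 1, 0]))
```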

  4. FPGA-based gating and logic for multichannel single photon counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pooser, Raphael C; Earl, Dennis Duncan; Evans, Philip G

    2012-01-01

We present results characterizing multichannel InGaAs single photon detectors utilizing gated passive quenching circuits (GPQC), self-differencing techniques, and field programmable gate array (FPGA)-based logic for both diode gating and coincidence counting. Utilizing FPGAs for the diode-gating frontend and the logic-counting backend has the advantage of low cost compared to custom-built logic circuits and current off-the-shelf detector technology. Further, FPGA logic counters have been shown to work well in quantum key distribution (QKD) test beds. Our setup combines multiple independent detector channels in a reconfigurable manner via an FPGA backend and post-processing in order to perform coincidence measurements between any two or more detector channels simultaneously. Using this method, states from a multi-photon polarization-entangled source are detected and characterized via coincidence counting on the FPGA. Photon detection events are also processed by the quantum information toolkit for application testing (QITKAT).
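The coincidence-counting logic described above can be modeled in a few lines (a behavioral sketch only; the actual implementation is FPGA fabric, and the channel timestamps below are invented):

```python
# Sketch: detection events per channel as lists of gate-clock ticks; a
# coincidence between selected channels is any tick at which all of
# them fired. The FPGA evaluates this in parallel each gate period.

def coincidences(channels):
    """Return the sorted ticks at which every selected channel fired."""
    return sorted(set.intersection(*(set(ch) for ch in channels)))

ch_a = [1, 4, 7, 9, 12]   # invented detection ticks, channel A
ch_b = [2, 4, 9, 11, 12]  # invented detection ticks, channel B
print(coincidences([ch_a, ch_b]))   # ticks where both detectors fired
```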

  5. E.S.T. and the Oracle.

    ERIC Educational Resources Information Center

    Richardson, Ian M.

    1990-01-01

    A possible syllabus for English for Science and Technology is suggested based upon a set of causal relations, arising from a logical description of the presuppositional rhetoric of scientific passages that underlie most semantic functions. An empirical study is reported of the semantic functions present in 52 randomly selected passages.…

  6. Solar water heater design package

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Package describes commercial domestic-hot-water heater with roof or rack mounted solar collectors. System is adjustable to pre-existing gas or electric hot-water house units. Design package includes drawings, description of automatic control logic, evaluation measurements, possible design variations, list of materials and installation tools, and trouble-shooting guide and manual.

  7. Reading: Cognitive Input and Output.

    ERIC Educational Resources Information Center

    Kopp, Harriet Green

    Descriptions of language learning and reading behaviors are presented, in this paper, within the context of a model of cognitive processing that reflects a continuum for the logical procession of language skills in human maturation and learning. Portions of the paper differentiate silent and oral reading in terms of cognitive load, which is a…

  8. Approximate spatial reasoning

    NASA Technical Reports Server (NTRS)

    Dutta, Soumitra

    1988-01-01

    A model for approximate spatial reasoning using fuzzy logic to represent the uncertainty in the environment is presented. Algorithms are developed which can be used to reason about spatial information expressed in the form of approximate linguistic descriptions similar to the kind of spatial information processed by humans. Particular attention is given to static spatial reasoning.
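As a minimal illustration of the idea (not the paper's algorithms), a linguistic spatial description such as "near" can be represented as a fuzzy membership function over distance; the breakpoints below are invented:

```python
# Fuzzy representation of the linguistic term "near": distances map to
# a degree of membership in [0, 1] instead of a crisp yes/no.

def near(distance_m, full_at=1.0, zero_at=10.0):
    """Shoulder membership: 1 up to `full_at`, falling linearly
    to 0 at `zero_at` (all breakpoints are illustrative)."""
    if distance_m <= full_at:
        return 1.0
    if distance_m >= zero_at:
        return 0.0
    return (zero_at - distance_m) / (zero_at - full_at)

print(near(0.5))   # clearly near
print(near(5.5))   # partially near
print(near(12.0))  # not near
```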

  9. National Assessment of Vocational Education: Interim Report to Congress.

    ERIC Educational Resources Information Center

    Silverberg, Marsha; Warner, Elizabeth; Goodwin, David; Fong, Michael

    Analyses completed prior to November 2001 provided a context for examining vocational education (VE) and a description of participation at secondary and postsecondary levels, a logical first step in evaluating VE's status and effectiveness. They were a small, but significant part of a comprehensive research agenda being conducted under the…

  10. Understanding How Babies Build Language Skills

    ERIC Educational Resources Information Center

    Honig, Alice Sterling

    2006-01-01

    Language is a great communication system. Through language, humans can express logical reasoning, grief, happiness, wishes, descriptions, and a rich array of feelings and ideas. Every baby deserves the gift of language power! In this article, the author discusses how babies build language skills and presents activities to help babies build…

  11. Baby Events: Assembling Descriptions of Infants in Family Day Care

    ERIC Educational Resources Information Center

    Bradley, Ben; Sumsion, Jennifer; Stratigos, Tina; Elwick, Sheena

    2012-01-01

    The idea that research on infants should "voice" their "perspectives", their experiences, what they are "really saying," is a central feature of current moves toward participatory research. While embracing the ethos of participation, this article steps away from the binary logic of identity that implicitly underpins…

  12. 77 FR 65878 - Application for Final Commitment for a Long-term Loan or Financial Guarantee in Excess of $100...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-31

    ... export of semiconductor manufacturing equipment to Germany. Brief non-proprietary description of the anticipated use of the items being exported: Equipment supports the manufacture of logic semiconductors. To... United States industry. Parties: Principal Suppliers: Applied Materials, Inc., KLA-Tencor Corporation...

  13. Distributed autonomous systems: resource management, planning, and control algorithms

    NASA Astrophysics Data System (ADS)

    Smith, James F., III; Nguyen, ThanhVu H.

    2005-05-01

Distributed autonomous systems, i.e., systems with separated, distributed components, each of which exhibits some degree of autonomy, are increasingly providing solutions to naval and other DoD problems. Recently developed control, planning, and resource allocation algorithms for two types of distributed autonomous systems will be discussed. The first distributed autonomous system (DAS) to be discussed consists of a collection of unmanned aerial vehicles (UAVs) under fuzzy logic control. The UAVs fly and conduct meteorological sampling in a coordinated fashion determined by their fuzzy logic controllers to determine the atmospheric index of refraction. Once in flight, no human intervention is required. A fuzzy planning algorithm determines the optimal trajectory, sampling rate, and pattern for the UAVs and an interferometer platform while taking into account risk, reliability, priority for sampling in certain regions, fuel limitations, mission cost, and related uncertainties. The real-time fuzzy control algorithm running on each UAV will give the UAV limited autonomy, allowing it to change course immediately without consulting any commander, request other UAVs to help it, alter its sampling pattern and rate when observing interesting phenomena, or terminate the mission and return to base. The algorithms developed will be compared to a resource manager (RM) developed for another DAS problem related to electronic attack (EA). This RM is based on fuzzy logic and optimized by evolutionary algorithms. It allows a group of dissimilar platforms to use EA resources distributed throughout the group. For both DAS types, significant theoretical and simulation results will be presented.

  14. Assessment of Evidence-based Management Training Program: Application of a Logic Model.

    PubMed

    Guo, Ruiling; Farnsworth, Tracy J; Hermanson, Patrick M

    2016-06-01

    The purposes of this study were to apply a logic model to plan and implement an evidence-based management (EBMgt) educational training program for healthcare administrators and to examine whether a logic model is a useful tool for evaluating the outcomes of the educational program. The logic model was used as a conceptual framework to guide the investigators in developing an EBMgt educational training program and evaluating the outcomes of the program. The major components of the logic model were constructed as inputs, outputs, and outcomes/impacts. The investigators delineated the logic model based on the results of the needs assessment survey. Two 3-hour training workshops were delivered to 30 participants. To assess the outcomes of the EBMgt educational program, pre- and post-tests and self-reflection surveys were conducted. The data were collected and analyzed descriptively and inferentially, using the IBM Statistical Package for the Social Sciences (SPSS) 22.0. A paired sample t-test was performed to compare the differences in participants' EBMgt knowledge and skills prior to and after the training. The assessment results showed that there was a statistically significant difference in participants' EBMgt knowledge and information searching skills before and after the training (p< 0.001). Participants' confidence in using the EBMgt approach for decision-making was significantly increased after the training workshops (p< 0.001). Eighty-three percent of participants indicated that the knowledge and skills they gained through the training program could be used for future management decision-making in their healthcare organizations. The overall evaluation results of the program were positive. It is suggested that the logic model is a useful tool for program planning, implementation, and evaluation, and it also improves the outcomes of the educational program.
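The paired-sample t-test applied above can be reproduced with a short sketch; the pre/post scores below are hypothetical, not the study's data:

```python
# Paired-sample t statistic: t = mean(d) / (sd(d) / sqrt(n)), where d
# is the vector of per-participant differences (post - pre). A full
# analysis package (e.g. SPSS, as in the study) would also report the
# p-value from the t distribution with n - 1 degrees of freedom.

from statistics import mean, stdev
from math import sqrt

def paired_t(pre, post):
    d = [b - a for a, b in zip(pre, post)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

pre  = [55, 60, 48, 62, 58, 50]   # hypothetical pre-training scores
post = [70, 72, 65, 75, 71, 66]   # hypothetical post-training scores
print(round(paired_t(pre, post), 2))
```

A large positive t here reflects a consistent improvement across all pairs, which is the pattern the study reports.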

  15. MELD: A Logical Approach to Distributed and Parallel Programming

    DTIC Science & Technology

    2012-03-01


  16. Computer Aided Wirewrap Interconnect.

    DTIC Science & Technology

    1980-11-01


  17. Quantum Communication without Alignment using Multiple-Qubit Single-Photon States

    NASA Astrophysics Data System (ADS)

    Aolita, L.; Walborn, S. P.

    2007-03-01

    We propose a scheme for encoding logical qubits in a subspace protected against collective rotations around the propagation axis using the polarization and transverse spatial degrees of freedom of single photons. This encoding allows for quantum key distribution without the need of a shared reference frame. We present methods to generate entangled states of two logical qubits using present day down-conversion sources and linear optics, and show that the application of these entangled logical states to quantum information schemes allows for alignment-free tests of Bell’s inequalities, quantum dense coding, and quantum teleportation.

  18. The Quantification of Consistent Subjective Logic Tree Branch Weights for PSHA

    NASA Astrophysics Data System (ADS)

    Runge, A. K.; Scherbaum, F.

    2012-04-01

The development of quantitative models for the rate of exceedance of seismically generated ground-motion parameters is the target of probabilistic seismic hazard analysis (PSHA). In regions of low to moderate seismicity, the selection and evaluation of source and/or ground-motion models is often a major challenge for hazard analysts and is affected by large epistemic uncertainties. In PSHA this type of uncertainty is commonly treated within a logic tree framework in which the branch weights express the degree-of-belief values of an expert in the corresponding set of models. For the calculation of the distribution of hazard curves, these branch weights are subsequently used as subjective probabilities. However, the quality of the results depends strongly on the quality of the expert knowledge. A major challenge for experts in this context is to provide weight estimates that are logically consistent (in the sense of Kolmogorov's axioms) and to be aware of, and deal with, the multitude of heuristics and biases that affect human judgment under uncertainty. For example, people tend to give smaller weights to each branch of a logic tree the more branches it has, starting with equal weights for all branches and then adjusting this uniform distribution based on their beliefs about how the branches differ.
This effect is known as pruning bias [1]. A similar unwanted effect, which may even wrongly suggest robustness of the corresponding hazard estimates, appears in cases where all models are first judged according to some numerical quality measure and the resulting weights are subsequently normalized to sum to one [2]. To address these problems, we have developed interactive graphical tools for the determination of logic tree branch weights in the form of logically consistent subjective probabilities, based on the concepts suggested by Curtis and Wood (2004) [3]. Instead of determining the set of weights for all the models in a single step, the computer-driven elicitation process is performed as a sequence of evaluations of relative weights for small subsets of models which are presented to the analyst. From these, the distribution of logic tree weights for the whole model set is determined as the solution of an optimization problem. The model subset presented to the analyst in each step is designed to maximize the expected information. The result of this process is a set of logically consistent weights together with a measure of confidence determined from the amount of conflicting information provided by the expert during the relative weighting process.
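The basic consistency requirement discussed above can be sketched as follows; the quality scores are invented, and, as the text notes, normalization alone can mask rather than resolve the underlying weighting problems:

```python
# Sketch: branch weights used as subjective probabilities must satisfy
# Kolmogorov's axioms over the logic tree branches: non-negative and
# summing to one. Normalizing numerical quality scores enforces this
# mechanically, which is exactly the step the abstract warns can
# wrongly suggest robustness.

def normalize_weights(scores):
    """Turn non-negative model quality scores into branch weights summing to 1."""
    if any(s < 0 for s in scores):
        raise ValueError("scores must be non-negative")
    total = sum(scores)
    return [s / total for s in scores]

def is_consistent(weights, tol=1e-9):
    return all(w >= 0 for w in weights) and abs(sum(weights) - 1.0) <= tol

weights = normalize_weights([3.0, 1.0, 1.0])   # invented quality scores
print(weights)            # branch weights for a three-model logic tree
assert is_consistent(weights)
```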

  19. Objective analysis of observational data from the FGGE observing systems

    NASA Technical Reports Server (NTRS)

    Baker, W.; Edelmann, D.; Iredell, M.; Han, D.; Jakkempudi, S.

    1981-01-01

    An objective analysis procedure for updating the GLAS second and fourth order general atmospheric circulation models using observational data from the first GARP global experiment is described. The objective analysis procedure is based on a successive corrections method and the model is updated in a data assimilation cycle. Preparation of the observational data for analysis and the objective analysis scheme are described. The organization of the program and description of the required data sets are presented. The program logic and detailed descriptions of each subroutine are given.

  20. N channel JFET based digital logic gate structure

    NASA Technical Reports Server (NTRS)

    Krasowski, Michael J. (Inventor)

    2010-01-01

    A circuit topography is presented which is used to create usable digital logic gates using N (negatively doped) channel Junction Field Effect Transistors (JFETs) and load resistors, level shifting resistors, and supply rails whose values are based on the direct current (DC) parametric distributions of those JFETs. This method has direct application to the current state of the art in high temperature, for example 300.degree. C. to 500.degree. C. and higher, silicon carbide (SiC) device production. The ability to produce inverting and combinatorial logic enables the production of pulse and edge triggered latches. This scale of logic synthesis would bring digital logic and state machine capabilities to devices operating in extremely hot environments, such as the surface of Venus, near hydrothermal vents, within nuclear reactors (SiC is inherently radiation hardened), and within internal combustion engines. The basic logic gate can be configured as a driver for oscillator circuits allowing for time bases and simple digitizers for resistive or reactive sensors. The basic structure of this innovation, the inverter, can be reconfigured into various analog circuit topographies through the use of feedback structures.

  1. Mass-storage management for distributed image/video archives

    NASA Astrophysics Data System (ADS)

    Franchi, Santina; Guarda, Roberto; Prampolini, Franco

    1993-04-01

The realization of an image/video database requires a specific design for both database structures and mass storage management. This issue was addressed in the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog image/video coding techniques with the related parameters, and the description of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they catalog devices and modify device status and device network location. The medium level manages image/video files on a physical basis; it handles file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to meet delivery/visualization requirements and to reduce archiving costs.

  2. Engineering evaluations and studies. Report for Ku-band studies, exhibit A

    NASA Technical Reports Server (NTRS)

    Dodds, J. G.; Huth, G. K.; Maronde, R. G.; Roberts, D.

    1981-01-01

    System performance aspects of the Ku band radar communication hardware and investigations into the Ku band/payload interfaces are discussed. The communications track problem caused by the excessive signal dynamic range at the servo input was investigated. The management/handover logic is discussed and a simplified description of the transmitter enable logic function is presented. Output noise produced by a voltage-controlled oscillator chip used in the SPA return-link channel 3 mid-bit detector is discussed. The deployed assembly (DA) and EA-2 critical design review data are evaluated. Cross coupling effects on antenna servo stability were examined. A series of meetings on the acceptance test specification for the deployed assembly is summarized.

  3. Introducing Online Bibliographic Service to its Users: The Online Presentation

    ERIC Educational Resources Information Center

    Crane, Nancy B.; Pilachowski, David M.

    1978-01-01

    A description of techniques for introducing online services to new user groups includes discussion of terms and their definitions, evolution of online searching, advantages and disadvantages of online searching, production of the data bases, search strategies, Boolean logic, costs and charges, "do's and don'ts," and a user search questionnaire. (J…

  4. The Information Bazaar; Sixth Annual National Colloquium on Information Retrieval, May 8-9, 1969, Philadelphia, Pennsylvania.

    ERIC Educational Resources Information Center

    Schultz, Louise, Ed.

    The 31 papers in this proceedings cover social as well as technical issues, mathematical models and formal logic systems, applications descriptions, program design, cost analysis, and predictions. The papers are grouped into sessions including: privacy and information technology, describing documents, information dissemination systems, public and…

  5. The Development of a Model for Designing Carrel Experiences for Science Students.

    ERIC Educational Resources Information Center

    Russell, James Douglas

    A description of the systems approach to designing of carrel experiences for science students is presented to provide a logical sequence and structure for instructional decisions. A brief historical discussion dating from 1961 and Postlethwait's work at Purdue University is given, and a rationale for the carrel approach is provided. The…

  6. Everyday Routines: A Window into the Cultural Organization of Family Child Care

    ERIC Educational Resources Information Center

    Tonyan, Holli A.

    2015-01-01

    Eco(logical)-cultural Theory suggests that a daily routine results from individuals adapting cultural ideas to the constraints of a local context or ecology. Using Ecocultural Theory, this research examined family child care providers' descriptions of daily activities and overall approach to understand cultural models. The results highlighted a…

  7. A Preliminary Report on the PLATO V Terminal.

    ERIC Educational Resources Information Center

    Stifle, J. E.

    This report is a preliminary description of a prototype of a second generation version of the PLATO IV (Programmed Logic for Automated Teaching Operations) student terminal. Development of a new terminal has been pursued with two objectives: to generate a more economic version of the PLATO IV terminal, and to expand capacities and performance of…

  8. A Partial Theory of Executive Succession.

    ERIC Educational Resources Information Center

    Thiemann, Francis C.

    This study has two purposes: (1) To construct a partial theory of succession, and (2) to utilize a method of theory construction which combines some of the concepts of Hans Zetterberg with the principles of formal symbolic logic. A bibliography on succession in complex organizations with entries on descriptive and empirical studies from various…

  9. A Variational Framework for Exemplar-Based Image Inpainting

    DTIC Science & Technology

    2010-04-01

Physical Review 106(4), 620–30 (1957) 37. Jia, J., Tang, C.K.: Inference of segmented color and texture description by tensor voting. IEEE Trans. on PAMI 26...use of other patch error functions based on the comparison of structure tensors, which could provide a more robust estimation of the morphological

  10. Captain's Log...The Speech Communication Oral Journal.

    ERIC Educational Resources Information Center

    Strong, William F.

    1983-01-01

    The logic and the benefits of requiring college students in basic speech communication classes to tape-record oral journals are set forth along with a detailed description of the assignment. Instructions to the students explain the mechanics of the assignment as follows: (1) obtain and properly label a quality cassette tape; (2) make seven…

  11. A Descriptive Study Examining the Impact of Digital Writing Environments on Communication and Mathematical Reasoning for Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Huscroft-D'Angelo, Jacqueline; Higgins, Kristina N.; Crawford, Lindy L.

    2014-01-01

    Proficiency in mathematics, including mathematical reasoning skills, requires students to communicate their mathematical thinking. Mathematical reasoning involves making sense of mathematical concepts in a logical way to form conclusions or judgments, and is often underdeveloped in students with learning disabilities. Technology-based environments…

  12. Decision support system for nursing management control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ernst, C.J.

    A knowledge representation approach for expert systems supporting decision processes in business is proposed. A description of a knowledge representation schema using a logic programming metalanguage is described, then the role of such a schema in a management expert system is demonstrated through the problem of nursing management control in hospitals. 18 references.

  13. Genetic algorithm optimized rainfall-runoff fuzzy inference system for row crop watersheds with claypan soils

    USDA-ARS?s Scientific Manuscript database

    The fuzzy logic algorithm has the ability to describe knowledge in a descriptive human-like manner in the form of simple rules using linguistic variables, and provides a new way of modeling uncertain or naturally fuzzy hydrological processes like non-linear rainfall-runoff relationships. Fuzzy infe...

  14. Towards a Consistent and Scientifically Accurate Drug Ontology.

    PubMed

    Hogan, William R; Hanna, Josh; Joseph, Eric; Brochhausen, Mathias

    2013-01-01

Our use case for comparative effectiveness research requires an ontology of drugs that enables querying National Drug Codes (NDCs) by active ingredient, mechanism of action, physiological effect, and therapeutic class of the drug products they represent. We conducted an ontological analysis of drugs from the realist perspective, and evaluated existing drug terminology, ontology, and database artifacts from (1) the technical perspective, (2) the perspective of pharmacology and medical science, (3) the perspective of description logic semantics (if they were available in Web Ontology Language or OWL), and (4) the perspective of our realism-based analysis of the domain. No existing resource was sufficient. Therefore, we built the Drug Ontology (DrOn) in OWL, which we populated with NDCs and other classes from RxNorm using only content created by the National Library of Medicine. We also built an application that uses DrOn to query for NDCs as outlined above, available at: http://ingarden.uams.edu/ingredients. The application uses an OWL-based description logic reasoner to execute end-user queries. DrOn is available at http://code.google.com/p/dr-on.
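The kind of query described above (all NDCs falling under a mechanism-of-action class, say) ultimately reduces to transitive subsumption over a class hierarchy, which a DL reasoner computes over the full OWL semantics. A minimal sketch of the query shape, over a toy is-a hierarchy whose class names are invented and are not actual DrOn or RxNorm identifiers:

```python
# Transitive-subsumption query over a toy drug is-a hierarchy. A real DL
# reasoner (e.g. the one DrOn relies on) also handles defined classes and
# property restrictions; this sketch covers only asserted is-a links.

PARENTS = {
    "ibuprofen_200mg_tablet": {"nsaid"},
    "nsaid": {"anti_inflammatory", "cox_inhibitor"},
    "cox_inhibitor": {"drug_by_mechanism"},
    "anti_inflammatory": {"drug_by_effect"},
}

def subsumed_by(cls, ancestor, parents=PARENTS):
    """True if `cls` is (transitively) a subclass of `ancestor`."""
    stack, seen = [cls], set()
    while stack:
        c = stack.pop()
        if c == ancestor:
            return True
        if c in seen:
            continue
        seen.add(c)
        stack.extend(parents.get(c, ()))
    return False

def query(ancestor, parents=PARENTS):
    """All named classes subsumed by `ancestor` -- the shape of an
    'NDCs by mechanism of action' query."""
    return sorted(c for c in parents if subsumed_by(c, ancestor, parents))
```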

  15. The distribution of individual cabinet positions in coalition governments: A sequential approach

    PubMed Central

    Meyer, Thomas M.; Müller, Wolfgang C.

    2015-01-01

Multiparty government in parliamentary democracies entails bargaining over the payoffs of government participation, in particular the allocation of cabinet positions. While most of the literature deals with the numerical distribution of cabinet seats among government parties, this article explores the distribution of individual portfolios. It argues that coalition negotiations are sequential choice processes that begin with the allocation of those portfolios most important to the bargaining parties. This induces conditionality in the bargaining process as choices of individual cabinet positions are not independent of each other. Linking this sequential logic with party preferences for individual cabinet positions, the authors of the article study the allocation of individual portfolios for 146 coalition governments in Western and Central Eastern Europe. The results suggest that a sequential logic in the bargaining process results in better predictions than assuming mutual independence in the distribution of individual portfolios. PMID:27546952

  16. Droplet Sizing Research Program.

    DTIC Science & Technology

    1986-03-10

of size and velocity distributions is needed. For example, fuel spray studies, aerosol studies, flue gas desulfurization, spray drying, paint...techniques are presented chronologically since there is a logical development as a function of time. Most of the significant technical accomplishments...signals with an apparently different size by using the following logic: droplets that produce a certain visibility are associated with a

  17. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services.

    PubMed

    Gessler, Damian D G; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-09-23

    SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at http://sswap.info (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at http://sswap.info/protocol.jsp, developer tools at http://sswap.info/developer.jsp, and a portal to third-party ontologies at http://sswapmeet.sswap.info (a "swap meet"). 
SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs.

  18. Coastal vulnerability assessment using Fuzzy Logic and Bayesian Belief Network approaches

    NASA Astrophysics Data System (ADS)

    Valentini, Emiliana; Nguyen Xuan, Alessandra; Filipponi, Federico; Taramelli, Andrea

    2017-04-01

Natural hazards such as sea surge threaten low-lying coastal plains. In order to deal with such disturbances, a deeper understanding of the benefits deriving from ecosystem services assessment, management and planning can help enhance the resilience of coastal systems. In this frame, assessing current and future vulnerability is a key concern of many Systems of Systems (SOS) (social, ecological, institutional) that face several challenges: the definition of Essential Variables (EVs) able to synthesize the required information, the weight to be attributed to each considered variable, and the selection of a method for combining the relevant variables. It is widely recognized that ecosystems contribute to human wellbeing, so their conservation increases resilience capacities and could play a key role in reducing climate-related risk and thus physical and economic losses. A way to fully exploit the potential of ecosystems, i.e. their so-called ecopotential (see the H2020 EU-funded project "ECOPOTENTIAL"), is Ecosystem-based Adaptation (EbA): the use of ecosystem services as part of an adaptation strategy. In order to provide insight into the ecosystem services regulating surge and the variables that influence them, and to make the best use of available data and information (EO products, in situ data and modelling), we propose a multi-component surge vulnerability assessment focusing on coastal sandy dunes as natural barriers. The aim is to combine eco-geomorphological and socio-economic variables with the hazard component on the basis of two approaches: 1) Fuzzy Logic; 2) Bayesian Belief Networks (BBN). The Fuzzy Logic approach is very useful for obtaining spatialized information, and it can easily combine variables coming from different sources.
It provides information on vulnerability moving along-shore and across-shore (beach-dune transect), highlighting the variability of vulnerability conditions in the spatial dimension. According to the results obtained using fuzzy operators, the greatest weakness of the analysis is its limited capacity to represent the relations among the different considered variables. The BBN approach, based on the definition of conditional probabilities, has allowed determining the trend of vulnerability distributions along-shore, highlighting which parts of the coast are most likely to have higher or lower vulnerability than others. In the BBN analysis, the greatest weakness emerges when conditional probabilities are defined arbitrarily (i.e. when there is a lack of information on past hazardous events), because it is not possible to derive the individual contribution of each variable. In conclusion, the two approaches could be used together to enhance the multiple components of vulnerability assessment: the BBN as a preliminary assessment providing a coarse description of the vulnerability distribution, and the Fuzzy Logic as an extended assessment providing more space-based information.
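The Fuzzy Logic side of such an assessment can be sketched with two toy Mamdani-style rules combining a dune variable with the hazard component. The membership breakpoints, rules, and defuzzification below are invented for illustration, not calibrated values from the study.

```python
# Toy fuzzy combination of two vulnerability drivers. All breakpoints are
# illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def vulnerability(dune_height_m, surge_height_m):
    """Mamdani-style min rules, then a crude weighted-average defuzzification."""
    low_dune  = tri(dune_height_m, -1.0, 0.0, 3.0)
    high_dune = tri(dune_height_m, 1.0, 4.0, 9.0)
    big_surge = tri(surge_height_m, 0.5, 2.5, 4.5)
    high_v = min(low_dune, big_surge)   # rule 1: low dune AND big surge -> high
    low_v = high_dune                   # rule 2: high dune -> low
    total = high_v + low_v
    # map the "high" rule to 1.0 and the "low" rule to 0.0
    return 0.5 if total == 0 else (1.0 * high_v + 0.0 * low_v) / total
```

Evaluating this per transect yields the along-shore vulnerability variability the record describes.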

  19. Using Ontologies to Formalize Services Specifications in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann

    2004-01-01

One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that captures the semantics of a domain and allows meaningful information to be derived by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.

  20. A reconfigurable NAND/NOR genetic logic gate

    PubMed Central

    2012-01-01

    Background Engineering genetic Boolean logic circuits is a major research theme of synthetic biology. By altering or introducing connections between genetic components, novel regulatory networks are built in order to mimic the behaviour of electronic devices such as logic gates. While electronics is a highly standardized science, genetic logic is still in its infancy, with few agreed standards. In this paper we focus on the interpretation of logical values in terms of molecular concentrations. Results We describe the results of computational investigations of a novel circuit that is able to trigger specific differential responses depending on the input standard used. The circuit can therefore be dynamically reconfigured (without modification) to serve as both a NAND/NOR logic gate. This multi-functional behaviour is achieved by a) varying the meanings of inputs, and b) using branch predictions (as in computer science) to display a constrained output. A thorough computational study is performed, which provides valuable insights for the future laboratory validation. The simulations focus on both single-cell and population behaviours. The latter give particular insights into the spatial behaviour of our engineered cells on a surface with a non-homogeneous distribution of inputs. Conclusions We present a dynamically-reconfigurable NAND/NOR genetic logic circuit that can be switched between modes of operation via a simple shift in input signal concentration. The circuit addresses important issues in genetic logic that will have significance for more complex synthetic biology applications. PMID:22989145
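The reconfiguration idea, a fixed circuit whose computed function depends on which concentration the input standard assigns to logic 1, can be sketched behaviorally. The threshold and concentrations below are invented assumptions, not the paper's model parameters.

```python
# Behavioral sketch of an input-standard-reconfigurable gate: the circuit
# is fixed (output is repressed once total input concentration crosses a
# threshold); only the meaning of "logic 1" changes.

THRESHOLD = 1.5   # total concentration above which the output is repressed

def gate(conc_a, conc_b):
    """Fixed circuit: high output unless the combined inputs repress it."""
    return 1 if conc_a + conc_b < THRESHOLD else 0

def with_standard(bit_a, bit_b, logic_one_conc):
    """Map Boolean inputs to concentrations under a given input standard."""
    return gate(bit_a * logic_one_conc, bit_b * logic_one_conc)

# Under a standard where logic 1 means 1.0 units, one high input cannot
# cross the threshold, so only (1,1) gives 0: a NAND. Under a standard
# where logic 1 means 2.0 units, any high input crosses it: a NOR.
```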

  1. A reconfigurable NAND/NOR genetic logic gate.

    PubMed

    Goñi-Moreno, Angel; Amos, Martyn

    2012-09-18

    Engineering genetic Boolean logic circuits is a major research theme of synthetic biology. By altering or introducing connections between genetic components, novel regulatory networks are built in order to mimic the behaviour of electronic devices such as logic gates. While electronics is a highly standardized science, genetic logic is still in its infancy, with few agreed standards. In this paper we focus on the interpretation of logical values in terms of molecular concentrations. We describe the results of computational investigations of a novel circuit that is able to trigger specific differential responses depending on the input standard used. The circuit can therefore be dynamically reconfigured (without modification) to serve as both a NAND/NOR logic gate. This multi-functional behaviour is achieved by a) varying the meanings of inputs, and b) using branch predictions (as in computer science) to display a constrained output. A thorough computational study is performed, which provides valuable insights for the future laboratory validation. The simulations focus on both single-cell and population behaviours. The latter give particular insights into the spatial behaviour of our engineered cells on a surface with a non-homogeneous distribution of inputs. We present a dynamically-reconfigurable NAND/NOR genetic logic circuit that can be switched between modes of operation via a simple shift in input signal concentration. The circuit addresses important issues in genetic logic that will have significance for more complex synthetic biology applications.

  2. The architecture of a virtual grid GIS server

    NASA Astrophysics Data System (ADS)

    Wu, Pengfei; Fang, Yu; Chen, Bin; Wu, Xi; Tian, Xiaoting

    2008-10-01

The grid computing technology provides the service oriented architecture for distributed applications. The virtual Grid GIS server is the distributed and interoperable enterprise application GIS architecture running in the grid environment, which integrates heterogeneous GIS platforms. All sorts of legacy GIS platforms join the grid as members of a GIS virtual organization. Based on a Microkernel we design the ESB and portal GIS service layer, which compose Microkernel GIS. Through web portals, portal GIS services and the mediation of the service bus, following the principle of SoC, we separate business logic from implementation logic. Microkernel GIS greatly reduces the coupling between applications and GIS platforms. The enterprise applications are independent of any particular GIS platform, freeing application developers to concentrate on the business logic. Via configuration and orchestration of a set of fine-grained services, the system creates a GIS Business, which acts as a whole WebGIS request when activated. In this way, the system satisfies a business workflow directly and simply, with little or no new code.

  3. (Re) Making the Procrustean Bed? Standardization and Customization as Competing Logics in Healthcare.

    PubMed

    Mannion, Russell; Exworthy, Mark

    2017-03-28

    Recent years have witnessed a parallel and seemingly contradictory trend towards both the standardization and the customization of healthcare and medical treatment. Here, we explore what is meant by 'standardization' and 'customization' in healthcare settings and explore the implications of these changes for healthcare delivery. We frame the paradox of these divergent and opposing factors in terms of institutional logics - the socially constructed rules, practices and beliefs which perpetuate institutional behaviour. As the tension between standardization and customization is fast becoming a critical fault-line within many health systems, there remains an urgent need for more sustained work exploring how these competing logics are articulated, adapted, resisted and co-exist on the front line of care delivery. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  4. Slave finite element for non-linear analysis of engine structures. Volume 2: Programmer's manual and user's manual

    NASA Technical Reports Server (NTRS)

    Witkop, D. L.; Dale, B. J.; Gellin, S.

    1991-01-01

    The programming aspects of SFENES are described in the User's Manual. The information presented is provided for the installation programmer. It is sufficient to fully describe the general program logic and required peripheral storage. All element generated data is stored externally to reduce required memory allocation. A separate section is devoted to the description of these files thereby permitting the optimization of Input/Output (I/O) time through efficient buffer descriptions. Individual subroutine descriptions are presented along with the complete Fortran source listings. A short description of the major control, computation, and I/O phases is included to aid in obtaining an overall familiarity with the program's components. Finally, a discussion of the suggested overlay structure which allows the program to execute with a reasonable amount of memory allocation is presented.

  5. Description logic-based methods for auditing frame-based medical terminological systems.

    PubMed

    Cornet, Ronald; Abu-Hanna, Ameen

    2005-07-01

    Medical terminological systems (TSs) play an increasingly important role in health care by supporting recording, retrieval and analysis of patient information. As the size and complexity of TSs are growing, the need arises for means to audit them, i.e. verify and maintain (logical) consistency and (semantic) correctness of their contents. This is not only important for the management of TSs but also for providing their users with confidence about the reliability of their contents. Formal methods have the potential to play an important role in the audit of TSs, although there are few empirical studies to assess the benefits of using these methods. In this paper we propose a method based on description logics (DLs) for the audit of TSs. This method is based on the migration of the medical TS from a frame-based representation to a DL-based one. Our method is characterized by a process in which initially stringent assumptions are made about concept definitions. The assumptions allow the detection of concepts and relations that might comprise a source of logical inconsistency. If the assumptions hold then definitions are to be altered to eliminate the inconsistency, otherwise the assumptions are revised. In order to demonstrate the utility of the approach in a real-world case study we audit a TS in the intensive care domain and discuss decisions pertaining to building DL-based representations. This case study demonstrates that certain types of inconsistencies can indeed be detected by applying the method to a medical terminological system. The added value of the method described in this paper is that it provides a means to evaluate the compliance to a number of common modeling principles in a formal manner. The proposed method reveals potential modeling inconsistencies, helping to audit and (if possible) improve the medical TS. In this way, it contributes to providing confidence in the contents of the terminological system.
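The stringent-assumption step of such an audit can be illustrated with a toy check: assume that children of the same primitive concept are mutually disjoint, and flag any concept that ends up subsumed by two of them as a potential logical inconsistency. The tiny terminology below is invented for illustration and is not the intensive-care TS audited in the paper.

```python
# Flag concepts whose ancestor sets contain two classes assumed disjoint.
# A DL reasoner does this over full concept definitions; this sketch walks
# only asserted is-a links.

IS_A = {
    "ventilated_patient": {"procedure", "patient"},  # suspicious double parentage
    "procedure": {"thing"},
    "patient": {"thing"},
    "intubation": {"procedure"},
}

def ancestors(concept, is_a):
    """All concepts reachable upward from `concept` via is-a links."""
    out, stack = set(), list(is_a.get(concept, ()))
    while stack:
        p = stack.pop()
        if p not in out:
            out.add(p)
            stack.extend(is_a.get(p, ()))
    return out

def audit(is_a, disjoint_pairs):
    """Report (concept, x, y) where concept falls under both x and y."""
    flagged = []
    for c in is_a:
        anc = ancestors(c, is_a) | {c}
        for x, y in disjoint_pairs:
            if x in anc and y in anc:
                flagged.append((c, x, y))
    return flagged
```

A flagged concept is then either redefined to remove the inconsistency or taken as evidence that the disjointness assumption was too strong, mirroring the revise-or-repair loop described above.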

  6. Army Stock Positioning: How Can Distribution Performance Be Improved

    DTIC Science & Technology

    2017-01-01

the source-preference logic that selects the warehouse for issuing an item to a cus- tomer. Below, we describe the LMP source-preference logic and...the AMC pilot to use it to tailor how warehouses are selected to issue items to customers. AMC Pilot Implementation of LMP Source-Preference

  7. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy-logic-based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures, and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses it to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. - Research Highlights: A fuzzy logic analysis technique capable of characterizing AFM images of thin films. The technique is applicable to different surfaces regardless of their densities. The fuzzy logic technique does not require manual adjustment of the algorithm parameters. The technique can quantitatively capture differences between surfaces. This technique yields more realistic structure boundaries compared to other methods.
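The point-classification step can be sketched in a simplified, crisp form over a 1-D height profile; the published method uses a fuzzy inference engine over 2-D AFM data, and the slope threshold here is an invented assumption.

```python
# Label each sample of a height profile as top, bottom, uphill or downhill
# from its local slope and height relative to the profile midpoint. A
# crisp stand-in for the fuzzy top/bottom/uphill/downhill classification.

def classify_profile(heights, slope_eps=0.1):
    lo, hi = min(heights), max(heights)
    mid = (lo + hi) / 2.0
    labels = []
    for i, h in enumerate(heights):
        left = heights[max(i - 1, 0)]
        right = heights[min(i + 1, len(heights) - 1)]
        slope = (right - left) / 2.0          # central difference
        if slope > slope_eps:
            labels.append("uphill")
        elif slope < -slope_eps:
            labels.append("downhill")
        else:
            labels.append("top" if h >= mid else "bottom")
    return labels
```

Grouping contiguous "top" runs bounded by uphill/downhill flanks then yields the isolated nanostructures from which size and shape statistics are extracted.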

  8. Competing Logics and Healthcare Comment on "(Re) Making the Procrustean Bed? Standardization and Customization as Competing Logics in Healthcare".

    PubMed

    Saks, Mike

    2017-08-20

    This paper offers a short commentary on the editorial by Mannion and Exworthy. The paper highlights the positive insights offered by their analysis into the tensions between the competing institutional logics of standardization and customization in healthcare, in part manifested in the conflict between managers and professionals, and endorses the plea of the authors for further research in this field. However, the editorial is criticized for its lack of a strong societal reference point, the comparative absence of focus on hybridization, and its failure to highlight structural factors impinging on the opposing logics in a broader neo-institutional framework. With reference to the Procrustean metaphor, it is argued that greater stress should be placed on the healthcare user in future health policy. Finally, the case of complementary and alternative medicine is set out which - while not explicitly mentioned in the editorial - most effectively concretizes the tensions at the heart of this analysis of healthcare. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  9. Short circuit protection for a power distribution system

    NASA Technical Reports Server (NTRS)

    Owen, J. R., III

    1969-01-01

    A sensing circuit detects when the output from a matrix is present and when it should be present. The circuit provides short circuit protection for a power distribution system where the selection of the driven load is accomplished by digital logic.
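
    The sensing principle reduces to a simple boolean condition, sketched here purely as an illustration (the original is hardware, not software):

```python
# A selected output that should be present but is not -- e.g. pulled
# down by a short circuit -- is the fault signature the sensing
# circuit looks for.

def fault(should_be_present: bool, is_present: bool) -> bool:
    """True when a logic-selected output fails to appear (possible short)."""
    return should_be_present and not is_present
```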

  10. A Fault Tree Approach to Analysis of Behavioral Systems: An Overview.

    ERIC Educational Resources Information Center

    Stephens, Kent G.

    Developed at Brigham Young University, Fault Tree Analysis (FTA) is a technique for enhancing the probability of success in any system by analyzing the most likely modes of failure that could occur. It provides a logical, step-by-step description of possible failure events within a system and their interaction--the combinations of potential…
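
    The gate logic underlying fault tree analysis can be illustrated with a minimal probability computation (the event structure and numbers below are invented for illustration, not taken from the document):

```python
from functools import reduce

def or_gate(probs):
    """P(at least one of several independent failure events occurs)."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def and_gate(probs):
    """P(all of several independent failure events occur)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Hypothetical top event: the system fails if both redundant pumps
# fail (AND) or the controller fails (OR).
p_top = or_gate([and_gate([0.01, 0.01]), 0.001])
```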

  11. Analysis of Institutional Competitiveness of Junior High Schools through the Admission Test to High School Education

    ERIC Educational Resources Information Center

    Armendáriz, Joyzukey; Tarango, Javier; Machin-Mastromatteo, Juan Daniel

    2018-01-01

    This descriptive and correlational research studies 15,658 students from 335 secondary schools in the state of Chihuahua, Mexico, through the results of the examination of admission to high school education (National High School Admission Test--EXANI I from the National Assessment Center for Education--CENEVAL) on logical-mathematical and verbal…

  12. Ethics Education for the Family Psychologist: Who Is the Client?

    ERIC Educational Resources Information Center

    Schneider, Lawrence J.

    W. G. Perry (1970) formulated a description of stages of intellectual and ethical development. Perry's schema seems to have applicability in describing trainees as they approach working with families and gauging counselor trainees' level of progress. The first stage is "dualism" in which trainees rely primarily on the use of logic and the weight…

  13. Interpretation of Verb Phrase Telicity: Sensitivity to Verb Type and Determiner Type

    ERIC Educational Resources Information Center

    Ogiela, Diane A.; Schmitt, Cristina; Casby, Michael W.

    2014-01-01

    Purpose: The authors examine how adults use linguistic information from verbs, direct objects, and particles to interpret an event description as encoding a logical endpoint to the event described (in which case, it is telic) or not (in which case, it is atelic). Current models of aspectual composition predict that quantity-sensitive verbs…

  14. Concept for a Satellite-Based Advanced Air Traffic Management System : Volume 8. Operational Logic Flow Diagrams for a Generic Advanced Air Traffic Management system

    DOT National Transportation Integrated Search

    1974-02-01

    The volume presents a description of the services a generic Advanced Air Traffic Management System (AATMS) should provide to the users of the system to facilitate the safe, efficient flow of traffic. It provides a definition of the functions which t...

  15. Effective Leadership Behaviors for Child Care Administrators: Seeking Quality Measurement System Success

    ERIC Educational Resources Information Center

    Robertson, Rachel

    2011-01-01

    Among quality measurement systems, there is no clear description of how administrators are expected to move through the process. This is not necessarily a fault of the systems; it is not their intention to script a program's process. Yes, there are many tasks that are logically the administrator's responsibility--important things that must get…

  16. Hardware synthesis from DDL. [Digital Design Language for computer aided design and test of LSI

    NASA Technical Reports Server (NTRS)

    Shah, A. M.; Shiva, S. G.

    1981-01-01

    The details of the digital systems can be conveniently input into the design automation system by means of Hardware Description Languages (HDL). The Computer Aided Design and Test (CADAT) system at NASA MSFC is used for the LSI design. The Digital Design Language (DDL) has been selected as HDL for the CADAT System. DDL translator output can be used for the hardware implementation of the digital design. This paper addresses problems of selecting the standard cells from the CADAT standard cell library to realize the logic implied by the DDL description of the system.

  17. Centralized and distributed control architectures under Foundation Fieldbus network.

    PubMed

    Persechini, Maria Auxiliadora Muanis; Jota, Fábio Gonçalves

    2013-01-01

    This paper aims at discussing possible automation and control system architectures based on fieldbus networks in which the controllers can be implemented either in a centralized or in a distributed form. An experimental setup is used to demonstrate some of the addressed issues. The control and automation architecture is composed of a supervisory system, a programmable logic controller and various other devices connected to a Foundation Fieldbus H1 network. The procedures used in the network configuration, in the process modelling and in the design and implementation of controllers are described. The specificities of each one of the considered logical organizations are also discussed. Finally, experimental results are analysed using an algorithm for the assessment of control loops to compare the performances between the centralized and the distributed implementations. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  18. The New Quantum Logic

    NASA Astrophysics Data System (ADS)

    Griffiths, Robert B.

    2014-06-01

    It is shown how all the major conceptual difficulties of standard (textbook) quantum mechanics, including the two measurement problems and the (supposed) nonlocality that conflicts with special relativity, are resolved in the consistent or decoherent histories interpretation of quantum mechanics by using a modified form of quantum logic to discuss quantum properties (subspaces of the quantum Hilbert space), and treating quantum time development as a stochastic process. The histories approach in turn gives rise to some conceptual difficulties, in particular the correct choice of a framework (probabilistic sample space) or family of histories, and these are discussed. The central issue is that the principle of unicity, the idea that there is a unique single true description of the world, is incompatible with our current understanding of quantum mechanics.

  19. Convergent method of and apparatus for distributed control of robotic systems using fuzzy logic

    DOEpatents

    Feddema, John T.; Driessen, Brian J.; Kwok, Kwan S.

    2002-01-01

    A decentralized fuzzy logic control system for one vehicle or for multiple robotic vehicles provides a way to control each vehicle to converge on a goal without collisions between vehicles or collisions with other obstacles, in the presence of noisy input measurements and a limited amount of compute-power and memory on board each robotic vehicle. The fuzzy controller demonstrates improved robustness to noise relative to an exact controller.

  20. Application of SEU imaging for analysis of device architecture using a 25 MeV/u 86Kr ion microbeam at HIRFL

    NASA Astrophysics Data System (ADS)

    Liu, Tianqi; Yang, Zhenlei; Guo, Jinlong; Du, Guanghua; Tong, Teng; Wang, Xiaohui; Su, Hong; Liu, Wenjing; Liu, Jiande; Wang, Bin; Ye, Bing; Liu, Jie

    2017-08-01

    The heavy-ion imaging of single event upset (SEU) in a flash-based field programmable gate array (FPGA) device was carried out for the first time at Heavy Ion Research Facility in Lanzhou (HIRFL). The three shift register chains with separated input and output configurations in device under test (DUT) were used to identify the corresponding logical area rapidly once an upset occurred. The logic units in DUT were partly configured in order to distinguish the registers in SEU images. Based on the above settings, the partial architecture of shift register chains in DUT was imaged by employing the microbeam of 86Kr ion with energy of 25 MeV/u in air. The results showed that the physical distribution of registers in DUT had a high consistency with its logical arrangement by comparing SEU image with logic configuration in scanned area.

  1. Fuzzy Energy Management for a Catenary-Battery-Ultracapacitor based Hybrid Tramway

    NASA Astrophysics Data System (ADS)

    Jibin, Yang; Jiye, Zhang; Pengyun, Song

    2017-05-01

    In this paper, an energy management strategy (EMS) based on fuzzy logic control for a catenary-battery-ultracapacitor powered hybrid modern tramway is presented. Fuzzy logic controllers for the catenary zone and the catenary-less zone were designed by analyzing the structure and working modes of the hybrid system; an energy management strategy based on double fuzzy logic control was then proposed to enhance fuel economy. The hybrid modern tramway simulation model was developed in the MATLAB/Simulink environment. The simulation results show that the proposed EMS can satisfy the tramway's dynamic performance demands and distribute power reasonably among the power sources.

  2. PLQP & Company: Decidable Logics for Quantum Algorithms

    NASA Astrophysics Data System (ADS)

    Baltag, Alexandru; Bergfeld, Jort; Kishida, Kohei; Sack, Joshua; Smets, Sonja; Zhong, Shengyang

    2014-10-01

    We introduce a probabilistic modal (dynamic-epistemic) quantum logic PLQP for reasoning about quantum algorithms. We illustrate its expressivity by using it to encode the correctness of the well-known quantum search algorithm, as well as of a quantum protocol known to solve one of the paradigmatic tasks from classical distributed computing (the leader election problem). We also provide a general method (extending an idea employed in the decidability proof in Dunn et al. (J. Symb. Log. 70:353-359, 2005)) for proving the decidability of a range of quantum logics, interpreted on finite-dimensional Hilbert spaces. We give general conditions for the applicability of this method, and in particular we apply it to prove the decidability of PLQP.

  3. Quantum dot-based local field imaging reveals plasmon-based interferometric logic in silver nanowire networks.

    PubMed

    Wei, Hong; Li, Zhipeng; Tian, Xiaorui; Wang, Zhuoxian; Cong, Fengzi; Liu, Ning; Zhang, Shunping; Nordlander, Peter; Halas, Naomi J; Xu, Hongxing

    2011-02-09

    We show that the local electric field distribution of propagating plasmons along silver nanowires can be imaged by coating the nanowires with a layer of quantum dots, held off the surface of the nanowire by a nanoscale dielectric spacer layer. In simple networks of silver nanowires with two optical inputs, control of the optical polarization and phase of the input fields directs the guided waves to a specific nanowire output. The QD-luminescent images of these structures reveal that a complete family of phase-dependent, interferometric logic functions can be performed on these simple networks. These results show the potential for plasmonic waveguides to support compact interferometric logic operations.
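
    The phase-dependent behaviour the authors report can be caricatured with a two-beam interference model (unit amplitudes and the intensity threshold are assumptions of this sketch, not values from the paper):

```python
import cmath

def output_intensity(in1: bool, in2: bool, phase: float) -> float:
    """|E1 + E2*exp(i*phase)|**2 for unit-amplitude optical inputs."""
    field = (1.0 if in1 else 0.0) + (1.0 if in2 else 0.0) * cmath.exp(1j * phase)
    return abs(field) ** 2

def interferometric_gate(in1: bool, in2: bool, phase: float,
                         threshold: float = 1.5) -> bool:
    """Threshold the interfered output intensity into a logic level."""
    return output_intensity(in1, in2, phase) > threshold
```

    At zero relative phase the two inputs add constructively and the gate fires only when both are on; at a relative phase of pi the same two inputs cancel, so tuning the phase selects which logic function the network computes.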

  4. Learning a Markov Logic network for supervised gene regulatory network inference

    PubMed Central

    2013-01-01

    Background Gene regulatory network inference remains a challenging problem in systems biology despite the numerous approaches that have been proposed. When substantial knowledge on a gene regulatory network is already available, supervised network inference is appropriate. Such a method builds a binary classifier able to assign a class (Regulation/No regulation) to an ordered pair of genes. Once learnt, the pairwise classifier can be used to predict new regulations. In this work, we explore the framework of Markov Logic Networks (MLN) that combine features of probabilistic graphical models with the expressivity of first-order logic rules. Results We propose to learn a Markov Logic network, i.e. a set of weighted rules that conclude on the predicate “regulates”, starting from a known gene regulatory network involved in the switch proliferation/differentiation of keratinocyte cells, a set of experimental transcriptomic data and various descriptions of genes all encoded into first-order logic. As training data are unbalanced, we use asymmetric bagging to learn a set of MLNs. The prediction of a new regulation can then be obtained by averaging predictions of individual MLNs. As a side contribution, we propose three in silico tests to assess the performance of any pairwise classifier in various network inference tasks on real datasets. A first test consists of measuring the average performance on a balanced edge prediction problem; a second one deals with the ability of the classifier, once enhanced by asymmetric bagging, to update a given network. Finally our main result concerns a third test that measures the ability of the method to predict regulations with a new set of genes. As expected, MLN, when provided with only numerical discretized gene expression data, does not perform as well as a pairwise SVM in terms of AUPR. 
However, when a more complete description of gene properties is provided by heterogeneous sources, MLN achieves the same performance as a black-box model such as a pairwise SVM while providing relevant insights on the predictions. Conclusions The numerical studies show that MLN achieves very good predictive performance while opening the door to some interpretability of the decisions. Besides the ability to suggest new regulations, such an approach allows experimental data to be cross-validated against existing knowledge. PMID:24028533
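
    Independently of the MLN learner itself, the asymmetric-bagging-plus-averaging scheme the abstract describes can be sketched generically (function and variable names here are illustrative, not from the paper):

```python
import random

def asymmetric_bags(positives, negatives, n_bags, seed=0):
    """Yield balanced training sets: every bag keeps all rare positive
    examples (known regulations) and subsamples the abundant negatives."""
    rng = random.Random(seed)
    for _ in range(n_bags):
        yield positives + rng.sample(negatives, k=len(positives))

def ensemble_score(pair, classifiers):
    """Average the predicted probabilities of the individually trained models."""
    return sum(clf(pair) for clf in classifiers) / len(classifiers)
```

    One classifier is trained per bag, and a candidate regulation is scored by averaging the per-bag predictions, which is what compensates for the class imbalance in the training data.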

  5. Learning a Markov Logic network for supervised gene regulatory network inference.

    PubMed

    Brouard, Céline; Vrain, Christel; Dubois, Julie; Castel, David; Debily, Marie-Anne; d'Alché-Buc, Florence

    2013-09-12

    Gene regulatory network inference remains a challenging problem in systems biology despite the numerous approaches that have been proposed. When substantial knowledge on a gene regulatory network is already available, supervised network inference is appropriate. Such a method builds a binary classifier able to assign a class (Regulation/No regulation) to an ordered pair of genes. Once learnt, the pairwise classifier can be used to predict new regulations. In this work, we explore the framework of Markov Logic Networks (MLN) that combine features of probabilistic graphical models with the expressivity of first-order logic rules. We propose to learn a Markov Logic network, i.e. a set of weighted rules that conclude on the predicate "regulates", starting from a known gene regulatory network involved in the switch proliferation/differentiation of keratinocyte cells, a set of experimental transcriptomic data and various descriptions of genes all encoded into first-order logic. As training data are unbalanced, we use asymmetric bagging to learn a set of MLNs. The prediction of a new regulation can then be obtained by averaging predictions of individual MLNs. As a side contribution, we propose three in silico tests to assess the performance of any pairwise classifier in various network inference tasks on real datasets. A first test consists of measuring the average performance on a balanced edge prediction problem; a second one deals with the ability of the classifier, once enhanced by asymmetric bagging, to update a given network. Finally our main result concerns a third test that measures the ability of the method to predict regulations with a new set of genes. As expected, MLN, when provided with only numerical discretized gene expression data, does not perform as well as a pairwise SVM in terms of AUPR. 
However, when a more complete description of gene properties is provided by heterogeneous sources, MLN achieves the same performance as a black-box model such as a pairwise SVM while providing relevant insights on the predictions. The numerical studies show that MLN achieves very good predictive performance while opening the door to some interpretability of the decisions. Besides the ability to suggest new regulations, such an approach allows experimental data to be cross-validated against existing knowledge.

  6. Contextualizing Distributed Leadership in Higher Education

    ERIC Educational Resources Information Center

    Sewerin, Thomas; Holmberg, Robert

    2017-01-01

    This case study of development in a technical university situates distributed leadership in higher education in an organizational perspective. Analysis of documentation from development programs and interviews with 10 faculty members showed that leadership practices were related to different institutional logics prominent in four key activities in…

  7. Towards an ontological representation of morbidity and mortality in Description Logics.

    PubMed

    Santana, Filipe; Freitas, Fred; Fernandes, Roberta; Medeiros, Zulma; Schober, Daniel

    2012-09-21

    Despite the high coverage of biomedical ontologies, very few sound definitions of death can be found. Nevertheless, this concept has its relevance in epidemiology, such as for data integration within mortality notification systems. We here introduce an ontological representation of the complex biological qualities and processes that inhere in organisms transitioning from life to death. We further characterize them by causal processes and their temporal borders. Several representational difficulties were faced, mainly regarding kinds of processes with blurred or fiat borders that change their type in a continuous rather than discrete mode. Examples of such hard to grasp concepts are life, death and its relationships with injuries and diseases. We illustrate an iterative optimization of definitions within four versions of the ontology, so as to stress the typical problems encountered in representing complex biological processes. We point out possible solutions for representing concepts related to biological life cycles, preserving identity of participating individuals, i.e. for a patient in transition from life to death. This solution however required the use of extended description logics not yet supported by tools. We also focus on the interdependencies and need to change further parts if one part is changed. The axiomatic definition of mortality we introduce allows the description of biologic processes related to the transition from healthy to diseased or injured, and up to a final death state. Exploiting such definitions embedded into descriptions of pathogen transmissions by arthropod vectors, the complete sequence of infection and disease processes can be described, starting from the inoculation of a pathogen by a vector, until the death of an individual, preserving the identity of the patient.
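
    The kind of axiomatization the abstract describes might, purely as an illustration, be written in description-logic notation as follows (hypothetical concept and role names, not the authors' actual ontology):

```latex
% Hypothetical axioms, for illustration only:
\begin{align*}
\mathit{Death} &\sqsubseteq \mathit{Process} \sqcap \exists\,\mathit{hasParticipant}.\mathit{Organism}\\
\mathit{Death} &\sqsubseteq \exists\,\mathit{causedBy}.(\mathit{Disease} \sqcup \mathit{Injury})\\
\mathit{Dead}  &\equiv \mathit{Organism} \sqcap \exists\,\mathit{outputOf}.\mathit{Death}
\end{align*}
```

    Axioms of this shape let a reasoner chain from an inoculation event through disease processes to a death state while the organism's identity is preserved across the transition, which is the modelling difficulty the paper focuses on.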

  8. Program manual for the Shuttle Electric Power System analysis computer program (SEPS), volume 1 of program documentation

    NASA Technical Reports Server (NTRS)

    Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.

    1974-01-01

    The Shuttle Electric Power System (SEPS) computer program is considered in terms of the program manual, programmer guide, and program utilization. The main objective is to provide the information necessary to interpret and use the routines comprising the SEPS program. Subroutine descriptions including the name, purpose, method, variable definitions, and logic flow are presented.

  9. Investigating Validity of Math 105 as Prerequisite to Math 201 among Undergraduate Students, Nigeria

    ERIC Educational Resources Information Center

    Zakariya, Yusuf F.

    2016-01-01

    In this study, the author examined the validity of MATH 105 as a prerequisite to MATH 201. The data for this study was extracted directly from the examination results logic of the university. Descriptive statistics in form of correlations and linear regressions were used to analyze the obtained data. Three research questions were formulated and…

  10. Instruction manual model 600F, data transmission test set

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Information necessary for the operation and maintenance of the Model 600F Data Transmission Test Set is presented. A description is contained of the physical and functional characteristics; pertinent installation data; instructions for operating the equipment; general and detailed principles of operation; preventive and corrective maintenance procedures; and block, logic, and component layout diagrams of the equipment and its major component assemblies.

  11. Game-Based Learning: Increasing the Logical-Mathematical, Naturalistic, and Linguistic Learning Levels of Primary School Students

    ERIC Educational Resources Information Center

    del Moral Pérez, M. Esther; Duque, Alba P. Guzmán; García, L. Carlota Fernández

    2018-01-01

    Game-based learning is an innovative methodology that takes advantage of the educational potential offered by videogames in general and serious games in particular to boost training processes, thus making it easier for users to achieve motivated learning. The present paper focuses on the description of the Game to Learn Project, which has as its…

  12. Army Training Study: Battalion Training Survey. Volumes 1 and 2.

    DTIC Science & Technology

    1978-08-08

    mathematical logic in the methodology. II. MAGNITUDE-ESTIMATION SCALING. A. General Description: A unique methodology, Magnitude-Estimation…

  13. Objectives Stated for the Use of Literature at School: An Empirical Analysis, Part I.

    ERIC Educational Resources Information Center

    Klingberg, Gote; Agren, Bengt

    This report presents a theoretical basis for literary education through goal analyses. The object of the analyses is to obtain clearer formulations of the subgoals of instruction with the help of literature, and to arrange them in logical sequence. Using 79 sources from 12 countries, an empirical study was made, and goal descriptions were…

  14. Spatial language and converseness.

    PubMed

    Burigo, Michele; Coventry, Kenny R; Cangelosi, Angelo; Lynott, Dermot

    2016-12-01

    Typical spatial language sentences consist of describing the location of an object (the located object) in relation to another object (the reference object) as in "The book is above the vase". While it has been suggested that the properties of the located object (the book) are not translated into language because they are irrelevant when exchanging location information, it has been shown that the orientation of the located object affects the production and comprehension of spatial descriptions. In line with the claim that spatial language apprehension involves inferences about relations that hold between objects it has been suggested that during spatial language apprehension people use the orientation of the located object to evaluate whether the logical property of converseness (e.g., if "the book is above the vase" is true, then also "the vase is below the book" must be true) holds across the objects' spatial relation. In three experiments using sentence acceptability rating tasks we tested this hypothesis and demonstrated that when converseness is violated people's acceptability ratings of a scene's description are reduced indicating that people do take into account geometric properties of the located object and use it to infer logical spatial relations.
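
    The converseness property at issue reduces to a one-line check, shown here as a toy formalization (not the authors' materials): accepting "above(book, vase)" commits a hearer to "below(vase, book)" as well.

```python
# Converse pairs for projective spatial relations.
CONVERSES = {"above": "below", "below": "above",
             "left_of": "right_of", "right_of": "left_of"}

def converseness_holds(relation, located, reference, facts):
    """True if the converse relation, with arguments swapped, is also a fact."""
    return (CONVERSES[relation], reference, located) in facts

# A scene description consistent with converseness:
facts = {("above", "book", "vase"), ("below", "vase", "book")}
```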

  15. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    NASA Astrophysics Data System (ADS)

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blue-prints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so called service-oriented architecture (SOA) 2.0 paradigm, which combines intelligence and proactiveness of event-driven with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both, static and real-time data in order to find correlations of disparate information that do not at first appear to be intuitively obvious: Analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and the interactions (choreographies) of all participants involved as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services, communicating via service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: Instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces the tight coupling at data level by a flexible dependency on loosely coupled services. 
The main component of the interoperability model is the comprehensive semantic description of the information, business logic and processes on the basis of a minimal set of well-known, established standards. It implements the representation of knowledge with the application of domain-controlled vocabularies to statements about resources, information, facts, and complex matters (ontologies). Seismic experts for example, would be interested in geological models or borehole measurements at a certain depth, based on which it is possible to correlate and verify seismic profiles. The entire model is built upon standards from the Open Geospatial Consortium (Dictionaries, Service Layer), the International Organisation for Standardisation (Registries, Metadata), and the World Wide Web Consortium (Resource Description Framework, Spatial Data on the Web Best Practices). It has to be emphasised that this approach is scalable to the greatest possible extent: All information, necessary in the context of cross-domain infrastructures is referenced via vocabularies and knowledge bases containing statements that provide either the information itself or resources (service-endpoints), the information can be retrieved from. The entire infrastructure communication is subject to a broker-based business logic integration platform where the information exchanged between involved participants, is managed on the basis of standardised dictionaries, repositories, and registries. This approach also enables the development of Systems-of-Systems (SoS), which allow the collaboration of autonomous, large scale concurrent, and distributed systems, yet cooperatively interacting as a collective in a common environment.

  16. Application Development for Optimizing Patient Placement on Aeromedical Evacuation Flights: Proof-of-Concept

    DTIC Science & Technology

    2018-01-12

    outcomes. This study included three phases: knowledge elicitation, establishment of rule-based logic requirements, and the development of the POC iOS … establish the logic needed for a mobile app prior to programming for iOS platforms. The study team selected Microsoft Excel because it enabled the … distribution of these plans would streamline the plan development process. Thus, as a proof-of-concept, the study team conducted a multi-phased effort

  17. Personalisation - An Emergent Institutional Logic in Healthcare? Comment on "(Re) Making the Procrustean Bed? Standardization and Customization as Competing Logics in Healthcare".

    PubMed

    Ferlie, Ewan

    2017-06-20

    This commentary on the recent think piece by Mannion and Exworthy reviews their core arguments, highlighting their suggestion that recent forces for personalization have emerged which may counterbalance the strong standardization wave which has been evident in many healthcare settings and systems over the last two decades. These forces for personalization can take very different forms. The commentary explores the authors' suggestion that these themes can be fruitfully examined theoretically through an institutional logics (ILs) literature, which has recently been applied by some scholars to healthcare settings. This commentary outlines key premises of that theoretical tradition. Finally, the commentary makes suggestions for taking this IL influenced research agenda further, along with some issues to be addressed. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  18. DataHub knowledge based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Collins, Donald J.; Doyle, Richard J.; Jacobson, Allan S.

    1991-01-01

    Viewgraphs on DataHub knowledge based assistance for science visualization and analysis using large distributed databases. Topics covered include: DataHub functional architecture; data representation; logical access methods; preliminary software architecture; LinkWinds; data knowledge issues; expert systems; and data management.

  19. Harmonising Nursing Terminologies Using a Conceptual Framework.

    PubMed

    Jansen, Kay; Kim, Tae Youn; Coenen, Amy; Saba, Virginia; Hardiker, Nicholas

    2016-01-01

    The International Classification for Nursing Practice (ICNP®) and the Clinical Care Classification (CCC) System are standardised nursing terminologies that identify discrete elements of nursing practice, including nursing diagnoses, interventions, and outcomes. While CCC uses a conceptual framework or model with 21 Care Components to classify these elements, ICNP, built on a formal Web Ontology Language (OWL) description logic foundation, uses a logical hierarchical framework that is useful for computing and maintenance of ICNP. Since the logical framework of ICNP may not always align with the needs of nursing practice, an informal framework may be a more useful organisational tool to represent nursing content. The purpose of this study was to classify ICNP nursing diagnoses using the 21 Care Components of the CCC as a conceptual framework to facilitate usability and inter-operability of nursing diagnoses in electronic health records. Findings resulted in all 521 ICNP diagnoses being assigned to one of the 21 CCC Care Components. Further research is needed to validate the resulting product of this study with practitioners and develop recommendations for improvement of both terminologies.

  20. A will to youth: the woman's anti-aging elixir.

    PubMed

    Smirnova, Michelle Hannah

    2012-10-01

    The logic and cultural myths that buttress the cosmeceutical industry construct the older woman as a victim of old age, part of an "at-risk" population who must monitor, treat and prevent any markers of old age. A content and discourse analysis of 124 advertisements from the US More magazine between 1998 and 2008 revealed three major themes working together to produce this civic duty: (1) the inclusion of scientific and medical authorities in order to define the cosmeceutical as a 'drug' curing a disease, (2) descriptions of the similarities (and differences) between the abilities of cosmeceuticals and cosmetic surgery to restore one's youth, and (3) the logic equating youth with beauty, femininity and power and older age with the absence of these qualities. Together these intersecting logics produce the "will to youth" - the imperative of the aging woman to promote her youthful appearance by any and all available means. Further, by using images and references to fantasies and traditional fairytales, cosmeceutical advertisements both promise and normalize expectations of eternal youth of the aging woman. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  2. Inseparability of science history and discovery

    NASA Astrophysics Data System (ADS)

    Herndon, J. M.

    2010-04-01

    Science is very much a logical progression through time. Progressing along a logical path of discovery is rather like following a path through the wilderness. Occasionally the path splits, presenting a choice; the correct logical interpretation leads to further progress, the wrong choice leads to confusion. By considering deeply the relevant science history, one might begin to recognize past faltering in the logical progression of observations and ideas and, perhaps then, to discover new, more precise understanding. The following specific examples of science faltering are described from a historical perspective: (1) Composition of the Earth's inner core; (2) Giant planet internal energy production; (3) Physical impossibility of Earth-core convection and Earth-mantle convection, and; (4) Thermonuclear ignition of stars. For each example, a revised logical progression is described, leading, respectively, to: (1) Understanding the endo-Earth's composition; (2) The concept of nuclear georeactor origin of geo- and planetary magnetic fields; (3) The invalidation and replacement of plate tectonics; and, (4) Understanding the basis for the observed distribution of luminous stars in galaxies. These revised logical progressions clearly show the inseparability of science history and discovery. A different and more fundamental approach to making scientific discoveries than the frequently discussed variants of the scientific method is this: An individual ponders and through tedious efforts arranges seemingly unrelated observations into a logical sequence in the mind so that causal relationships become evident and new understanding emerges, showing the path for new observations, for new experiments, for new theoretical considerations, and for new discoveries. Science history is rich in "seemingly unrelated observations" just waiting to be logically and causally related to reveal new discoveries.

  3. The shuttle main engine: A first look

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1996-01-01

    Anyone entering the Space Shuttle Main Engine (SSME) team attends a two-week course to become familiar with the design and workings of the engine. This course provides intensive coverage of the individual hardware items and their functions. Some individuals, particularly those involved with software maintenance and development, have felt overwhelmed by this volume of material and their lack of a logical framework in which to place it. To provide this logical framework, it was decided that a brief self-taught introduction to the overall operation of the SSME should be designed. To aid new team members with an interest in the software, this new course should also explain the structure and functioning of the controller and its software. This paper presents a description of that introductory course.

  4. Heliocentric interplanetary low thrust trajectory optimization program, supplement 1, part 2

    NASA Technical Reports Server (NTRS)

    Mann, F. I.; Horsewood, J. L.

    1978-01-01

    The improvements made to the HILTOP electric propulsion trajectory computer program are described. A more realistic propulsion system model was implemented in which the various thrust subsystem efficiencies and the specific impulse are modeled as variable functions of the power available to the propulsion system. The number of operating thrusters is staged, and the beam voltage is selected from a set of five (or fewer) constant voltages, based upon the application of variational calculus. The constant beam voltages may be optimized individually or collectively. The propulsion system logic is activated by a single program input key in such a manner as to preserve the original HILTOP logic. An analysis describing these features, a complete description of program input quantities, and sample cases of computer output illustrating the program capabilities are presented.

  5. Shuttle operations simulation model programmers'/users' manual

    NASA Technical Reports Server (NTRS)

    Porter, D. G.

    1972-01-01

    The prospective user of the shuttle operations simulation (SOS) model is given sufficient information to enable him to perform simulation studies of the space shuttle launch-to-launch operations cycle. The procedures used for modifying the SOS model to meet user requirements are described. The various control card sequences required to execute the SOS model are given. The report is written for users with varying computer simulation experience. A description of the components of the SOS model is included that presents both an explanation of the logic involved in the simulation of the shuttle operations cycle and a description of the routines used to support the actual simulation.

  6. Specification and verification of gate-level VHDL models of synchronous and asynchronous circuits

    NASA Technical Reports Server (NTRS)

    Russinoff, David M.

    1995-01-01

    We present a mathematical definition of hardware description language (HDL) that admits a semantics-preserving translation to a subset of VHDL. Our HDL includes the basic VHDL propagation delay mechanisms and gate-level circuit descriptions. We also develop formal procedures for deriving and verifying concise behavioral specifications of combinational and sequential devices. The HDL and the specification procedures have been formally encoded in the computational logic of Boyer and Moore, which provides a LISP implementation as well as a facility for mechanical proof-checking. As an application, we design, specify, and verify a circuit that achieves asynchronous communication by means of the biphase mark protocol.
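
    The biphase mark code named in this record can be modeled in a few lines of software. The sketch below is a generic illustration of the code itself, not the paper's Boyer-Moore formalization; the function name and the half-cell list representation are our own.

```python
def biphase_mark_encode(bits, start_level=0):
    """Biphase mark code: the line level toggles at every bit-cell
    boundary, and toggles again mid-cell when the bit is a 1, so each
    cell carries one transition for a 0 and two for a 1."""
    level, halves = start_level, []
    for b in bits:
        level ^= 1            # guaranteed transition at the cell boundary
        halves.append(level)  # first half-cell
        if b:
            level ^= 1        # extra mid-cell transition encodes a 1
        halves.append(level)  # second half-cell
    return halves

signal = biphase_mark_encode([1, 0, 1, 1])
```

    Because every cell begins with a transition regardless of the data, a receiver can recover the clock from the waveform itself, which is what makes the code suitable for the asynchronous communication the paper verifies.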

  7. FASTER 3: A generalized-geometry Monte Carlo computer program for the transport of neutrons and gamma rays. Volume 2: Users manual

    NASA Technical Reports Server (NTRS)

    Jordan, T. M.

    1970-01-01

    A description of the FASTER-III program for Monte Carlo calculation of photon and neutron transport in complex geometries is presented. Major revisions include the capability of calculating minimum-weight shield configurations for primary and secondary radiation, and of calculating optimal importance sampling parameters. The program description includes a users manual describing the preparation of input data cards, the printout from a sample problem including the data card images, definitions of Fortran variables, the program logic, and the control cards required to run on the IBM 7094, IBM 360, UNIVAC 1108 and CDC 6600 computers.
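
    As a toy illustration of the Monte Carlo transport idea (far simpler than FASTER-III, and not drawn from the program itself), the following sketch estimates the uncollided transmission of photons through a slab by sampling exponential free paths; the analytic answer exp(-mu*t) provides a check. All names and parameter values are invented for the example.

```python
import math
import random

def slab_transmission(mu, thickness, n_histories, rng):
    """Fraction of sampled photons whose exponential free path exceeds
    the slab thickness, i.e. the uncollided transmission estimate."""
    survive = 0
    for _ in range(n_histories):
        path = -math.log(1.0 - rng.random()) / mu  # exponential free path
        if path > thickness:
            survive += 1
    return survive / n_histories

rng = random.Random(1)
estimate = slab_transmission(mu=1.0, thickness=2.0, n_histories=200_000, rng=rng)
exact = math.exp(-1.0 * 2.0)  # analytic uncollided transmission
```

    Importance sampling, one of the features the revision adds, would replace the free-path density with a biased one and weight each history accordingly.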

  8. Small Interactive Image Processing System (SMIPS) system description

    NASA Technical Reports Server (NTRS)

    Moik, J. G.

    1973-01-01

    The Small Interactive Image Processing System (SMIPS) operates under control of the IBM-OS/MVT operating system and uses an IBM-2250 model 1 display unit as an interactive graphic device. The input language, in the form of character strings or attentions from keys and light pen, is interpreted and causes processing of built-in image processing functions as well as execution of a variable number of application programs kept on a private disk file. A description of design considerations is given, and the characteristics, structure, and logic flow of SMIPS are summarized. Data management and graphic programming techniques used for the interactive manipulation and display of digital pictures are also discussed.

  9. Flight Experiment Demonstration System (FEDS) functional description and interface document

    NASA Technical Reports Server (NTRS)

    Belcher, R. C.; Shank, D. E.

    1984-01-01

    This document presents a functional description of the Flight Experiment Demonstration System (FEDS) and of interfaces between FEDS and external hardware and software. FEDS is a modification of the Automated Orbit Determination System (AODS). FEDS has been developed to support a ground demonstration of microprocessor-based onboard orbit determination. This document provides an overview of the structure and logic of FEDS and details the various operational procedures to build and execute FEDS. It also documents a microprocessor interface between FEDS and a TDRSS user transponder and describes a software simulator of the interface used in the development and system testing of FEDS.

  10. The Integration of DCS I/O to an Existing PLC

    NASA Technical Reports Server (NTRS)

    Sadhukhan, Debashis; Mihevic, John

    2013-01-01

    At the NASA Glenn Research Center (GRC), existing Programmable Logic Controller (PLC) I/O was replaced with Distributed Control System (DCS) I/O, while keeping the existing PLC sequence logic. The reasons for integrating the PLC logic with the DCS I/O, along with an evaluation of the resulting system, are the subject of this paper. The pros and cons of the old system and the new upgrade are described, including operator workstation screen update times. Details of the physical layout and of the communication between the PLC, the DCS I/O, and the operator workstations are illustrated. The complex characteristics of a central process control system and the plan to remove the PLC processors in future upgrades are also discussed.

  11. Abstractions for Fault-Tolerant Distributed System Verification

    NASA Technical Reports Server (NTRS)

    Pike, Lee S.; Maddalon, Jeffrey M.; Miner, Paul S.; Geser, Alfons

    2004-01-01

    Four kinds of abstraction for the design and analysis of fault tolerant distributed systems are discussed. These abstractions concern system messages, faults, fault masking voting, and communication. The abstractions are formalized in higher order logic, and are intended to facilitate specifying and verifying such systems in higher order theorem provers.

  12. Quantum Electronic Solids

    DTIC Science & Technology

    2012-03-07

    signal processing with smaller sizes and unique properties ... Nanoelectronics: NTs, graphene, diamond, SiC for sensing, logic & memory storage ... synthesized i-n graphene heterojunctions ... Electrical Properties of ... boundaries in polycrystalline samples ... Polycrystalline graphene can have similar (as much as 90%) electrical properties (conductance and mobility

  13. Network Computing for Distributed Underwater Acoustic Sensors

    DTIC Science & Technology

    2014-03-31

    underwater sensor network with mobility. In preparation. [3] EvoLogics (2013), Underwater Acoustic Modems (Product Information Guide ... Wireless Communications, 9(9), 2934–2944. [21] Pompili, D. and Akyildiz, I. (2010), A multimedia cross-layer protocol for underwater acoustic sensor networks ... Network Computing for Distributed Underwater Acoustic Sensors, M. Barbeau and E. Kranakis

  14. A methodology to design heuristics for model selection based on the characteristics of data: Application to investigate when the Negative Binomial Lindley (NB-L) is preferred over the Negative Binomial (NB).

    PubMed

    Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy

    2017-10-01

    Safety analysts usually use post-modeling methods, such as Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competitive distributions or models. Such metrics require all competitive distributions to be fitted to the data before any comparisons can be accomplished. Given the continuous growth in introducing new statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all the theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition into why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper addresses these issues by proposing a methodology to design heuristics for model selection based on the characteristics of data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools: (1) Monte-Carlo Simulations and (2) Machine Learning Classifiers, to design easy heuristics to predict the label of the 'most-likely-true' distribution for analyzing data. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are they easy to use without any post-modeling inputs, but, using these heuristics, the analyst can also attain useful information about why the NB-L is preferred over the NB, or vice versa, when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.
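
    The overall methodology can be sketched generically. The toy below substitutes Poisson versus negative binomial for the paper's NB versus NB-L pair (simulating NB-L would need a Lindley mixing step), uses the variance-to-mean ratio as the lone summary statistic, and learns a single threshold instead of a full machine-learning classifier; everything here is an invented stand-in for the paper's actual heuristics.

```python
import math
import random
import statistics

def sample_poisson(lam, rng):
    """Knuth's multiplication method; adequate for small lam."""
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1

def sample_negbin(r, p, rng):
    """Failures observed before the r-th success in Bernoulli(p) trials."""
    failures = 0
    for _ in range(r):
        while rng.random() > p:
            failures += 1
    return failures

def vmr(xs):
    """Variance-to-mean ratio, the descriptive summary statistic."""
    return statistics.variance(xs) / statistics.mean(xs)

# Step 1: Monte Carlo simulation of labeled datasets from both candidates.
rng = random.Random(7)
stats, labels = [], []
for _ in range(150):
    pois = [sample_poisson(4.0, rng) for _ in range(200)]
    nb = [sample_negbin(2, 0.4, rng) for _ in range(200)]
    stats += [vmr(pois), vmr(nb)]
    labels += [0, 1]

# Step 2: learn the single VMR threshold that best separates the labels.
def accuracy(th):
    return sum((s > th) == bool(l) for s, l in zip(stats, labels)) / len(labels)

threshold = max(stats, key=accuracy)
acc = accuracy(threshold)
```

    The learned rule ("call it the overdispersed family if VMR exceeds the threshold") is exactly the kind of pre-fitting heuristic the paper advocates: it needs only a summary statistic of the data, not a fitted model.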

  15. Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications

    DTIC Science & Technology

    1992-09-01

    STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a ... contrasts with a Fault Tree Analysis (FTA) which utilizes deductive logic in a "top down" approach. In FTA, a system failure is assumed and traced down ... Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment. Utilizing a pictorial approach

  16. A Logical Design of a Session Services Control Layer of a Distributed Network Architecture for SPLICE (Stock Point Logistics Integrated Communication Environment).

    DTIC Science & Technology

    1984-06-01

    Each stock point is autonomous with respect to how it implements data processing support, as long as it accommodates the Navy Supply Systems Command ... has its own data elements, files, programs, transactions, users, reports, and some have additional hardware. To augment them all and not force redesign ... programs are written to request session establishments among them using only logical addressing names (mailboxes) which are independent from physical

  17. Convolutional Neural Network on Embedded Linux(trademark) System-on-Chip: A Methodology and Performance Benchmark

    DTIC Science & Technology

    2016-05-01

    A9 CPU and 15 W for the i7 CPU. A method of accelerating this computation is by using a customized hardware unit called a field-programmable gate ... implementation of custom logic to accelerate computational workloads. This FPGA fabric, in addition to the standard programmable logic, contains 220 ... chip; field-programmable gate array. Daniel Gebhardt

  18. Flexible programmable logic module

    DOEpatents

    Robertson, Perry J.; Hutchinson, Robert L.; Pierson, Lyndon G.

    2001-01-01

    The circuit module of this invention is a VME board containing a plurality of programmable logic devices (PLDs), a controlled impedance clock tree, and interconnecting buses. The PLDs are arranged to permit systolic processing of a problem by offering wide data buses and a plurality of processing nodes. The board contains a clock reference and clock distribution tree that can drive each of the PLDs with two critically timed clock references. External clock references can be used to drive additional circuit modules all operating from the same synchronous clock reference.

  19. Convolutional Neural Network on Embedded Linux System-on-Chip: A Methodology and Performance Benchmark

    DTIC Science & Technology

    2016-05-01

    A9 CPU and 15 W for the i7 CPU. A method of accelerating this computation is by using a customized hardware unit called a field-programmable gate ... implementation of custom logic to accelerate computational workloads. This FPGA fabric, in addition to the standard programmable logic, contains 220 ... chip; field-programmable gate array. Daniel Gebhardt

  20. Talking Past Each Other? Cultural Framing of Skeptical and Convinced Logics in the Climate Change Debate

    NASA Astrophysics Data System (ADS)

    Hoffman, A.

    2011-12-01

    This paper analyzes the extent to which two institutional logics around climate change - the climate change "convinced" and climate change "skeptical" logics - are truly competing or talking past each other in a way that can be described as a logic schism. Drawing on the concept of framing from social movement theory, it uses qualitative field observations from the largest climate deniers conference in the U.S. and a dataset of almost 800 op/eds from major news outlets over a two-year period to examine how convinced and skeptical logics employ frames and issue categories to make arguments about climate change. This paper finds that the two logics are engaging in different debates on similar issues, with the former focusing on solutions while the latter debates the definition of the problem. It concludes that the debate appears to be reaching a level of polarization where one might begin to question whether meaningful dialogue and problem-solving have become unavailable to participants. The implication of such a logic schism is a shift from an integrative debate focused on addressing interests to a distributive battle over concessionary agreements, with each side pursuing its goals by demonizing the other. Avoiding such an outcome requires the activation of, as yet, dormant "broker" frames (technology, religion and national security), the redefinition of existing ones (science, economics, risk, ideology) and the engagement of effective "brokers" to deliver them.

  1. Answering Questions about Complex Events

    DTIC Science & Technology

    2008-12-19

    in their environment. To reason about events requires a means of describing, simulating, and analyzing their underlying dynamic processes. For our ... that are relevant to our goal of connecting inference and reasoning about processes to answering questions about events. We start with a ... different event and process descriptions, ontologies, and models. 2.1.1 Logical AI: In AI, formal approaches to model the ability to reason about

  2. Cosmic logic: a computational model

    NASA Astrophysics Data System (ADS)

    Vanchurin, Vitaly

    2016-02-01

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take a CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO machines' Turing numbers as input, but output either one, if the CO machines are in the same equivalence class, or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal machines that halt in finite time and immortal machines that run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.

  3. Integrated-Circuit Pseudorandom-Number Generator

    NASA Technical Reports Server (NTRS)

    Steelman, James E.; Beasley, Jeff; Aragon, Michael; Ramirez, Francisco; Summers, Kenneth L.; Knoebel, Arthur

    1992-01-01

    An integrated circuit produces 8-bit pseudorandom numbers from a specified probability distribution at a rate of 10 MHz. Using Boolean logic, the circuit implements a pseudorandom-number-generating algorithm. The circuit includes eight 12-bit pseudorandom-number generators whose outputs are uniformly distributed. 8-bit pseudorandom numbers satisfying a specified nonuniform probability distribution are generated by processing the uniformly distributed outputs of the eight 12-bit pseudorandom-number generators through a "pipeline" of D flip-flops, comparators, and memories implementing conditional probabilities on zeros and ones.
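
    In software, the nonuniform-from-uniform step that the pipeline of comparators and memories performs in hardware can be sketched as an inverse-CDF lookup table indexed by a uniform 12-bit value. The target distribution below is illustrative, not taken from the circuit.

```python
import random

def build_lookup(probs, in_bits=12):
    """Inverse-CDF table: indexing it with a uniformly distributed
    in_bits-bit value yields symbol k with probability ~ probs[k]."""
    size = 1 << in_bits
    table, cum = [], 0.0
    for k, p in enumerate(probs):
        cum += p
        while len(table) < min(round(cum * size), size):
            table.append(k)
    while len(table) < size:       # absorb any rounding shortfall
        table.append(len(probs) - 1)
    return table

target = [0.1, 0.2, 0.3, 0.4]      # illustrative 4-symbol distribution
table = build_lookup(target)
rng = random.Random(0)
draws = [table[rng.getrandbits(12)] for _ in range(100_000)]
freqs = [draws.count(k) / len(draws) for k in range(len(target))]
```

    The 12-bit input width limits how finely the target probabilities can be quantized, which is one reason the hardware uses wider uniform generators than the 8-bit output it produces.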

  4. Test functions for three-dimensional control-volume mixed finite-element methods on irregular grids

    USGS Publications Warehouse

    Naff, R.L.; Russell, T.F.; Wilson, J.D.; ,; ,; ,; ,; ,

    2000-01-01

    Numerical methods based on unstructured grids, with irregular cells, usually require discrete shape functions to approximate the distribution of quantities across cells. For control-volume mixed finite-element methods, vector shape functions are used to approximate the distribution of velocities across cells and vector test functions are used to minimize the error associated with the numerical approximation scheme. For a logically cubic mesh, the lowest-order shape functions are chosen in a natural way to conserve intercell fluxes that vary linearly in logical space. Vector test functions, while somewhat restricted by the mapping into the logical reference cube, admit a wider class of possibilities. Ideally, an error minimization procedure to select the test function from an acceptable class of candidates would be the best procedure. Lacking such a procedure, we first investigate the effect of possible test functions on the pressure distribution over the control volume; specifically, we look for test functions that allow for the elimination of intermediate pressures on cell faces. From these results, we select three forms for the test function for use in a control-volume mixed method code and subject them to an error analysis for different forms of grid irregularity; errors are reported in terms of the discrete L2 norm of the velocity error. Of these three forms, one appears to produce optimal results for most forms of grid irregularity.

  5. Logical-rule models of classification response times: a synthesis of mental-architecture, random-walk, and decision-bound approaches.

    PubMed

    Fific, Mario; Little, Daniel R; Nosofsky, Robert M

    2010-04-01

    We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli along a set of component dimensions. Those independent decisions are then combined via logical rules to determine the overall categorization response. The time course of the independent decisions is modeled via random-walk processes operating along individual dimensions. Alternative mental architectures are used as mechanisms for combining the independent decisions to implement the logical rules. We derive fundamental qualitative contrasts for distinguishing among the predictions of the rule models and major alternative models of classification RT. We also use the models to predict detailed RT-distribution data associated with individual stimuli in tasks of speeded perceptual classification. PsycINFO Database Record (c) 2010 APA, all rights reserved.
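
    The core synthesis (independent per-dimension random walks combined by a logical rule under a mental architecture) is easy to simulate. The sketch below implements one of the cases the paper considers, a conjunctive (AND) rule under a parallel exhaustive architecture; the drift, criterion, and residual-time values are invented for illustration.

```python
import random

def walk_steps(drift, criterion, rng):
    """Steps of a +/-1 random walk until |evidence| reaches the
    criterion; the count is that dimension's decision time."""
    x, t = 0, 0
    while abs(x) < criterion:
        x += 1 if rng.random() < drift else -1
        t += 1
    return t

def conjunctive_rts(n_trials, drift=0.65, criterion=15, residual=40, seed=3):
    """Parallel exhaustive AND architecture: the categorization response
    waits for the slower of the two dimension-level walks."""
    rng = random.Random(seed)
    return [residual + max(walk_steps(drift, criterion, rng),
                           walk_steps(drift, criterion, rng))
            for _ in range(n_trials)]

rts = conjunctive_rts(2000)
mean_rt = sum(rts) / len(rts)
```

    Swapping max for min gives a self-terminating disjunctive (OR) reading, and summing the two walk times gives a serial architecture; the qualitative RT-distribution contrasts the paper derives come from exactly such changes.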

  6. MATLAB Simulation of UPQC for Power Quality Mitigation Using an Ant Colony Based Fuzzy Control Technique

    PubMed Central

    Kumarasabapathy, N.; Manoharan, P. S.

    2015-01-01

    This paper proposes a new fuzzy-logic-based control scheme for the Unified Power Quality Conditioner (UPQC) for minimizing the voltage sag and total harmonic distortion in the distribution system, and consequently improving the power quality. The UPQC is a recent power electronic module which guarantees better power quality mitigation as it has both series-active and shunt-active power filters (APFs). The fuzzy logic controller has recently attracted a great deal of attention and conceptually possesses the quality of simplicity in tackling complex systems with vagueness and ambiguity. In this research, the fuzzy logic controller is utilized for the generation of the reference signal controlling the UPQC. To enable this, a systematic approach for creating the fuzzy membership functions is carried out by using an ant colony optimization technique for optimal fuzzy logic control. An exhaustive simulation study using MATLAB/Simulink is carried out to investigate and demonstrate the performance of the proposed fuzzy logic controller, and the simulation results are compared with a PI controller in terms of performance in improving the power quality by minimizing the voltage sag and total harmonic distortion. PMID:26504895
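
    The fuzzy machinery involved can be illustrated in miniature. The fragment below shows triangular membership functions and weighted-average defuzzification for a three-rule controller; the breakpoints (exactly the quantities an ant colony search would tune) and the rule outputs are invented and unrelated to the paper's actual UPQC controller.

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def control_signal(error):
    """Weighted-average (centroid-style) defuzzification of three rules."""
    rules = [
        (tri(error, -1.0, -0.5, 0.0), -0.8),  # error negative -> pull down
        (tri(error, -0.5,  0.0, 0.5),  0.0),  # error near zero -> hold
        (tri(error,  0.0,  0.5, 1.0),  0.8),  # error positive -> push up
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

out = control_signal(0.25)  # partial firing of the "hold" and "push up" rules
```

    An optimizer such as ant colony search would adjust the (a, b, c) breakpoints to minimize a power-quality cost; the inference and defuzzification steps stay the same.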

  7. All-optical logic gates and wavelength conversion via the injection locking of a Fabry-Perot semiconductor laser

    NASA Astrophysics Data System (ADS)

    Harvey, E.; Pochet, M.; Schmidt, J.; Locke, T.; Naderi, N.; Usechak, N. G.

    2013-03-01

    This work investigates the implementation of all-optical logic gates based on optical injection locking (OIL). All-optical inverting, NOR, and NAND gates are experimentally demonstrated using two distributed feedback (DFB) lasers, a multi-mode Fabry-Perot laser diode, and an optical band-pass filter. The DFB lasers are externally modulated to represent logic inputs into the cavity of the multi-mode Fabry-Perot slave laser. The input DFB (master) lasers' wavelengths are aligned with the longitudinal modes of the Fabry-Perot slave laser and their optical power is used to modulate the injection conditions in the Fabry-Perot slave laser. The optical band-pass filter is used to select a Fabry-Perot mode that is either suppressed or transmitted given the logic state of the injecting master laser signals. When the input signal(s) is (are) in the on state, injection locking, and thus the suppression of the non-injected Fabry-Perot modes, is induced, yielding a dynamic system that can be used to implement photonic logic functions. Additionally, all-optical photonic processing is achieved using the cavity-mode shift produced in the injected slave laser under external optical injection. The inverting logic case can also be used as a wavelength converter — a key component in advanced wavelength-division multiplexing networks. As a result of this experimental investigation, a more comprehensive understanding of the locking parameters involved in injecting multiple lasers into a multi-mode cavity and the logic transition time is achieved. The performance of optical logic computations and wavelength conversion has the potential for ultrafast operation, limited primarily by the photon decay rate in the slave laser.

  8. A Survey of Some Approaches to Distributed Data Base & Distributed File System Architecture.

    DTIC Science & Technology

    1980-01-01

    Figure 7-1: MUFFIN logical architecture (BUS, POD; A = A Cell, D = D Cell) ... Bus Interface ... Conventional Processor ... and Applied Mathematics (14), December 1966. [Kimbleton 79] Kimbleton, Stephen; Wang, Pearl; and Fong, Elizabeth. XNDM: An Experimental Network

  9. Long Term Change in Personal Income Distribution: Theoretical Approaches, Evidence and Explanations.

    ERIC Educational Resources Information Center

    Schultz, T. Paul

    The paper discusses various models and theories of personal income distribution inequality. The first section presents the logic for adopting one conceptual and statistical approach in measuring and analyzing income inequality and the second presents empirical evidence on income inequality from 1939 to 1970. A brief survey of the human capital…

  10. Fragments of Science: Festschrift for Mendel Sachs

    NASA Astrophysics Data System (ADS)

    Ram, Michael

    1999-11-01

    The Table of Contents for the full book PDF is as follows: * Preface * Sketches at a Symposium * For Mendel Sachs * The Constancy of an Angular Point of View * Information-Theoretic Logic and Transformation-Theoretic Logic * The Invention of the Transistor and the Realization of the Hole * Mach's Principle, Newtonian Gravitation, Absolute Space, and Einstein * The Sun, Our Variable Star * The Inconstant Sun: Symbiosis of Time Variations of Sunspots, Atmospheric Radiocarbon, Aurorae, and Tree Ring Growth * Other Worlds * Super-Classical Quantum Mechanics * A Probabilistic Approach to the Phase Problem of X-Ray Crystallography * A Nonlinear Twist on Inertia Gives Unified Electroweak Gravitation * Neutrino Oscillations * On an Incompleteness in the General-Relativistic Description of Gravitation * All Truth is One * Ideas of Physics: Correspondence between Colleagues * The Influence of the Physics and Philosophy of Einstein's Relativity on My Attitudes in Science: An Autobiography

  11. ac propulsion system for an electric vehicle

    NASA Technical Reports Server (NTRS)

    Geppert, S.

    1980-01-01

    It is pointed out that dc drives will be the logical choice for current production electric vehicles (EV). However, by the mid-80's, there is a good chance that the price and reliability of suitable high-power semiconductors will allow for a competitive ac system. The driving force behind the ac approach is the induction motor, which has specific advantages relative to a dc shunt or series traction motor. These advantages would be an important factor in the case of a vehicle for which low maintenance characteristics are of primary importance. A description of an EV ac propulsion system is provided, taking into account the logic controller, the inverter, the motor, and a two-speed transmission-differential-axle assembly. The main barrier to the employment of the considered propulsion system in EV is not any technical problem, but inverter transistor cost.

  12. Knowledge bases built on web languages from the point of view of predicate logics

    NASA Astrophysics Data System (ADS)

    Vajgl, Marek; Lukasová, Alena; Žáček, Martin

    2017-06-01

    The article evaluates formal systems created on the basis of web (ontology/concept) languages, which simplify the usual approach of knowledge representation within FOPL while sharing its expressiveness, semantic correctness, completeness and decidability. The evaluation of two of them - one based on description logic and one built on RDF model principles - identifies some shortcomings of these formal systems and, where possible, presents corrections. The possibility of building an inference system capable of deriving further knowledge over given knowledge bases, including those describing domains through giant linked domain databases, is taken into account. Moreover, the directions towards simplifying the FOPL language discussed here are evaluated with respect to their potential to become a web language fulfilling the idea of the semantic web.
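
    The kind of inference step such a system performs over a knowledge base can be sketched in a few lines (a minimal illustration with invented triples and two RDFS-style rules; not code from the article):

```python
# Minimal sketch (assumptions: toy triples, rdfs:subClassOf-style rules):
# forward-chaining inference over RDF-style (subject, predicate, object)
# triples until no new knowledge can be derived.
def infer(triples):
    """Apply type-propagation and subClassOf-transitivity to a fixed point."""
    kb = set(triples)
    changed = True
    while changed:
        changed = False
        derived = set()
        for s, p, o in kb:
            if p == "type":
                # rdf:type propagates along subClassOf edges
                for s2, p2, o2 in kb:
                    if p2 == "subClassOf" and s2 == o:
                        derived.add((s, "type", o2))
            if p == "subClassOf":
                # subClassOf is transitive
                for s2, p2, o2 in kb:
                    if p2 == "subClassOf" and s2 == o:
                        derived.add((s, "subClassOf", o2))
        new = derived - kb
        if new:
            kb |= new
            changed = True
    return kb

kb = infer([("fido", "type", "Dog"),
            ("Dog", "subClassOf", "Mammal"),
            ("Mammal", "subClassOf", "Animal")])
assert ("fido", "type", "Animal") in kb
```

    The fixed-point loop is the essential point: over giant linked databases, exactly this closure computation is what must remain decidable and efficient.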

  13. B-Plant Canyon Ventilation Control System Description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MCDANIEL, K.S.

    1999-08-31

    Project W-059 installed a new B Plant Canyon Ventilation System. Monitoring and control of the system is implemented by the Canyon Ventilation Control System (CVCS). This document describes the CVCS system components which include a Programmable Logic Controller (PLC) coupled with an Operator Interface Unit (OIU) and application software. This document also includes an Alarm Index specifying the setpoints and technical basis for system analog and digital alarms.

  14. Fault-Tolerant Sequencer Using FPGA-Based Logic Designs for Space Applications

    DTIC Science & Technology

    2013-12-01

    Acronym list excerpt: Prototype Board; SBU, single bit upset; SDK, software development kit; SDRAM, synchronous dynamic random-access memory; SEB, single-event burnout; ...current; VHDL, VHSIC hardware description language; VHSIC, very-high-speed integrated circuits; VLSI, very-large-scale integration; VQFP, very... An SEE may produce a transient pulse, called a single-event transient (SET), or even cause permanent damage to the device in the form of a burnout or gate rupture.

  15. Phoenix: Service Oriented Architecture for Information Management - Abstract Architecture Document

    DTIC Science & Technology

    2011-09-01

    implementation logic and policy if and which Information Brokering and Repository Services the information is going to be forwarded to. These service chains...descriptions are going to be retrieved. Raised Exceptions: • Exception getConsumers(sessionTrack : SessionTrack, information : Information...that extend the usefulness of the IM system as a whole. • Client • Event Notification • Filter • Information Discovery • Security • Service

  16. Assessing the Potential Value of Semantic Web Technologies in Support of Military Operations

    DTIC Science & Technology

    2003-09-01

    Teleconference). Deitel, P. J. (2002). Java, How to Program, Fourth Edition. Upper Saddle River, New Jersey: Prentice-Hall, Inc. Description Logics... how clients connect with each other to form an impromptu community. Jini™ lets programs use services in a network without knowing anything about the...another runtime program (execution engine) to determine how the computer should do it. Declarative programming is very different from the traditional

  17. Cross Polarization Interference Reduction Techniques

    DTIC Science & Technology

    1979-06-01

    Table of contents excerpt: 2.1.2 Description of the Conceptual Design ... 2-3; 2.1.2.1 Performance Measurement Circuitry ... 2-6; 2.1.2.2 Control Logic... Approved for publication. APPROVED: Frederick D. Schwandt, Project Engineer. APPROVED: Technical Director, Communications and Control Division. Griffiss AFB NY 13441

  18. Transformational derivation of programs using the Focus system

    NASA Technical Reports Server (NTRS)

    Reddy, Uday S.

    1988-01-01

    A program derivation support system called Focus is being constructed. It will formally derive programs using the paradigm of program transformation. The following issues are discussed: (1) the integration of validation and program derivation activities in the Focus system; (2) its tree-based user interface; (3) the control of search spaces in program derivation; and (4) the structure and organization of program derivation records. The inference procedures of the system are based on the integration of functional and logic programming principles. This brings about a synthesis of paradigms that were heretofore considered far apart, such as logical and executable specifications and constructive and transformational approaches to program derivation. A great emphasis has been placed, in the design of Focus, on achieving small search spaces during program derivation. The program manipulation operations such as expansion, simplification and rewriting were designed with this objective. The role of operations that are expensive in search spaces, such as folding, has been reduced. Program derivations are documented in Focus in a way that the high level descriptions of derivations are expressed only using program level information. All the meta-level information, together with dependencies between derivations of program components, is automatically recorded by the system at a lower level of description for its own use in replay.
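
    The rewriting and simplification operations emphasized above can be illustrated with a minimal sketch (the expression encoding and rule set are invented for illustration, not taken from Focus):

```python
# Toy rewriting-based simplifier: expressions are nested tuples
# (op, left, right), and a small set of algebraic rewrite rules is
# applied bottom-up. The rules here are assumptions for illustration.
def simplify(expr):
    """Recursively apply rewrite rules x+0 -> x, x*1 -> x, x*0 -> 0."""
    if not isinstance(expr, tuple):
        return expr                      # variable or constant: leave as-is
    op, a, b = expr[0], simplify(expr[1]), simplify(expr[2])
    if op == "+" and b == 0:
        return a
    if op == "*" and b == 1:
        return a
    if op == "*" and b == 0:
        return 0
    return (op, a, b)

assert simplify(("+", ("*", "x", 1), 0)) == "x"
```

    Because each rule strictly shrinks the term, no search is needed; this is the sense in which expansion, simplification and rewriting keep derivation search spaces small, in contrast to operations like folding.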

  19. Qualitative spatial logic descriptors from 3D indoor scenes to generate explanations in natural language.

    PubMed

    Falomir, Zoe; Kluth, Thomas

    2017-06-24

    The challenge of describing 3D real scenes is tackled in this paper using qualitative spatial descriptors. A key point to study is which qualitative descriptors to use and how these descriptors must be organized to produce a suitable cognitive explanation. In order to find answers, a survey test was carried out with human participants who openly described a scene containing some pieces of furniture. The data obtained in this survey are analysed and, taking them into account, the QSn3D computational approach was developed, which uses an Xbox 360 Kinect to obtain 3D data from a real indoor scene. Object features are computed on these 3D data to identify objects in indoor scenes. The object orientation is computed, and qualitative spatial relations between the objects are extracted. These qualitative spatial relations are the input to a grammar which applies saliency rules obtained from the survey study and generates cognitive natural language descriptions of scenes. Moreover, these qualitative descriptors can be expressed as first-order logical facts in Prolog for further reasoning. Finally, a validation study is carried out to test whether the descriptions provided by the QSn3D approach are human readable. The obtained results show that their acceptability is higher than 82%.
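
    The extraction of a qualitative spatial relation from metric data can be sketched as follows (a toy viewer-centred scheme with assumed relation names and no thresholds; the paper's actual descriptors are richer):

```python
# Hedged sketch: map two object centroids in a viewer-centred (x, y)
# frame to a coarse qualitative relation. Relation vocabulary and the
# dominant-axis rule are assumptions for illustration.
def spatial_relation(a, b):
    """Return a qualitative relation of object a with respect to b."""
    dx = a[0] - b[0]
    dy = a[1] - b[1]
    if abs(dx) >= abs(dy):               # horizontal displacement dominates
        return "right-of" if dx > 0 else "left-of"
    return "behind" if dy > 0 else "in-front-of"

assert spatial_relation((0.0, 1.0), (2.0, 0.5)) == "left-of"
```

    Each relation found this way can then be emitted as a Prolog-style fact, e.g. `left_of(chair, table).`, for further reasoning.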

  20. The logic of relations and the logic of management.

    PubMed

    Buntinx, W

    2008-07-01

    Increasing emphasis on financial and administrative control processes is affecting the service culture in support organisations for persons with intellectual disability. This phenomenon is currently obvious in Dutch service organisations that find themselves in transition towards more community care and, at the same time, under pressure from a new administrative and funding managerial bureaucracy. As a result, the logic of management is becoming more dominant in direct support settings and risks overshadowing the logic of relationships between staff and clients. The article presents a reflection on this phenomenon, starting from a description of service team characteristics as found in the literature. Next, findings about direct support staff (DSS) continuity are summarised from four Dutch studies. Following up on these findings, the concept of 'microsystems' is explored as a possible answer to the organisational challenges demonstrated in the studies. Team characteristics, especially team size and membership continuity for DSS, appear to be relevant factors for assuring supportive relationships and service quality in direct support teams. The structure of the primary support team proves to be of special interest. The organisational concept of 'microsystems' is explored with respect to transcending the present conflict between bureaucratic managerial pressure and the need for supportive relationships. Service organisations need to create structural conditions for the efficacy of direct support teams in terms of client relationships and relevant client outcomes. At the same time, the need for administrative and control processes cannot be denied. The concept of 'microsystems', application of a Quality of Life framework and the use of new instruments, such as the Supports Intensity Scale, can contribute to an organisational solution for the present conflicting logics of relations and management.

  1. Semantic Modelling of Digital Forensic Evidence

    NASA Astrophysics Data System (ADS)

    Kahvedžić, Damir; Kechadi, Tahar

    The reporting of digital investigation results is traditionally carried out in prose and, in a large investigation, may require successive communication of findings between different parties. Popular forensic suites aid the reporting process by storing provenance and positional data but do not automatically encode why the evidence is considered important. In this paper we introduce an evidence management methodology to encode the semantic information of evidence. A structured vocabulary of terms, an ontology, is used to model the results in a logical and predefined manner. The descriptions are application independent and automatically organised. The encoded descriptions aim to help the investigation in the task of report writing and evidence communication and can be used in addition to existing evidence management techniques.

  2. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed applying circuitry solutions based on the field programmable gate array technology (FPGA). Such (embedded) digital components should be carefully tested. In this paper, an approach for the verification of digital physical system components based on mutation testing is proposed. The reference description of the behavior of a digital component in the hardware description language (HDL) is mutated by introducing into it the most probable errors and, unlike mutants in high-level programming languages, the corresponding test case is effectively derived based on a comparison of special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
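
    The core idea of deriving a test case that distinguishes a reference description from a mutant can be sketched at the Boolean level (the functions and the injected operator mutation below are invented for illustration, not taken from the paper):

```python
# Mutation-testing sketch: inject an operator mutation into a reference
# combinational function, then exhaustively search for an input vector
# on which reference and mutant disagree (a distinguishing test).
from itertools import product

def reference(a, b, c):
    return (a and b) or c            # reference behaviour

def mutant(a, b, c):
    return (a or b) or c             # AND mutated into OR

def distinguishing_test(f, g, n_inputs):
    """Return the first input vector on which f and g disagree, else None."""
    for vec in product([False, True], repeat=n_inputs):
        if f(*vec) != g(*vec):
            return vec
    return None

test = distinguishing_test(reference, mutant, 3)
assert test is not None and reference(*test) != mutant(*test)
```

    For real HDL designs the exhaustive search is replaced by the scalable specification/mutant comparison the paper describes, but the goal is the same: a test that exposes the injected error.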

  3. Scalable global grid catalogue for Run3 and beyond

    NASA Astrophysics Data System (ADS)

    Martinez Pedreira, M.; Grigoras, C.; ALICE Collaboration

    2017-10-01

    The AliEn (ALICE Environment) file catalogue is a global unique namespace providing mapping between a UNIX-like logical name structure and the corresponding physical files distributed over 80 storage elements worldwide. Powerful search tools and hierarchical metadata information are integral parts of the system and are used by the Grid jobs as well as local users to store and access all files on the Grid storage elements. The catalogue has been in production since 2005 and over the past 11 years has grown to more than 2 billion logical file names. The backend is a set of distributed relational databases, ensuring smooth growth and fast access. Due to the anticipated fast future growth, we are looking for ways to enhance the performance and scalability by simplifying the catalogue schema while keeping the functionality intact. We investigated different backend solutions, such as distributed key value stores, as replacement for the relational database. This contribution covers the architectural changes in the system, together with the technology evaluation, benchmark results and conclusions.
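
    The logical-to-physical mapping at the heart of such a catalogue can be sketched as follows (an assumed toy interface, with a plain dict standing in for the distributed backend; this is not AliEn code, and the storage URLs are invented):

```python
# Sketch of a file catalogue: a UNIX-like logical file name (LFN) maps
# to one or more physical file names (PFNs, i.e. replicas on storage
# elements). A dict stands in for the distributed key-value backend.
class FileCatalogue:
    def __init__(self):
        self._entries = {}

    def register(self, lfn, pfn):
        """Attach one more physical replica to a logical file name."""
        self._entries.setdefault(lfn, []).append(pfn)

    def lookup(self, lfn):
        """Return all known replicas of a logical file name."""
        return self._entries.get(lfn, [])

cat = FileCatalogue()
cat.register("/alice/data/run1/file.root", "root://se1.cern.ch//f1")
cat.register("/alice/data/run1/file.root", "root://se2.gsi.de//f1")
assert len(cat.lookup("/alice/data/run1/file.root")) == 2
```

    The simplicity of this lookup pattern is what makes key-value stores a plausible replacement for the relational backend at billion-entry scale.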

  4. An Integrated Specification and Verification Environment for Component-Based Architectures of Large-Scale Distributed Systems

    DTIC Science & Technology

    2009-05-26

    Acronym list excerpt: HW Interrupt; DFS, Dynamic Frequency Selection; TPC, Transmit Power Control; MPX Hub; Power Supply; Init/Reset; A/D...values of several variables: from IN_0_DAT when the Mailbox forwards data supplied by Client 0, from OUT1DAT when the conditions on the ready flags are...logically implies ψ ∨ φ, and also φ logically implies ψ ∨ φ; (b) if for all i we have that φ_{i+1} is of the form φ_i ∨ ψ, then the chain

  5. Logical account of a terminological tool

    NASA Astrophysics Data System (ADS)

    Bresciani, Paolo

    1991-03-01

    YAK (Yet Another Krapfen) is a hybrid knowledge representation environment following the tradition of KL-ONE and KRYPTON. In its terminological box (TBOX), concepts and roles are described by means of a language called KFL that, even if inspired by FL-, captures a different set of descriptions, especially allowing the formulation of structured roles, which aim to be more adequate for the representational goal. KFL turns out to have a valid and complete calculus for the notion of subsumption, and a tractable algorithm that realizes it is available. In the present paper it is shown how the semantics of a sufficiently significant subset of KFL can be described in terms of standard first-order logic semantics. It is thus possible to develop a notion of 'relation' between KFL and a full first-order logic language formulated ad hoc and usable as a formalization of an assertional box (ABOX). We then use this notion to justify the use of the terminological classification algorithm of the TBOX of YAK as part of the deduction machinery of the ABOX. In this way, the whole hybrid environment can take real and consistent advantage both of the TBOX, with its tractable classification algorithm, and of the ABOX, with its wider expressive capability.
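
    For purely conjunctive concept descriptions, a structural subsumption test reduces to set inclusion, which hints at why a tractable algorithm is possible (a toy sketch with invented concept names; KFL itself additionally handles structured roles):

```python
# Toy structural subsumption: a concept is a set of atomic conjuncts.
# D subsumes C iff every conjunct required by D also appears in C
# (C is at least as specific as D).
def subsumes(d, c):
    """True iff concept D subsumes concept C."""
    return set(d) <= set(c)

person = {"Person"}
parent = {"Person", "HasChild"}
assert subsumes(person, parent)       # every Parent is a Person
assert not subsumes(parent, person)   # not every Person is a Parent
```

    A terminological classifier repeats this test pairwise to place each new concept in the subsumption hierarchy.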

  6. Do the Particles of an Ideal Gas Collide?

    ERIC Educational Resources Information Center

    Lesk, Arthur M.

    1974-01-01

    Describes the collisional properties as a logically essential component of the ideal gas model since an actual intraparticle process cannot support observable anisotropic velocity distributions without collisions taken into account. (CC)

  7. Extending XNAT Platform with an Incremental Semantic Framework

    PubMed Central

    Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael

    2017-01-01

    Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities, however, distributed data integration is still difficult because of the need of explicit agreements for disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but its application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia related diseases. PMID:28912709

  8. Extending XNAT Platform with an Incremental Semantic Framework.

    PubMed

    Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael

    2017-01-01

    Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities, however, distributed data integration is still difficult because of the need of explicit agreements for disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but its application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia related diseases.

  9. Analytical tools in accelerator physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Litvinenko, V.N.

    2010-09-01

    This paper is a subset of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues in the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A. A. Kolomensky and A. N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory materials following Classical Theory of Fields by L. D. Landau and E. M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.

  10. Parallel Logic Programming Architecture

    DTIC Science & Technology

    1990-04-01

    Section 3.1. 3.1. A STATIC ALLOCATION SCHEME (SAS) Methods that have been used for decomposing distributed problems in artificial intelligence...multiple agents, knowledge organization and allocation, and cooperative parallel execution. These difficulties are common to distributed artificial ...for the following reasons. First, intelligent backtracking requires much more bookkeeping and is therefore more costly during consult-time and during

  11. The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory

    NASA Astrophysics Data System (ADS)

    Frey, Kimberly

    The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. 
Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.
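
    As an aside, the standard formal contrast between classical and quantum logic (a textbook fact, not drawn from the dissertation itself) is that in the lattice of closed subspaces of a Hilbert space, which models quantum propositions, distributivity fails and only a weaker law survives:

```latex
% Distributivity, valid classically, fails for quantum propositions:
%   a \wedge (b \vee c) \neq (a \wedge b) \vee (a \wedge c) \quad \text{in general,}
% and what survives is the orthomodular law:
\[
  a \le b \;\Longrightarrow\; b = a \vee (b \wedge a^{\perp}).
\]
```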

  12. "This Is How We Work Here": Informal Logic and Social Order in Primary Health Care Services in Mexico City.

    PubMed

    Saavedra, Nayelhi Itandehui; Berenzon, Shoshana; Galván, Jorge

    2017-07-01

    People who work in health care facilities participate in a shared set of tacit agreements, attitudes, habits, and behaviors that contribute to the functioning of those institutions, but that can also cause conflict. This phenomenon has been addressed tangentially in the study of bureaucratic practices in governmental agencies, but it has not been carefully explored in the specific context of public health care centers. To this end, we analyzed a series of encounters among staff and patients, as well as the situations surrounding the services offered, in public primary care health centers in Mexico City, based on Erving Goffman's concepts of social order, encounter, and situation, and on the concepts of formal and informal logic. In a descriptive study over the course of 2 years, we carried out systematic observations in 19 health centers and conducted interviews with medical, technical, and administrative staff, and psychologists, social workers, and patients. We recorded these observations in field notes and performed reflexive analysis with readings on three different levels. Interviews were recorded, transcribed, and analyzed through identification of thematic categories and subcategories. Information related to encounters and situations from field notes and interviews was selected to triangulate the materials. We found the social order prevailing among staff to be based on a combination of status markers, such as educational level, seniority, and employee versus contractor status, which define the distribution of workloads, material resources, and space. Although this system generates conflicts, it also contributes to the smooth functioning of the health centers. The daily encounters and situations in all of these health centers allow for a set of informal practices that provide a temporary resolution of the contradictions posed by the institution for its workers.

  13. Logical Design of a Decision Support System to Forecast Technology, Prices and Costs for the National Communications System.

    DTIC Science & Technology

    1984-09-01

    Table of contents excerpt: IN SOFTWARE DESIGN ... 39; P. PROCESS DESCRIPTIONS ... 43; 1. Model Building ... 43; 2. Model Management ... manager to model a wide variety of technology, price and cost situations without the associated overhead imposed by multiple application-specific systems...The Manager of the National Communications System (NCS) has been tasked by the National Security Telecommunications Policy of 3 August 1983 with

  14. Detailed description of the HP-9825A HFRMP trajectory processor (TRAJ)

    NASA Technical Reports Server (NTRS)

    Kindall, S. M.; Wilson, S. W.

    1979-01-01

    The computer code for the trajectory processor of the HP-9825A High Fidelity Relative Motion Program is described in detail. The processor is a 12-degrees-of-freedom trajectory integrator which can be used to generate digital and graphical data describing the relative motion of the Space Shuttle Orbiter and a free-flying cylindrical payload. Coding standards and flow charts are given and the computational logic is discussed.

  15. State criminal justice telecommunications (STACOM). Volume 4: Network design software user's guide

    NASA Technical Reports Server (NTRS)

    Lee, J. J.

    1977-01-01

    A user's guide to the network design program is presented. The program is written in FORTRAN V and implemented on a UNIVAC 1108 computer under the EXEC-8 operating system which enables the user to construct least-cost network topologies for criminal justice digital telecommunications networks. A complete description of program features, inputs, processing logic, and outputs is presented, and a sample run and a program listing are included.
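
    One classic ingredient of least-cost network topology construction is a minimum spanning tree over candidate links; a sketch using Kruskal's algorithm follows (link costs are invented, and the STACOM program's actual design procedure is more elaborate):

```python
# Kruskal's minimum spanning tree with union-find: repeatedly accept the
# cheapest link that does not close a cycle. A standard building block
# for least-cost network topology design.
def kruskal(nodes, links):
    """links: iterable of (cost, u, v); returns the list of chosen links."""
    parent = {n: n for n in nodes}

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]   # path halving
            n = parent[n]
        return n

    tree = []
    for cost, u, v in sorted(links):
        ru, rv = find(u), find(v)
        if ru != rv:                        # no cycle: accept the link
            parent[ru] = rv
            tree.append((cost, u, v))
    return tree

tree = kruskal("ABCD", [(4, "A", "B"), (1, "B", "C"),
                        (3, "A", "C"), (2, "C", "D")])
assert sum(c for c, _, _ in tree) == 6      # links B-C, C-D, A-C
```

    Real criminal-justice network design adds traffic, reliability and tariff constraints on top of this skeleton, which is why a dedicated program was needed.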

  16. Multi-variants synthesis of Petri nets for FPGA devices

    NASA Astrophysics Data System (ADS)

    Bukowiec, Arkadiusz; Doligalski, Michał

    2015-09-01

    A new method of synthesis of application-specific logic controllers for FPGA devices is presented. The control algorithm is specified by a control interpreted Petri net (PT type), which allows parallel processes to be specified in an easy way. The Petri net is decomposed into state-machine-type subnets; in this case, each subnet represents one parallel process. For this purpose, Petri net coloring algorithms are applied. Two approaches to such decomposition are presented: with doublers of macroplaces, or with one global wait place. Next, the subnets are implemented as a two-level logic circuit of the controller. The levels of the logic circuit are obtained as a result of its architectural decomposition. The first-level combinational circuit is responsible for generating the next places, and the second-level decoder is responsible for generating output symbols. Two variants of such circuits are worked out: with one shared operational memory, or with many flexible distributed memories as a decoder. Variants of Petri net decomposition and structures of logic circuits can be combined together without any restrictions, which leads to four variants of multi-variant synthesis.
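
    The underlying place/transition semantics can be sketched compactly (a toy fork transition starting two parallel processes; place names are invented and this is not code from the paper):

```python
# Petri net firing rule: a transition is enabled when every input place
# holds at least one token; firing consumes one token per input place
# and produces one token per output place.
def fire(marking, transition):
    """transition = (input_places, output_places); returns the new
    marking, or None if the transition is not enabled."""
    ins, outs = transition
    if any(marking.get(p, 0) < 1 for p in ins):
        return None
    m = dict(marking)
    for p in ins:
        m[p] -= 1
    for p in outs:
        m[p] = m.get(p, 0) + 1
    return m

# fork: one token in p1 starts two parallel processes p2 and p3
m = fire({"p1": 1}, (("p1",), ("p2", "p3")))
assert m == {"p1": 0, "p2": 1, "p3": 1}
```

    Decomposition into state-machine subnets amounts to splitting such markings so that each subnet carries exactly one token at a time.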

  17. On the Computing Potential of Intracellular Vesicles

    PubMed Central

    Mayne, Richard; Adamatzky, Andrew

    2015-01-01

    Collision-based computing (CBC) is a form of unconventional computing in which travelling localisations represent data and conditional routing of signals determines the output state; collisions between localisations represent logical operations. We investigated patterns of Ca2+-containing vesicle distribution within a live organism, slime mould Physarum polycephalum, with confocal microscopy and observed them colliding regularly. Vesicles travel down cytoskeletal ‘circuitry’ and their collisions may result in reflection, fusion or annihilation. We demonstrate through experimental observations that naturally-occurring vesicle dynamics may be characterised as a computationally-universal set of Boolean logical operations and present a ‘vesicle modification’ of the archetypal CBC ‘billiard ball model’ of computation. We proceed to discuss the viability of intracellular vesicles as an unconventional computing substrate in which we delineate practical considerations for reliable vesicle ‘programming’ in both in vivo and in vitro vesicle computing architectures and present optimised designs for both single logical gates and combinatorial logic circuits based on cytoskeletal network conformations. The results presented here demonstrate the first characterisation of intracellular phenomena as collision-based computing and hence the viability of biological substrates for computing. PMID:26431435
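
    At the logical level, the billiard-ball reading of a collision can be sketched as a truth table over output paths (an idealised abstraction of the classic two-input collision gate, not the vesicle model itself):

```python
# Collision-based AND: presence/absence of a travelling localisation on
# each input encodes 1/0; one output path is reachable only via a
# collision, so it carries a ball exactly when both inputs do.
def collision_and(a, b):
    """Output paths of an idealised two-input collision gate."""
    return {
        "and_out": a and b,          # path reached only via a collision
        "a_alone": a and not b,      # a passes through undisturbed
        "b_alone": b and not a,      # b passes through undisturbed
    }

out = collision_and(True, True)
assert out["and_out"] and not out["a_alone"]
```

    Since the side paths compute AND-NOT, the gate set is functionally complete, which is the sense in which such collisions are computationally universal.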

  18. CRANS - CONFIGURABLE REAL-TIME ANALYSIS SYSTEM

    NASA Technical Reports Server (NTRS)

    Mccluney, K.

    1994-01-01

    In a real-time environment, the results of changes or failures in a complex, interconnected system need evaluation quickly. Tabulations showing the effects of changes and/or failures of a given item in the system are generally only useful for a single input, and only with regard to that item. Subsequent changes become harder to evaluate as combinations of failures produce a cascade effect. When confronted by multiple indicated failures in the system, it becomes necessary to determine a single cause. In this case, failure tables are not very helpful. CRANS, the Configurable Real-time ANalysis System, can interpret a logic tree, constructed by the user, describing a complex system and determine the effects of changes and failures in it. Items in the tree are related to each other by Boolean operators. The user is then able to change the state of these items (ON/OFF FAILED/UNFAILED). The program then evaluates the logic tree based on these changes and determines any resultant changes to other items in the tree. CRANS can also search for a common cause for multiple item failures, and allow the user to explore the logic tree from within the program. A "help" mode and a reference check provide the user with a means of exploring an item's underlying logic from within the program. A commonality check determines single point failures for an item or group of items. Output is in the form of a user-defined matrix or matrices of colored boxes, each box representing an item or set of items from the logic tree. Input is via mouse selection of the matrix boxes, using the mouse buttons to toggle the state of the item. CRANS is written in C-language and requires the MIT X Window System, Version 11 Revision 4 or Revision 5. It requires 78K of RAM for execution and a three button mouse. It has been successfully implemented on Sun4 workstations running SunOS, HP9000 workstations running HP-UX, and DECstations running ULTRIX. 
No executable is provided on the distribution medium; however, a sample makefile is included. Sample input files are also included. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. This program was developed in 1992.
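
    The kind of cascading evaluation CRANS performs over a Boolean logic tree can be sketched as follows (the tree contents and item names are invented for illustration; this is not CRANS code):

```python
# Evaluate a Boolean logic tree of items related by AND/OR, so that
# toggling a leaf item's state (True = operational) cascades up to every
# dependent item, as CRANS does interactively.
def evaluate(tree, states):
    """tree: item -> ('AND'|'OR', [children]) for derived items;
    leaf items take their value directly from `states`."""
    def value(item):
        if item not in tree:
            return states[item]
        op, children = tree[item]
        vals = [value(c) for c in children]
        return all(vals) if op == "AND" else any(vals)
    return {item: value(item) for item in tree}

tree = {"bus_A": ("OR", ["gen1", "gen2"]),          # redundant supply
        "system": ("AND", ["bus_A", "controller"])}
ok = evaluate(tree, {"gen1": True, "gen2": False, "controller": True})
assert ok["system"]                                 # one generator suffices
failed = evaluate(tree, {"gen1": False, "gen2": False, "controller": True})
assert not failed["system"]                         # both down: cascade up
```

    Searching for a common cause, as CRANS does, then amounts to finding a single leaf whose failure flips all of the observed failed items.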

  19. Different methodologies to quantify uncertainties of air emissions.

    PubMed

    Romano, Daniela; Bernetti, Antonella; De Lauretis, Riccardo

    2004-10-01

    Characterization of the uncertainty associated with air emission estimates is of critical importance, especially in the compilation of air emission inventories. In this paper, two different theories are discussed and applied to evaluate air emissions uncertainty. In addition to numerical analysis, which is also recommended in the framework of the United Nations Convention on Climate Change guidelines with reference to Monte Carlo and bootstrap simulation models, fuzzy analysis is also proposed. The methodologies are discussed and applied to an Italian case study. Air concentration values are measured from two electric power plants: a coal plant consisting of two boilers and a fuel oil plant consisting of four boilers; the pollutants considered are sulphur dioxide (SO2), nitrogen oxides (NOx), carbon monoxide (CO) and particulate matter (PM). Monte Carlo, bootstrap and fuzzy methods have been applied to estimate the uncertainty of these data. Regarding Monte Carlo, the most accurate results apply to Gaussian distributions; a good approximation is also observed for other distributions with almost regular features, whether positively or negatively asymmetrical. Bootstrap, on the other hand, gives a good uncertainty estimation for irregular and asymmetrical distributions. The logic of fuzzy analysis, in which data are represented as vague and indefinite, in opposition to the traditional conception of neatness, certain classification and exactness of the data, follows a different approach. In addition to randomness (stochastic variability) alone, fuzzy theory deals with imprecision (vagueness) of data. The fuzzy variance of the data set was calculated; the results cannot be directly compared with empirical data, but the overall performance of the theory is analysed.
Fuzzy theory may appear more suitable for qualitative reasoning than for a quantitative estimation of uncertainty, but it is well suited when little information and few measurements are available and when the distributions of the data are not properly known.
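
The contrast the abstract draws between Monte Carlo (parametric resampling from a fitted distribution) and bootstrap (non-parametric resampling of the empirical data) can be sketched as follows. The synthetic measurements below stand in for the paper's plant data; the sample size, mean, and spread are illustrative assumptions.

```python
# Sketch: Monte Carlo vs. bootstrap estimates of the uncertainty of a
# mean emission factor, using synthetic stand-in data.
import random
import statistics

random.seed(0)
measurements = [random.gauss(100.0, 15.0) for _ in range(50)]  # e.g. SO2, mg/m3

# Monte Carlo: fit a Gaussian model to the data, then resample from the model.
mu = statistics.mean(measurements)
sigma = statistics.stdev(measurements)
mc_means = [statistics.mean([random.gauss(mu, sigma) for _ in range(50)])
            for _ in range(1000)]

# Bootstrap: resample the empirical data directly, with no distributional
# assumption, which is why it copes better with irregular shapes.
bs_means = [statistics.mean(random.choices(measurements, k=50))
            for _ in range(1000)]

print(round(statistics.stdev(mc_means), 2))  # Monte Carlo std. error of the mean
print(round(statistics.stdev(bs_means), 2))  # bootstrap std. error of the mean
```

For Gaussian data like this both estimates agree closely; for skewed or irregular data the two would diverge, which mirrors the paper's comparison.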

  20. Comparing host and target environments for distributed Ada programs

    NASA Technical Reports Server (NTRS)

    Paulk, Mark C.

    1986-01-01

    The Ada programming language provides a means of specifying logical concurrency by using multitasking. Extending the Ada multitasking concurrency mechanism into a physically concurrent distributed environment which imposes its own requirements can lead to incompatibilities. These problems are discussed. Using distributed Ada for a target system may be appropriate, but when using the Ada language in a host environment, a multiprocessing model may be more suitable than retargeting an Ada compiler for the distributed environment. The tradeoffs between multitasking on distributed targets and multiprocessing on distributed hosts are discussed. Comparisons of the multitasking and multiprocessing models indicate different areas of application.

  1. The Accuracy Of Fuzzy Sugeno Method With Anthropometry On Determination Of Nutritional Patient Status

    NASA Astrophysics Data System (ADS)

    Syahputra, Dinur; Tulus; Sawaluddin

    2017-12-01

    Anthropometry is one of the methods that can be used to assess nutritional status. In general, anthropometry is defined as the measurement of the body in relation to nutrition; it is therefore considered across various age levels and nutritional levels. Nutritional status is a description of the balance between an individual's nutritional intake and the needs of the body. Fuzzy logic admits degrees of vagueness between right and wrong, that is, values between 0 and 1. The Sugeno method is used because the calculation of nutritional status has so far been carried out by anthropometry alone. Information technology is currently growing in every aspect, including the calculation of data taken from anthropometry. In this case the calculation can use the fuzzy Sugeno method, in order to determine the accuracy obtained. The results obtained using fuzzy Sugeno integrated with anthropometry show an accuracy of 81.48%.
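
A zero-order Sugeno inference over a single anthropometric input can be sketched as below. The membership functions, BMI breakpoints, and crisp rule outputs are illustrative assumptions, not the fuzzy sets used in the paper.

```python
# Zero-order Sugeno sketch: fuzzify a BMI value, fire each rule, and take
# the firing-strength-weighted average of crisp rule consequents.

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def nutritional_score(bmi):
    # Rule firing strengths over hypothetical BMI fuzzy sets.
    w = [
        tri(bmi, 10, 15, 18.5),   # "underweight"
        tri(bmi, 17, 22, 25),     # "normal"
        tri(bmi, 24, 28, 40),     # "overweight"
    ]
    z = [25.0, 75.0, 40.0]        # assumed crisp consequent per rule
    s = sum(w)
    if s == 0.0:
        raise ValueError("BMI outside all fuzzy sets")
    # Zero-order Sugeno output: weighted average of the consequents.
    return sum(wi * zi for wi, zi in zip(w, z)) / s

print(nutritional_score(22.0))   # 75.0: only the "normal" rule fires fully
```

The paper's 81.48% accuracy figure would come from comparing such fuzzy outputs against the status determined by conventional anthropometric calculation.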

  2. Modifications to the streamtube curvature program. Volume 1: Program modifications and user's manual. [user manuals (computer programs) for transonic flow of nacelles and intake systems of turbofan engines

    NASA Technical Reports Server (NTRS)

    Ferguson, D. R.; Keith, J. S.

    1975-01-01

    The improvements which have been incorporated in the Streamtube Curvature Program to enhance both its computational and diagnostic capabilities are described. Detailed descriptions are given of the revisions incorporated to more reliably handle the jet stream-external flow interaction at trailing edges. Also presented are the augmented boundary layer procedures and a variety of other program changes relating to program diagnostics and extended solution capabilities. An updated User's Manual, which includes information on the computer program operation, usage, and logical structure, is presented. User documentation includes an outline of the general logical flow of the program and detailed instructions for program usage and operation. From the standpoint of the programmer, the overlay structure is described. The input data, output formats, and diagnostic printouts are covered in detail and illustrated with three typical test cases.

  3. A description of the thruster attitude control simulation and its application to the HEAO-C study

    NASA Technical Reports Server (NTRS)

    Brandon, L. B.

    1971-01-01

    During the design and evaluation of a reaction control system (RCS), it is desirable to have a digital computer program simulating vehicle dynamics, disturbance torques, control torques, and RCS logic. The thruster attitude control simulation (TACS) is just such a computer program. The TACS is a relatively sophisticated digital computer program that includes all the major parameters involved in the attitude control of a vehicle using an RCS for control. It includes the effects of gravity gradient torques and HEAO-C aerodynamic torques so that realistic runs can be made in the areas of fuel consumption and engine actuation rates. Also, the program is general enough that any engine configuration and logic scheme can be implemented in a reasonable amount of time. The results of the application of the TACS in the HEAO-C study are included.

  4. The Pitfalls of Thesaurus Ontologization – the Case of the NCI Thesaurus

    PubMed Central

    Schulz, Stefan; Schober, Daniel; Tudose, Ilinca; Stenzhorn, Holger

    2010-01-01

    Thesauri that are “ontologized” into OWL-DL semantics are highly amenable to modeling errors resulting from falsely interpreting existential restrictions. We investigated the OWL-DL representation of the NCI Thesaurus (NCIT) in order to assess the correctness of existential restrictions. A random sample of 354 axioms using the someValuesFrom operator was taken. According to a rating performed by two domain experts, roughly half of these examples, and in consequence more than 76,000 axioms in the OWL-DL version, make incorrect assertions if interpreted according to description logics semantics. These axioms therefore constitute a huge source for unintended models, rendering most logic-based reasoning unreliable. After identifying typical error patterns we discuss some possible improvements. Our recommendation is to either amend the problematic axioms in the OWL-DL formalization or to consider some less strict representational format. PMID:21347074

  5. Quantum theory as the most robust description of reproducible experiments

    NASA Astrophysics Data System (ADS)

    De Raedt, Hans; Katsnelson, Mikhail I.; Michielsen, Kristel

    2014-08-01

    It is shown that the basic equations of quantum theory can be obtained from a straightforward application of logical inference to experiments for which there is uncertainty about individual events and for which the frequencies of the observed events are robust with respect to small changes in the conditions under which the experiments are carried out. "There is no quantum world. There is only an abstract physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature" [45]. "Physics is to be regarded not so much as the study of something a priori given, but rather as the development of methods of ordering and surveying human experience. In this respect our task must be to account for such experience in a manner independent of individual subjective judgment and therefore objective in the sense that it can be unambiguously communicated in ordinary human language" [46]. "The physical content of quantum mechanics is exhausted by its power to formulate statistical laws governing observations under conditions specified in plain language" [46]. The first two sentences of the first quote may be read as a suggestion to dispose of, in Mermin's words [47], the "bad habit" of taking mathematical abstractions as the reality of the events (in the everyday sense of the word) that we experience through our senses. Although widely circulated, these sentences are reported by Petersen [45] and there is doubt that Bohr actually used this wording [48]. The last two sentences of the first quote and the second quote suggest that we should try to describe human experiences (confined to the realm of scientific inquiry) in a manner and language which is unambiguous and independent of individual subjective judgment.
Of course, the latter should not be construed to imply that the observed phenomena are independent of the choices made by the individual(s) in performing the scientific experiment [49]. The third quote suggests that quantum theory is a powerful language to describe a certain class of statistical experiments but remains vague about the properties of the class. Similar views were expressed by other fathers of quantum mechanics, e.g., Max Born and Wolfgang Pauli [50]. They can be summarized as "Quantum theory describes our knowledge of the atomic phenomena rather than the atomic phenomena themselves". Our aim is, in a sense, to replace the philosophical components of these statements by well-defined mathematical concepts and to carefully study their relevance for physical phenomena. Specifically, by applying the general formalism of logical inference to a well-defined class of statistical experiments, the present paper shows that quantum theory is indeed the kind of language envisaged by Bohr. Theories such as Newtonian mechanics, Maxwell's electrodynamics, and Einstein's (general) relativity are deductive in character. Starting from a few axioms, abstracted from experimental observations and additional assumptions about the irrelevance of a large number of factors for the description of the phenomena of interest, deductive reasoning is used to prove or disprove unambiguous statements, propositions, about the mathematical objects which appear in the theory. The method of deductive reasoning conforms to the Boolean algebra of propositions. The deductive, reductionist methodology has the appealing feature that one can be sure that the propositions are either right or wrong, and disregarding the possibility that some of the premises on which the deduction is built may not apply, there is no doubt that the conclusions are correct.
Clearly, these theories successfully describe a wide range of physical phenomena in a manner and language which is unambiguous and independent of the individual. At the same time, the construction of a physical theory, and a scientific theory in general, from "first principles" is, for sure, not something self-evident, and not even safe. Our basic knowledge always starts from the middle, that is, from the world of macroscopic objects. According to Bohr, the quantum theoretical description crucially depends on the existence of macroscopic objects which can be used as measuring devices. For an extensive analysis of the quantum measurement process from a dynamical point of view see Ref. [51]. Most importantly, the description of the macroscopic level is robust, that is, essentially independent of the underlying "more fundamental" picture [2]. As will be seen later, formalizing the notion of "robustness" is key to deriving the basic equations of quantum theory from the general framework of logical inference. Key assumptions of the deductive approach are that the mathematical description is a complete description of the experiment under consideration and that there is no uncertainty about the conditions under which the experiment is carried out. If the theory does not fully account for all the relevant aspects of the phenomenon that we wish to describe, the general rules by which we deduce whether a proposition is true or false can no longer be used. However, in these circumstances, we can still resort to logical inference [37-41] to find useful answers to unambiguous questions. Of course, in general it will no longer be possible to say whether a proposition is true or false, hence there will always remain a residue of doubt.
However, as will be shown, the description obtained through logical inference may also be unambiguous and independent of the individual. In the present paper, we demonstrate that the basic equations of quantum theory directly follow from logical inference applied to experiments in which there is uncertainty about individual events, under the stringent condition that certain properties of the collection of events are reproducible, meaning that they are robust with respect to small changes in the conditions under which the experiments are carried out.

  6. A distributed parallel storage architecture and its potential application within EOSDIS

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Tierney, Brian; Feuquay, Jay; Butzer, Tony

    1994-01-01

    We describe the architecture, implementation, and use of a scalable, high-performance, distributed-parallel data storage system developed in the ARPA-funded MAGIC gigabit testbed. A collection of wide-area distributed disk servers operates in parallel to provide logical block-level access to large data sets. Operated primarily as a network-based cache, the architecture supports cooperation among independently owned resources to provide fast, large-scale, on-demand storage to support data handling, simulation, and computation.

  7. Design Time Optimization for Hardware Watermarking Protection of HDL Designs

    PubMed Central

    Castillo, E.; Morales, D. P.; García, A.; Parrilla, L.; Todorovich, E.; Meyer-Baese, U.

    2015-01-01

    HDL-level design offers important advantages for the application of watermarking to IP cores, but its complexity also requires tools automating these watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, has been employed for evaluating the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for signature distribution has not only extended and eased the applicability of this IPP technique, but it has also improved the signature hosting process itself. Three algorithms were studied in order to develop this automated tool. The selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to protect. A 1D-DWT core and MD5 and SHA1 digital signatures were used in order to illustrate the benefits of the new tool and its optimization related to the extraction logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time and also saving designer effort and time. PMID:25861681

  8. A logical model of cooperating rule-based systems

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.

    1989-01-01

    A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.

  9. The past, present and future of cyber-physical systems: a focus on models.

    PubMed

    Lee, Edward A

    2015-02-26

    This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical.

  10. The Past, Present and Future of Cyber-Physical Systems: A Focus on Models

    PubMed Central

    Lee, Edward A.

    2015-01-01

    This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical. PMID:25730486

  11. MicrO: an ontology of phenotypic and metabolic characters, assays, and culture media found in prokaryotic taxonomic descriptions.

    PubMed

    Blank, Carrine E; Cui, Hong; Moore, Lisa R; Walls, Ramona L

    2016-01-01

    MicrO is an ontology of microbiological terms, including prokaryotic qualities and processes, material entities (such as cell components), chemical entities (such as microbiological culture media and medium ingredients), and assays. The ontology was built to support the ongoing development of a natural language processing algorithm, MicroPIE (or, Microbial Phenomics Information Extractor). During the MicroPIE design process, we realized there was a need for a prokaryotic ontology which would capture the evolutionary diversity of phenotypes and metabolic processes across the tree of life, capture the diversity of synonyms and information contained in the taxonomic literature, and relate microbiological entities and processes to terms in a large number of other ontologies, most particularly the Gene Ontology (GO), the Phenotypic Quality Ontology (PATO), and the Chemical Entities of Biological Interest (ChEBI). We thus constructed MicrO to be rich in logical axioms and synonyms gathered from the taxonomic literature. MicrO currently has ~14,550 classes (~2,550 of which are new, the remainder being microbiologically-relevant classes imported from other ontologies), connected by ~24,130 logical axioms (5,446 of which are new), and is available at http://purl.obolibrary.org/obo/MicrO.owl and on the project website at https://github.com/carrineblank/MicrO. MicrO has been integrated into the OBO Foundry Library (http://www.obofoundry.org/ontology/micro.html), so that other ontologies can borrow and re-use classes. Term requests and user feedback can be made using MicrO's Issue Tracker in GitHub. We designed MicrO such that it can support the ongoing and future development of algorithms that can leverage the controlled vocabulary and logical inference power provided by the ontology.
By connecting microbial classes with large numbers of chemical entities, material entities, biological processes, molecular functions, and qualities using a dense array of logical axioms, we intend MicrO to be a powerful new tool to increase the computing power of bioinformatics tools such as the automated text mining of prokaryotic taxonomic descriptions using natural language processing. We also intend MicrO to support the development of new bioinformatics tools that aim to develop new connections between microbial phenotypes and genotypes (i.e., the gene content in genomes). Future ontology development will include incorporation of pathogenic phenotypes and prokaryotic habitats.

  12. Performance bounds on parallel self-initiating discrete-event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The use of massively parallel architectures to execute discrete-event simulations of what are termed self-initiating models is considered. A logical process in a self-initiating model schedules its own state re-evaluation times, independently of any other logical process, and sends its new state to other logical processes following the re-evaluation. The interest is in the effects of that communication on synchronization. The performance of various synchronization protocols is considered by deriving upper and lower bounds on optimal performance, upper bounds on Time Warp's performance, and lower bounds on the performance of a new conservative protocol. The analysis of Time Warp includes the overhead costs of state-saving and rollback. The analysis points out sufficient conditions for the conservative protocol to outperform Time Warp. The analysis also quantifies the sensitivity of performance to message fan-out, lookahead ability, and the probability distributions underlying the simulation.
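
The self-initiating behavior described above, where each logical process schedules only its own re-evaluations and then broadcasts its new state, can be sketched as a simple sequential event loop. The state update and the process periods below are hypothetical; a real parallel protocol such as Time Warp would add state-saving and rollback on top of this structure.

```python
# Sketch of self-initiating logical processes driven by an event queue.
import heapq

class LP:
    def __init__(self, name, period):
        self.name, self.period = name, period
        self.state = 0
        self.inbox = {}           # last known states of the other LPs

    def reevaluate(self, now):
        self.state += 1           # hypothetical state update
        return now + self.period  # schedules its own next event, independently

def simulate(lps, end_time):
    events = [(0.0, i) for i in range(len(lps))]
    heapq.heapify(events)
    while events and events[0][0] < end_time:
        t, i = heapq.heappop(events)
        nxt = lps[i].reevaluate(t)
        for j, other in enumerate(lps):   # communicate the new state
            if j != i:
                other.inbox[lps[i].name] = lps[i].state
        heapq.heappush(events, (nxt, i))
    return [lp.state for lp in lps]

print(simulate([LP("a", 1.0), LP("b", 2.5)], 10.0))  # [10, 4]
```

The synchronization question the report studies is precisely what replaces this single global event queue when the LPs run on separate processors and the broadcast messages arrive with delay.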

  13. System and method for programmable bank selection for banked memory subsystems

    DOEpatents

    Blumrich, Matthias A.; Chen, Dong; Gara, Alan G.; Giampapa, Mark E.; Hoenicke, Dirk; Ohmacht, Martin; Salapura, Valentina; Sugavanam, Krishnan

    2010-09-07

    A programmable memory system and method for enabling one or more processor devices access to shared memory in a computing environment, the shared memory including one or more memory storage structures having addressable locations for storing data. The system comprises: one or more first logic devices associated with a respective one or more processor devices, each first logic device for receiving physical memory address signals and programmable for generating a respective memory storage structure select signal upon receipt of pre-determined address bit values at selected physical memory address bit locations; and, a second logic device responsive to each respective select signal for generating an address signal used for selecting a memory storage structure for processor access. The system thus enables each processor device of a computing environment to access memory storage distributed across the one or more memory storage structures.
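
The decode performed by the first logic devices, turning pre-determined address bit values at selected physical address bit locations into a bank-select signal, can be sketched with mask/match pairs. The bit positions and bank count below are illustrative assumptions, not values taken from the patent.

```python
# Sketch of programmable bank selection: each (mask, match) pair is a
# programmed decode rule; the address bits under the mask pick the bank.

def make_selector(banks):
    """banks: list of (mask, match, bank_id) rules; the first hit wins."""
    def select(addr):
        for mask, match, bank_id in banks:
            if addr & mask == match:
                return bank_id
        raise ValueError("address maps to no bank")
    return select

# Program the decode so that physical address bit 12 chooses the bank.
select = make_selector([
    (1 << 12, 0,       0),   # bit 12 clear -> bank 0
    (1 << 12, 1 << 12, 1),   # bit 12 set   -> bank 1
])

print(select(0x0FFF), select(0x1000))  # 0 1
```

Reprogramming the mask/match table corresponds to changing which address bit locations the first logic devices examine, without touching the processors or the memory structures themselves.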

  14. Ascertaining and Graphically Representing the Logical Structure of Japanese Essays

    NASA Astrophysics Data System (ADS)

    Ishioka, Tsunenori

    To more accurately assess the logical structure of Japanese essays, I have devised a technique that uses end-of-sentence modality and demonstrative pronouns referencing earlier paragraphs as new indicators of structure, in addition to the conjunctive expressions that have hitherto often been used for Japanese as well as for European languages. It is hoped that this will yield better results because conjunctive expressions are intentionally avoided in Japanese. I applied this technique to the editorial and commentary (Yoroku) columns of the Mainichi Shimbun newspaper and used it to represent the structure and development of the arguments made by these articles in the form of constellation diagrams, which are used in the field of statistics. As a result, I found that this graph is useful in that it enables the overall distribution to be ascertained and allows the temporal changes in the logical structure of the data in question to be traced.

  15. Semantics of Inheritance and Attributions in the Description System Omega. Revision.

    DTIC Science & Technology

    1982-01-01

    criticized either for being too general or too inflexible. General deductive procedures for predicate logic easily go out of control. There have been a...6 and 7. Section 8 is concerned with a discussion of attributions: the language is extended by introducing syntactic notations for four different...for valid formulas of the theory. The models provided earlier are useful in this stage for providing a guideline and a criterion for suitability of

  16. A Model for Predicting Integrated Man-Machine System Reliability: Model Logic and Description

    DTIC Science & Technology

    1974-11-01

    3. Fatigue buildup curve. The common requirement of all tests on the Dynamic Strength factor is for the muscles involved to propel, support, or...move the body repeatedly or to support it continuously over time. The tests of our Static Strength factor emphasize the lifting power of the muscles ...or the pounds of pressure which the muscles can exert. ... In contrast to Dynamic Strength the force exerted is against external objects, rather

  17. Distributed morality in an information society.

    PubMed

    Floridi, Luciano

    2013-09-01

    The phenomenon of distributed knowledge is well-known in epistemic logic. In this paper, a similar phenomenon in ethics, somewhat neglected so far, is investigated, namely distributed morality. The article explains the nature of distributed morality, as a feature of moral agency, and explores the implications of its occurrence in advanced information societies. In the course of the analysis, the concept of infraethics is introduced, in order to refer to the ensemble of moral enablers, which, although morally neutral per se, can significantly facilitate or hinder both positive and negative moral behaviours.

  18. Agility assessment using fuzzy logic approach: a case of healthcare dispensary.

    PubMed

    Suresh, M; Patri, Rojalin

    2017-06-09

    Agile concepts are beneficial not only for the manufacturing sector but also for service sectors such as healthcare. However, assessment of agility has predominantly been done in manufacturing enterprises. This study demonstrates a means to measure the agility of a healthcare organization by assessing the agility of a university dispensary. Its contribution to the knowledge base is twofold. First, it proposes a means to measure the agility of a healthcare organization and, second, it identifies the attributes that prevent agile performance and outlines suggestive measures to enhance its agile capabilities. A case study approach has been adopted and fuzzy logic has been employed to measure the agility of the case dispensary. First, the measures of assessment, which include four enablers, fifteen criteria and forty-five attributes, were identified from the literature and rated by experts, indicating the importance of the measures in the assessment. Then, the case dispensary was assessed on those measures by collecting observed performance ratings from decision makers. Finally, fuzzy logic was applied to the performance rating data to analyze and interpret the agile capability of the dispensary. The findings suggest that transparent information flow, adequate salary and bonuses for caregivers, reading errors in medical descriptions, in-house/nearby pathology laboratory services, technical upgrading of dispensary equipment and facilities, minimization of patient throughput time and an adequate training programme for safety practices are the attributes that weaken the agile capability of the university dispensary. The current agility of the dispensary was found to be 'Agile', which is average in relation to the agility labels.
Attributes such as transparent information flow, adequate salary and bonuses for caregivers, elimination of reading errors in medical descriptions, in-house/nearby pathology laboratory services, technical upgrading of dispensary equipment and facilities, minimization of patient throughput time and an adequate training programme for safety practices are extremely crucial for enhancing the agile capability of a healthcare organization.

  19. Scheme for Quantum Computing Immune to Decoherence

    NASA Technical Reports Server (NTRS)

    Williams, Colin; Vatan, Farrokh

    2008-01-01

    A constructive scheme has been devised to enable mapping of any quantum computation into a spintronic circuit in which the computation is encoded in a basis that is, in principle, immune to quantum decoherence. The scheme is implemented by an algorithm that utilizes multiple physical spins to encode each logical bit in such a way that collective errors affecting all the physical spins do not disturb the logical bit. The scheme is expected to be of use to experimenters working on spintronic implementations of quantum logic. Spintronic computing devices use quantum-mechanical spins (typically, electron spins) to encode logical bits. Bits thus encoded (denoted qubits) are potentially susceptible to errors caused by noise and decoherence. The traditional model of quantum computation is based partly on the assumption that each qubit is implemented by use of a single two-state quantum system, such as an electron or other spin-1/2 particle. It can be surprisingly difficult to achieve certain gate operations, most notably those of arbitrary 1-qubit gates, in spintronic hardware according to this model. However, ironically, certain 2-qubit interactions (in particular, spin-spin exchange interactions) can be achieved relatively easily in spintronic hardware. Therefore, it would be fortunate if it were possible to implement any 1-qubit gate by use of a spin-spin exchange interaction. While such a direct representation is not possible, it is possible to achieve an arbitrary 1-qubit gate indirectly by means of a sequence of four spin-spin exchange interactions, which could be implemented by use of four exchange gates. Accordingly, the present scheme provides for mapping any 1-qubit gate in the logical basis into an equivalent sequence of at most four spin-spin exchange interactions in the physical (encoded) basis.
The complexity of the mathematical derivation of the scheme from basic quantum principles precludes a description within this article; it must suffice to report that the derivation provides explicit constructions for finding the exchange couplings in the physical basis needed to implement any arbitrary 1-qubit gate. These constructions lead to spintronic encodings of quantum logic that are more efficient than those of a previously published scheme that utilizes a universal but fixed set of gates.

  20. Construction and Operation of Three-Dimensional Memory and Logic Molecular Devices and Circuits

    DTIC Science & Technology

    2013-07-01

    higher currents and less leakage. We also constructed a ferrocene-based self-assembled monolayer attached to gold nanoparticles, exhibiting a...charging transistor utilizing a ferrocene-based SAM attached to a gold nanoparticle. Our experiments are, to our knowledge, the first to exhibit an...The molecular layer includes a ferrocene SAM attached to Au. Distribution A: Approved for public release; distribution is unlimited

  1. Automated identification of protein-ligand interaction features using Inductive Logic Programming: a hexose binding case study.

    PubMed

    A Santos, Jose C; Nassif, Houssam; Page, David; Muggleton, Stephen H; E Sternberg, Michael J

    2012-07-11

    There is a need for automated methods to learn general features of the interactions of a ligand class with its diverse set of protein receptors. An appropriate machine learning approach is Inductive Logic Programming (ILP), which automatically generates comprehensible rules in addition to prediction. The development of ILP systems which can learn rules of the complexity required for studies on protein structure remains a challenge. In this work we use a new ILP system, ProGolem, and demonstrate its performance on learning features of hexose-protein interactions. The rules induced by ProGolem detect interactions mediated by aromatics and by planar-polar residues, in addition to less common features such as the aromatic sandwich. The rules also reveal a previously unreported dependency for residues cys and leu. They also specify interactions involving aromatic and hydrogen bonding residues. This paper shows that Inductive Logic Programming implemented in ProGolem can derive rules giving structural features of protein/ligand interactions. Several of these rules are consistent with descriptions in the literature. In addition to confirming literature results, ProGolem's model has a 10-fold cross-validated predictive accuracy that is superior, at the 95% confidence level, to another ILP system previously used to study protein/hexose interactions and is comparable with state-of-the-art statistical learners.

  2. Models in biology: ‘accurate descriptions of our pathetic thinking’

    PubMed Central

    2014-01-01

    In this essay I will sketch some ideas for how to think about models in biology. I will begin by trying to dispel the myth that quantitative modeling is somehow foreign to biology. I will then point out the distinction between forward and reverse modeling and focus thereafter on the former. Instead of going into mathematical technicalities about different varieties of models, I will focus on their logical structure, in terms of assumptions and conclusions. A model is a logical machine for deducing the latter from the former. If the model is correct, then, if you believe its assumptions, you must, as a matter of logic, also believe its conclusions. This leads to consideration of the assumptions underlying models. If these are based on fundamental physical laws, then it may be reasonable to treat the model as ‘predictive’, in the sense that it is not subject to falsification and we can rely on its conclusions. However, at the molecular level, models are more often derived from phenomenology and guesswork. In this case, the model is a test of its assumptions and must be falsifiable. I will discuss three models from this perspective, each of which yields biological insights, and this will lead to some guidelines for prospective model builders. PMID:24886484

  3. Novel Designs of Quantum Reversible Counters

    NASA Astrophysics Data System (ADS)

    Qi, Xuemei; Zhu, Haihong; Chen, Fulong; Zhu, Junru; Zhang, Ziyang

    2016-11-01

    Reversible logic, as an interesting and important issue, has been widely used in designing combinational and sequential circuits for low-power and high-speed computation. Though a significant amount of work has been done on reversible combinational logic, the realization of reversible sequential circuits is still at a premature stage. A reversible counter is not only an important part of sequential circuits but also an essential component of quantum circuit systems. In this paper, we designed two kinds of novel reversible counters. In order to construct the counters, the innovative reversible T Flip-flop Gate (TFG), T flip-flop block (T_FF) and JK flip-flop block (JK_FF) are proposed. Based on these blocks and some existing reversible gates, a 4-bit binary-coded decimal (BCD) counter and a controlled Up/Down synchronous counter are designed. These counters have been modeled and verified with the Verilog hardware description language (Verilog HDL). According to the simulation results, our circuits' logic structures are validated. Compared to existing designs in terms of quantum cost (QC), delay (DL) and garbage outputs (GBO), our designs perform better than the others. There is no doubt that they can be used as a kind of important storage component in future low-power computing systems.
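A behavioral sketch (in Python rather than Verilog, and not the paper's TFG/JK_FF construction) of why a counter step can be made reversible: incrementing an n-bit register modulo 2^n is a bijection on the state space, and it decomposes into NOT/CNOT/Toffoli-style controlled flips — exactly the gate family reversible logic allows.

```python
from itertools import product

def reversible_increment(bits):
    """Increment an n-bit register (LSB first) using only multi-controlled
    NOT gates: bit k toggles iff all lower bits are 1, applied from the
    most significant bit down so no carry information is destroyed."""
    bits = list(bits)
    n = len(bits)
    for k in range(n - 1, 0, -1):
        if all(bits[:k]):          # multi-controlled NOT (Toffoli-like)
            bits[k] ^= 1
    bits[0] ^= 1                   # plain NOT on the LSB
    return tuple(bits)

# Sanity check: the step is a permutation of all 4-bit states, hence
# it is reversible (information-preserving).
states = list(product([0, 1], repeat=4))
assert len({reversible_increment(s) for s in states}) == len(states)
```

Running the step on 3 = (1,1,0,0) yields 4 = (0,0,1,0), and 15 wraps around to 0.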

  4. The Influence of Assortativity on the Robustness of Signal-Integration Logic in Gene Regulatory Networks

    PubMed Central

    Pechenick, Dov A.; Payne, Joshua L.; Moore, Jason H.

    2011-01-01

    Gene regulatory networks (GRNs) drive the cellular processes that sustain life. To do so reliably, GRNs must be robust to perturbations, such as gene deletion and the addition or removal of regulatory interactions. GRNs must also be robust to genetic changes in regulatory regions that define the logic of signal-integration, as these changes can affect how specific combinations of regulatory signals are mapped to particular gene expression states. Previous theoretical analyses have demonstrated that the robustness of a GRN is influenced by its underlying topological properties, such as degree distribution and modularity. Another important topological property is assortativity, which measures the propensity with which nodes of similar connectivity are connected to one another. How assortativity influences the robustness of the signal-integration logic of GRNs remains an open question. Here, we use computational models of GRNs to investigate this relationship. We separately consider each of the three dynamical regimes of this model for a variety of degree distributions. We find that in the chaotic regime, robustness exhibits a pronounced increase as assortativity becomes more positive, while in the critical and ordered regimes, robustness is generally less sensitive to changes in assortativity. We attribute the increased robustness to a decrease in the duration of the gene expression pattern, which is caused by a reduction in the average size of a GRN’s in-components. This study provides the first direct evidence that assortativity influences the robustness of the signal-integration logic of computational models of GRNs, illuminates a mechanistic explanation for this influence, and furthers our understanding of the relationship between topology and robustness in complex biological systems. PMID:22155134
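The assortativity measure the abstract relies on can be computed directly. The minimal pure-Python sketch below (our illustration; the paper's GRN model is not reproduced) computes Newman's degree assortativity of an undirected graph as the Pearson correlation of the degrees at either end of each edge.

```python
from math import sqrt

def degree_assortativity(edges):
    """Pearson correlation of endpoint degrees over all edges
    (Newman's assortativity coefficient, undirected graph)."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # each undirected edge contributes both (du, dv) and (dv, du)
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)
```

A star graph, where a hub connects only to degree-1 leaves, is maximally disassortative: `degree_assortativity([(0, 1), (0, 2), (0, 3)])` returns -1.0.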

  5. 76 FR 20179 - Endangered and Threatened Species: Designation of Critical Habitat for Cook Inlet Beluga Whale

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ..., these descriptions are general in nature and, we believe, far less descriptive than those presented in... distribution inlets is more descriptive of the actual distribution of these whales and the essential feature... anadromous fish utilizing these waters would not change the list, but could only add another descriptive...

  6. Fuzzy Sets in Dynamic Adaptation of Parameters of a Bee Colony Optimization for Controlling the Trajectory of an Autonomous Mobile Robot

    PubMed Central

    Amador-Angulo, Leticia; Mendoza, Olivia; Castro, Juan R.; Rodríguez-Díaz, Antonio; Melin, Patricia; Castillo, Oscar

    2016-01-01

    A hybrid approach composed of different types of fuzzy systems, such as the Type-1 Fuzzy Logic System (T1FLS), Interval Type-2 Fuzzy Logic System (IT2FLS) and Generalized Type-2 Fuzzy Logic System (GT2FLS), for the dynamic adaptation of the alpha and beta parameters of a Bee Colony Optimization (BCO) algorithm is presented. The objective of the work is to focus on the BCO technique to find the optimal distribution of the membership functions in the design of fuzzy controllers. We use BCO specifically for tuning the membership functions of the fuzzy controller for trajectory stability in an autonomous mobile robot. We add two types of perturbations in the model for the Generalized Type-2 Fuzzy Logic System to better analyze its behavior under uncertainty, and this shows better results when compared to the original BCO. We implemented various performance indices: ITAE, IAE, ISE, ITSE, RMSE and MSE to measure the performance of the controller. The experimental results show better performance using GT2FLS than IT2FLS and T1FLS in the dynamic adaptation of the parameters of the BCO algorithm. PMID:27618062
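The performance indices listed in the abstract have standard definitions. A minimal sketch (assuming a uniformly sampled error signal and a simple rectangle-rule integration; not taken from the paper's implementation):

```python
def performance_indices(t, error):
    """ITAE, IAE, ISE, ITSE, MSE and RMSE for a sampled control error
    signal; integrals approximated with the rectangle rule (uniform dt)."""
    dt = t[1] - t[0]
    iae  = sum(abs(e) for e in error) * dt                     # ∫|e|dt
    ise  = sum(e * e for e in error) * dt                      # ∫e²dt
    itae = sum(ti * abs(e) for ti, e in zip(t, error)) * dt    # ∫t|e|dt
    itse = sum(ti * e * e for ti, e in zip(t, error)) * dt     # ∫te²dt
    mse  = sum(e * e for e in error) / len(error)
    rmse = mse ** 0.5
    return {"ITAE": itae, "IAE": iae, "ISE": ise,
            "ITSE": itse, "MSE": mse, "RMSE": rmse}
```

Time-weighted indices (ITAE, ITSE) penalize errors that persist late in the trajectory, which is why they are popular for tuning controllers toward fast settling.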

  7. A Hierarchy of Compatibility and Comeasurability Levels in Quantum Logics with Unique Conditional Probabilities

    NASA Astrophysics Data System (ADS)

    Niestegge, Gerd

    2010-12-01

    In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lüders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases.
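One of the comeasurability levels — order independence of the final state after two successive measurements — can be illustrated in the ordinary Hilbert space formalism, where the abstract notes all levels coincide. The sketch below (our illustration, not the paper's abstract quantum-logic setting) applies the Lüders update rho → E rho E / tr(rho E) for two commuting projections in either order.

```python
import numpy as np

def luders_update(rho, E):
    """Lüders-von Neumann state transition: condition rho on event E."""
    post = E @ rho @ E
    return post / np.trace(post).real

# Two commuting projections on a 4-dimensional Hilbert space.
E = np.diag([1, 1, 0, 0]).astype(float)
F = np.diag([1, 0, 1, 0]).astype(float)

# A random density matrix rho = A A† / tr(A A†).
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = A @ A.conj().T
rho /= np.trace(rho).real

ef = luders_update(luders_update(rho, E), F)   # measure E, then F
fe = luders_update(luders_update(rho, F), E)   # measure F, then E
print(np.allclose(ef, fe))                     # True: order-independent
```

For non-commuting projections the two orders generally yield different final states, which is what separates the levels in the general quantum-logic case.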

  8. Chip-set for quality of service support in passive optical networks

    NASA Astrophysics Data System (ADS)

    Ringoot, Edwin; Hoebeke, Rudy; Slabbinck, B. Hans; Verhaert, Michel

    1998-10-01

    In this paper the design of a chip-set for QoS provisioning in ATM-based Passive Optical Networks is discussed. The implementation of a general-purpose switch chip on the Optical Network Unit is presented, with focus on the design of the cell scheduling and buffer management logic. The cell scheduling logic supports `colored' grants, priority jumping and weighted round-robin scheduling. The switch chip offers powerful buffer management capabilities enabling the efficient support of GFR and UBR services. Multicast forwarding is also supported. In addition, the architecture of a MAC controller chip developed for a SuperPON access network is introduced. In particular, the permit scheduling logic and its implementation on the Optical Line Termination will be discussed. The chip-set enables the efficient support of services with different service requirements on the SuperPON. The permit scheduling logic built into the MAC controller chip in combination with the cell scheduling and buffer management capabilities of the switch chip can be used by network operators to offer guaranteed service performance to delay sensitive services, and to efficiently and fairly distribute any spare capacity to delay insensitive services.

  9. Cognitive and attitudinal predictors related to graphing achievement among pre-service elementary teachers

    NASA Astrophysics Data System (ADS)

    Szyjka, Sebastian P.

    The purpose of this study was to determine the extent to which six cognitive and attitudinal variables predicted pre-service elementary teachers' performance on line graphing. Predictors included Illinois teacher education basic skills sub-component scores in reading comprehension and mathematics, logical thinking performance scores, as well as measures of attitudes toward science, mathematics and graphing. This study also determined the strength of the relationship between each prospective predictor variable and the line graphing performance variable, as well as the extent to which measures of attitude towards science, mathematics and graphing mediated relationships between scores on mathematics, reading, logical thinking and line graphing. Ninety-four pre-service elementary education teachers enrolled in two different elementary science methods courses during the spring 2009 semester at Southern Illinois University Carbondale participated in this study. Each subject completed five different instruments designed to assess science, mathematics and graphing attitudes as well as logical thinking and graphing ability. Sixty subjects provided copies of primary basic skills score reports that listed subset scores for both reading comprehension and mathematics. The remaining scores were supplied by a faculty member who had access to a database from which the scores were drawn. Seven subjects, whose scores could not be found, were eliminated from final data analysis. Confirmatory factor analysis (CFA) was conducted in order to establish validity and reliability of the Questionnaire of Attitude Toward Line Graphs in Science (QALGS) instrument. CFA tested the statistical hypothesis that the five main factor structures within the Questionnaire of Attitude Toward Statistical Graphs (QASG) would be maintained in the revised QALGS. Stepwise Regression Analysis with backward elimination was conducted in order to generate a parsimonious and precise predictive model. 
This procedure allowed the researcher to explore the relationships among the affective and cognitive variables that were included in the regression analysis. The results for CFA indicated that the revised QALGS measure was sound in its psychometric properties when tested against the QASG. Reliability statistics indicated that the overall reliability for the 32 items in the QALGS was .90. The learning preferences construct had the lowest reliability (.67), while enjoyment (.89), confidence (.86) and usefulness (.77) constructs had moderate to high reliabilities. The first four measurement models fit the data well as indicated by the appropriate descriptive and statistical indices. However, the fifth measurement model did not fit the data well statistically, and only fit well with two descriptive indices. The results addressing the research question indicated that mathematical and logical thinking ability were significant predictors of line graph performance among the remaining group of variables. These predictors accounted for 41% of the total variability on the line graph performance variable. Partial correlation coefficients indicated that mathematics ability accounted for 20.5% of the variance on the line graphing performance variable when removing the effect of logical thinking. The logical thinking variable accounted for 4.7% of the variance on the line graphing performance variable when removing the effect of mathematics ability.

  10. Architecture for VLSI design of Reed-Solomon encoders

    NASA Technical Reports Server (NTRS)

    Liu, K. Y.

    1982-01-01

    A description is given of the logic structure of the universal VLSI symbol-slice Reed-Solomon (RS) encoder chip, from a group of which an RS encoder may be constructed through cascading and proper interconnection. As a design example, it is shown that an RS encoder presently requiring approximately 40 discrete CMOS ICs may be replaced by an RS encoder consisting of four identical, interconnected VLSI RS encoder chips, offering in addition to greater compactness both a lower power requirement and greater reliability.
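The polynomial long division that such an encoder chip realizes with symbol-slice shift registers can be sketched in software. The field, primitive polynomial (0x11D for GF(2^8)) and parity count below are common illustrative choices, not parameters from the paper:

```python
def gf_mul(a, b, poly=0x11D):
    """Multiply in GF(2^8) modulo a primitive polynomial (0x11D here)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly
        b >>= 1
    return r

def rs_generator(nparity):
    """g(x) = (x + a^0)(x + a^1)...(x + a^(nparity-1)), with a = 2.
    Coefficients are stored highest degree first."""
    g = [1]
    root = 1
    for _ in range(nparity):
        new = [0] * (len(g) + 1)
        for j, c in enumerate(g):
            new[j] ^= c                     # c * x
            new[j + 1] ^= gf_mul(c, root)   # c * root
        g = new
        root = gf_mul(root, 2)
    return g

def rs_encode(msg, nparity):
    """Systematic encoding: append the remainder of msg(x)*x^t mod g(x)."""
    g = rs_generator(nparity)
    rem = list(msg) + [0] * nparity
    for i in range(len(msg)):
        coef = rem[i]
        if coef:                            # g is monic, so this zeroes rem[i]
            for j, gc in enumerate(g):
                rem[i + j] ^= gf_mul(gc, coef)
    return list(msg) + rem[len(msg):]

def poly_eval(p, x):
    """Horner evaluation in GF(2^8), highest-degree coefficient first."""
    y = 0
    for c in p:
        y = gf_mul(y, x) ^ c
    return y

# A valid codeword evaluates to zero at every root of g(x).
cw = rs_encode([0x12, 0x34, 0x56], 4)
root = 1
for _ in range(4):
    assert poly_eval(cw, root) == 0
    root = gf_mul(root, 2)
```

In the VLSI architecture this division loop becomes a feedback shift register, and cascading identical symbol-slice chips extends the symbol width.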

  11. Architecture for VLSI design of Reed-Solomon encoders

    NASA Astrophysics Data System (ADS)

    Liu, K. Y.

    1982-02-01

    A description is given of the logic structure of the universal VLSI symbol-slice Reed-Solomon (RS) encoder chip, from a group of which an RS encoder may be constructed through cascading and proper interconnection. As a design example, it is shown that an RS encoder presently requiring approximately 40 discrete CMOS ICs may be replaced by an RS encoder consisting of four identical, interconnected VLSI RS encoder chips, offering in addition to greater compactness both a lower power requirement and greater reliability.

  12. A knowledge representation view on biomedical structure and function.

    PubMed Central

    Schulz, Stefan; Hahn, Udo

    2002-01-01

    In biomedical ontologies, structural and functional considerations are of outstanding importance, and concepts which belong to these two categories are highly interdependent. At the representational level both axes must be clearly kept separate in order to support disciplined ontology engineering. Furthermore, the biaxial organization of physical structure (both by a taxonomic and partonomic order) entails intricate patterns of inference. We here propose a layered encoding of taxonomic, partonomic and functional aspects of biomedical concepts using description logics. PMID:12463912

  13. The MK VI - A second generation attitude control system

    NASA Astrophysics Data System (ADS)

    Meredith, P. J.

    1986-10-01

    The MK VI, a new multipurpose attitude control system for the exoatmospheric attitude control of sounding rocket payloads, is described. The system employs reprogrammable microcomputer memory for storage of basic control logic and for specific mission event control data. The paper includes descriptions of MK VI specifications and configuration; sensor characteristics; the electronic, analog, and digital sections; the pneumatic system; ground equipment; the system operation; and software. A review of the MK VI performance for the Comet Halley flight is presented. Block diagrams are included.

  14. Logic Design of a Shared Disk System in a Multi-Micro Computer Environment.

    DTIC Science & Technology

    1983-06-01

    overall system, is given. An exhaustive description of each device can be found in the cited references. A. INTEL 8086 The INTEL 8086 is a high...either could be accomplished, it was necessary to understand both the existing system architecture and software. The last chapter addressed that...to be adapted: the loader program and the boot ROM program. The loader program is a simplified version of CP/M-86 and contains only enough file

  15. Linear-time general decoding algorithm for the surface code

    NASA Astrophysics Data System (ADS)

    Darmawan, Andrew S.; Poulin, David

    2018-05-01

    A quantum error correcting protocol can be substantially improved by taking into account features of the physical noise process. We present an efficient decoder for the surface code which can account for general noise features, including coherences and correlations. We demonstrate that the decoder significantly outperforms the conventional matching algorithm on a variety of noise models, including non-Pauli noise and spatially correlated noise. The algorithm is based on an approximate calculation of the logical channel using a tensor-network description of the noisy state.

  16. Mathematical Description Development of Reactions of Metallic Gallium Using Kinetic Block Diagram

    NASA Astrophysics Data System (ADS)

    Yakovleva, A. A.; Soboleva, V. G.; Filatova, E. G.

    2018-05-01

    A kinetic block diagram based on a logical sequence of actions in the mathematical processing of kinetic data is used. The type of the reactions of metallic gallium in hydrochloric acid solutions is determined. It has been established that the reactions forming gallium oxide and its salts proceed independently and in the absence of diffusion resistance. Kinetic models connecting the reaction-rate constants with the activation energy and describing the evolution of the process are obtained.
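The abstract does not give its model equations; kinetic models of this kind typically link a rate constant to an activation energy through the Arrhenius relation k = A·exp(-Ea/RT). A minimal sketch with illustrative values (A and Ea below are assumptions, not the paper's fitted parameters):

```python
from math import exp, log

R = 8.314  # J/(mol*K), universal gas constant

def rate_constant(A, Ea, T):
    """Arrhenius form k = A * exp(-Ea / (R*T))."""
    return A * exp(-Ea / (R * T))

def activation_energy(k1, T1, k2, T2):
    """Recover Ea from rate constants measured at two temperatures:
    ln(k2/k1) = (Ea/R) * (1/T1 - 1/T2)."""
    return R * log(k2 / k1) / (1 / T1 - 1 / T2)
```

For example, with A = 1e10 s^-1 and Ea = 50 kJ/mol, evaluating `rate_constant` at 300 K and 350 K and feeding the results back through `activation_energy` recovers Ea.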

  17. The Scientific Prototype - a proposed next step for the American MFE effort

    NASA Astrophysics Data System (ADS)

    Manheimer, Wallace

    2013-10-01

    The Scientific Prototype is the only logical next step for the American magnetic fusion effort. This poster is divided into two parts. The first is a description of the Scientific Prototype, a tokamak about the size of TFTR, JET and JT-60, but one which runs steady state in DT and breeds its own tritium. The second is an examination of other proposed approaches for American MFE and why none constitutes a viable alternative. W. Manheimer, J. Fusion Energy, 32, 419-421, 2013.

  18. Methodological challenges in qualitative content analysis: A discussion paper.

    PubMed

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper is aimed to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic in how categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. A meta-analysis of research on science teacher education practices associated with inquiry strategy

    NASA Astrophysics Data System (ADS)

    Sweitzer, Gary L.; Anderson, Ronald D.

    A meta-analysis was conducted of studies of teacher education having as measured outcomes one or more variables associated with inquiry teaching. Inquiry addresses those teacher behaviors that facilitate student acquisition of concepts and processes through strategies such as problem solving, uses of evidence, logical and analytical reasoning, clarification of values, and decision making. Studies which contained sufficient data for the calculation of an effect size were coded for 114 variables. These variables were divided into the following six major categories: study information and design characteristics, teacher and teacher trainee characteristics, student characteristics, treatment description, outcome description, and effect size calculation. A total of 68 studies resulting in 177 effect size calculations were coded. Mean effect sizes broken across selected variables were calculated.
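The effect sizes aggregated in a meta-analysis of this kind are typically standardized mean differences. A minimal sketch of the usual pooled-standard-deviation form (Cohen's d; the study's exact calculation procedure is not specified in the abstract):

```python
from math import sqrt

def cohens_d(treatment, control):
    """Standardized mean difference with pooled standard deviation:
    d = (mean1 - mean2) / s_pooled."""
    n1, n2 = len(treatment), len(control)
    m1 = sum(treatment) / n1
    m2 = sum(control) / n2
    v1 = sum((x - m1) ** 2 for x in treatment) / (n1 - 1)  # sample variances
    v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
    pooled = sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled
```

For instance, `cohens_d([2, 4, 6], [1, 3, 5])` gives 0.5: the treatment mean sits half a pooled standard deviation above the control mean, the scale on which the 177 effect sizes here would be compared.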

  20. PC-402 Pioneer Venus orbiter spacecraft mission operational characteristics document

    NASA Technical Reports Server (NTRS)

    Barker, F. C.; Butterworth, L. W.; Daniel, R. E.; Drean, R. J.; Filetti, K. A.; Fisher, J. N.; Nowak, L. A.; Porzucki, J.; Salvatore, J. O.; Tadler, G. A.

    1978-01-01

    The operational characteristics of the Orbiter spacecraft and its subsystems are described in extensive detail. Descriptions of the nominal phases, system interfaces, and the capabilities and limitations of system-level performance are included, along with functional and operational descriptions at the subsystem and unit level. The subtleties of nominal operation, as well as detailed capabilities and limitations beyond nominal performance, are discussed. A command and telemetry logic flow diagram for each subsystem is included; each diagram traces the logic encountered along each command signal path into, and each telemetry signal path out of, the subsystem. Normal operating modes that correspond to the performance of specific functions at the times of specific events in the mission are also discussed. Principal backup means of performing the normal Orbiter operating modes are included.

  1. A model of evaluation planning, implementation and management: Toward a ‘culture of information’ within organizations

    NASA Astrophysics Data System (ADS)

    Bhola, H. S.

    1992-03-01

    The argument underlying the ongoing "paradigm shift" from logical positivism to constructionism is briefly laid out. A model of evaluation planning, implementation and management (called the P-I-M Model, for short) is then presented that assumes a complementarity between the two paradigms. The model further implies that for effective decision-making within human organizations, both "evaluative data" and "descriptive data" are needed. "Evaluative data" generated by evaluation studies must, therefore, be undergirded by an appropriate management information system (MIS) that can generate "descriptive data", concurrently with the process of program implementation. The P-I-M Model, if fully actualized, will enable human organizations to become vibrant "cultures of information" where "informed" decision-making becomes a shared norm among all stakeholders.

  2. An evaluation of medical knowledge contained in Wikipedia and its use in the LOINC database.

    PubMed

    Friedlin, Jeff; McDonald, Clement J

    2010-01-01

    The logical observation identifiers names and codes (LOINC) database contains 55 000 terms, each consisting of more atomic components called parts. LOINC carries more than 18 000 distinct parts. It is necessary to have definitions/descriptions for each of these parts to assist users in mapping local laboratory codes to LOINC. It is believed that much of this information can be obtained from the internet; the first effort was with Wikipedia. This project focused on 1705 laboratory analytes (the first part in the LOINC laboratory name). Of the 1705 parts queried, 1314 matching articles were found in Wikipedia. Of these, 1299 (98.9%) were perfect matches that exactly described the LOINC part, 15 (1.14%) were partial matches (the description in Wikipedia was related to the LOINC part, but did not describe it fully), and 102 (7.76%) were mis-matches. The current releases of RELMA and LOINC include Wikipedia descriptions of LOINC parts obtained as a direct result of this project.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that provides several functionalities: it derives and actuates the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/control in the phase space; it performs both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and it facilitates input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  4. Analyzing the Web Services and UniFrame Paradigms

    DTIC Science & Technology

    2003-04-01

    paradigm from a centralized one to a distributed one. Hence, the target environment is no longer centrally managed, but concerned with collaboration...level (business logic level) and provide a new platform to build software for a distributed environment. UniFrame is a research project that aims to...EAI solutions provide tends to be complex and expensive, despite improving the overall communication. In addition, the EAI interfaces are not reusable

  5. Function Allocation in a Robust Distributed Real-Time Environment

    DTIC Science & Technology

    1991-12-01

    fundamental characteristic of a distributed system is its ability to map individual logical functions of an application program onto many physical nodes...how much of a node’s processor time is scheduled for function processing. IMC is the function-to-function communication required to facilitate...indicator of how much excess processor time a node has. The reconfiguration algorithms use these variables to determine the most appropriate node(s) to

  6. Logic Nanocells Within 3-Terminal Ordered Arrays

    DTIC Science & Technology

    2007-02-28

    DISTRIBUTION STATEMENT A: UNLIMITED (AFRL-SR-AR-TR-07-0494)...sputter-coating a 200 nm Au layer. Molecular grafting: compounds 1, 2 and 3 were synthesized according to literature methods [24-26]. The synthesis of 4...neutral (no counter ions). In order to facilitate molecular conduction, the molecule was designed to be small and contain a continuous π-electron system

  7. Comparison of Communication Architectures and Network Topologies for Distributed Propulsion Controls (Preprint)

    DTIC Science & Technology

    2013-05-01

    logic to perform control function computations and are connected to the full authority digital engine control (FADEC) via a high-speed data...Digital Engine Control (FADEC) via a high speed data communication bus. The short term distributed engine control configurations will be core...concentrator; and high temperature electronics, high speed communication bus between the data concentrator and the control law processor master FADEC

  8. Provably secure time distribution for the electric grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith IV, Amos M; Evans, Philip G; Williams, Brian P

    We demonstrate a quantum time distribution (QTD) method that combines the precision of optical timing techniques with the integrity of quantum key distribution (QKD). Critical infrastructure is dependent on microprocessor- and programmable-logic-based monitoring and control systems. The distribution of timing information across the electric grid is accomplished by GPS signals, which are known to be vulnerable to spoofing. We demonstrate a method for synchronizing remote clocks based on the arrival time of photons in a modified QKD system. This has the advantage that the signal can be verified by examining the quantum states of the photons, similar to QKD.
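A sketch of the classical post-processing step such a scheme needs (our illustration, not the authors' protocol): once detection events are matched between the two sites and the propagation delay is known, the remote clock offset can be estimated from the mean difference of the paired arrival timestamps.

```python
def estimate_offset(t_local, t_remote, propagation_delay):
    """Estimate the remote clock offset from paired photon arrival
    timestamps, after subtracting the known propagation delay.
    Averaging over many pairs suppresses detector timing jitter."""
    diffs = [tr - tl - propagation_delay
             for tl, tr in zip(t_local, t_remote)]
    return sum(diffs) / len(diffs)
```

The quantum ingredient, omitted here, is that the photons also carry QKD-style quantum states whose measurement statistics authenticate the timing channel against spoofing.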

  9. The method of belief scales as a means for dealing with uncertainty in tough regulatory decisions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilch, Martin M.

    Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.
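A toy illustration of the contrast the abstract draws (our example, not the MBS itself): treating pure ignorance about inputs in [0, 1] as a subjective uniform distribution and propagating it through y = x1 + x2 yields a probability band noticeably narrower than the guaranteed bounds that interval propagation preserves.

```python
import random

def interval_sum(lo1, hi1, lo2, hi2):
    """Guaranteed output bounds of y = x1 + x2 under interval ignorance."""
    return lo1 + lo2, hi1 + hi2

def monte_carlo_quantiles(n=100_000, seed=0):
    """Propagate subjective uniform distributions instead and report
    the 1%..99% band of y."""
    rng = random.Random(seed)
    ys = sorted(rng.random() + rng.random() for _ in range(n))
    return ys[int(0.01 * n)], ys[int(0.99 * n)]

lo, hi = interval_sum(0, 1, 0, 1)     # (0, 2): the full possibility range
q1, q99 = monte_carlo_quantiles()     # noticeably narrower band
print(lo, hi, round(q1, 2), round(q99, 2))
```

The probabilistic band excludes outcomes near 0 and 2 that remain entirely possible given what is actually known — the sense in which subjective-probability propagation "may not be conservative."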

  10. SPORTS PHYSICAL THERAPY CURRICULA IN PHYSICAL THERAPIST PROFESSIONAL DEGREE PROGRAMS.

    PubMed

    Mulligan, Edward P; DeVahl, Julie

    2017-10-01

    The specialty niche of sports physical therapy has grown at a significant rate over the past 40 years. Despite this growth there is little information or direction from the physical therapy education accreditation body or professional association to guide academic programs on the interest or necessity of this type of practice content in physical therapy professional degree programs. The purpose of this survey study is to report on the prevalence, attitudes, barriers, resources, and faculty expertise in providing required or elective sports physical therapy course work. Cross-sectional descriptive survey. A 57-item questionnaire with branching logic was distributed via a web-based electronic data capture tool to survey all Commission on Accreditation for Physical Therapy Education (CAPTE) accredited and candidate schools in the United States. Response data were analyzed to describe typical educational program profiles, faculty demographics, and correlational factors consistent with the presence or absence of specific sports physical therapy curricular content. Thirty-one percent of the schools responded to the survey and the program demographics were consistent with all currently accredited schools with regard to their geography, Carnegie classification, and faculty and student size. Forty-three percent of programs offered a required or elective course distinct to the practice of sports physical therapy. Descriptive information regarding the sequencing, curricular make-up, resources, and assessment of content competence is reported. The odds of providing this content nearly double for programs that have faculty with sports clinical specialist credentials, accredited sports residency curricula, or state practice acts that allow sports venue coverage. This survey provides an initial overview of sports physical therapy educational efforts in professional physical therapy degree programs. The data can be used to spur further discussion on the necessity, structure, and implementation of education content that is inherent to a growing specialty practice in the physical therapy profession. Level 4, cross-sectional descriptive survey design.

  11. The restructuring and future of "Mendelian Inheritance in Man" (MIM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pearson, P.L.; Francomano, C.; Antonarakis, S.

    1994-09-01

    Victor McKusick's catalog "Mendelian Inheritance in Man" represents the most comprehensive compendium of human genetic disease information available today and has appeared as a series in book form for the last 30 years. The 11th edition will contain almost 7000 entries: approximately 2800 descriptions of human genetic disorders, 700 combined disorder/gene descriptions and 3500 pure gene descriptions. Until recently the content of the catalogs was maintained solely by McKusick with a support staff. However, a distributed editing system has now been established with the following primary components. New entries are initiated in Baltimore by science writers under the guidance of the senior editors and McKusick, following which the information is made immediately available to the public through online access. The subject editors can then review and edit the new or modified information without impeding the timeliness of entering new information. Entries are being restructured so that clinical disorder and gene information is divided into separate entries, which will better represent the frequently complex relationship of gene mutations to individual clinical disorders in the data files. Further, each entry is being subdivided into logical topics, enhancing the power of electronic searching, making links between topics and improving readability. The old division of entries into autosomal dominant and recessive, etc., is being abandoned in favor of clinical disorder (phenotype) and gene catalogs. The information is maintained in an SGML format which facilitates the production of many different types of output, varying from the traditional book form to CD-ROMs and various online formats including IRx, WAIS, Gopher and the World Wide Web. The latter offers the exciting possibility of making hypertext links between entries and other data resources, including photographic, sound and video clips, as part of the total MIM information.

  12. What is a Question?

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    A given question can be defined in terms of the set of statements or assertions that answer it. Application of the logic of inference to this set of assertions allows one to derive the logic of inquiry among questions. There are interesting symmetries between the logics of inference and inquiry; where probability describes the degree to which a premise implies an assertion, there exists an analogous quantity that describes the bearing or relevance that a question has on an outstanding issue. These have been extended to suggest that the logic of inquiry results in functional relationships analogous to, although more general than, those found in information theory. Employing lattice theory, I examine in greater detail the structure of the space of assertions and questions demonstrating that the symmetries between the logical relations in each of the spaces derive directly from the lattice structure. Furthermore, I show that while symmetries between the spaces exist, the two lattices are not isomorphic. The lattice of assertions is described by a Boolean lattice 2^N whereas the lattice of real questions is shown to be a sublattice of the free distributive lattice FD(N) = 2^(2^N). Thus there does not exist a one-to-one mapping of assertions to questions, there is no reflection symmetry between the two spaces, and questions in general do not possess unique complements. Last, with these lattice structures in mind, I discuss the relationship between probability, relevance and entropy.
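    The size mismatch between the two lattices can be checked by brute force for small N: the Boolean lattice has 2^N elements, while the free distributive lattice is counted (with top and bottom adjoined) by the number of monotone Boolean functions of N variables, which grows far faster. A small enumeration sketch:

    ```python
    from itertools import product

    # Illustrative only: compare |2^N| with the number of monotone
    # Boolean functions of N variables (the Dedekind numbers, which
    # count the free distributive lattice with 0 and 1 adjoined).

    def is_monotone(table, n):
        pts = list(product((0, 1), repeat=n))
        return all(table[i] <= table[j]
                   for i, x in enumerate(pts)
                   for j, y in enumerate(pts)
                   if all(a <= b for a, b in zip(x, y)))

    def count_monotone(n):
        return sum(is_monotone(bits, n)
                   for bits in product((0, 1), repeat=2 ** n))

    n = 2
    print(2 ** n, count_monotone(n))   # -> 4 6  (no bijection possible)
    ```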

  13. Knowledge discovery for pancreatic cancer using inductive logic programming.

    PubMed

    Qiu, Yushan; Shimada, Kazuaki; Hiraoka, Nobuyoshi; Maeshiro, Kensei; Ching, Wai-Ki; Aoki-Kinoshita, Kiyoko F; Furuta, Koh

    2014-08-01

    Pancreatic cancer is a devastating disease, and predicting the status of patients becomes an important and urgent issue. The authors explore the applicability of the inductive logic programming (ILP) method to the disease and show that the accumulated clinical laboratory data can be used to predict disease characteristics, which will contribute to the selection of therapeutic modalities for pancreatic cancer. The availability of a large amount of clinical laboratory data provides clues to aid in the knowledge discovery of diseases. In predicting the differentiation of tumour and the status of lymph node metastasis in pancreatic cancer, using the ILP model, three rules are developed that are consistent with descriptions in the literature. The rules that are identified are useful to detect the differentiation of tumour and the status of lymph node metastasis in pancreatic cancer and therefore contribute significantly to the decision of therapeutic strategies. In addition, the proposed method is compared with other typical classification techniques and the results further confirm the superiority and merit of the proposed method.

  14. Robust Fault Detection for Aircraft Using Mixed Structured Singular Value Theory and Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Collins, Emmanuel G.

    2000-01-01

    The purpose of fault detection is to identify when a fault or failure has occurred in a system such as an aircraft or expendable launch vehicle. The faults may occur in sensors, actuators, structural components, etc. One of the primary approaches to model-based fault detection relies on analytical redundancy. That is, the output of a computer-based model (actually a state estimator) is compared with the sensor measurements of the actual system to determine when a fault has occurred. Unfortunately, the state estimator is based on an idealized mathematical description of the underlying plant that is never totally accurate. As a result of these modeling errors, false alarms can occur. This research uses mixed structured singular value theory, a relatively recent and powerful robustness analysis tool, to develop robust estimators and demonstrates the use of these estimators in fault detection. To allow qualitative human experience to be effectively incorporated into the detection process, fuzzy logic is used to predict the seriousness of the fault that has occurred.
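    The analytical-redundancy idea reduces to thresholding a residual between measurement and estimate. A minimal sketch (the values and threshold are hypothetical; the paper's contribution is in making the estimator itself robust so the threshold can be tight):

    ```python
    # Residual-based fault detection: flag a fault when the sensor
    # measurement disagrees with the estimator's prediction by more
    # than a threshold chosen to tolerate modeling error.

    def detect_fault(measured, predicted, threshold):
        residual = abs(measured - predicted)
        return residual > threshold

    print(detect_fault(measured=10.3, predicted=10.0, threshold=0.5))  # -> False
    print(detect_fault(measured=12.0, predicted=10.0, threshold=0.5))  # -> True
    ```

    A looser threshold suppresses false alarms caused by modeling error but delays detection of real faults; robust estimation shrinks the modeling-error term so the trade-off improves.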

  15. Defaults, context, and knowledge: alternatives for OWL-indexed knowledge bases.

    PubMed

    Rector, A

    2004-01-01

    The new Web Ontology Language (OWL) and its Description Logic compatible sublanguage (OWL-DL) explicitly exclude defaults and exceptions, as do all logic-based formalisms for ontologies. However, many biomedical applications appear to require default reasoning, at least if they are to be engineered in a maintainable way. Default reasoning has always been one of the great strengths of Frame systems such as Protégé. Resolving this conflict requires analysis of the different uses for defaults and exceptions. In some cases, alternatives can be provided within the OWL framework; in others, it appears that hybrid reasoning about a knowledge base of contingent facts built around the core ontology is necessary. Trade-offs include both human factors and the scaling of computational performance. The analysis presented here is based on the OpenGALEN experience with large-scale ontologies using a formalism, GRAIL, which explicitly incorporates constructs for hybrid reasoning, numerous experiments with OWL, and initial work on combining OWL and Protégé.

  16. Evaluating Flight Crew Operator Manual Documentation

    NASA Technical Reports Server (NTRS)

    Sherry, Lance; Feary, Michael

    1998-01-01

    Aviation and cognitive science researchers have identified situations in which the pilot's expectations for the behavior of the avionics are not matched by the actual behavior of the avionics. Researchers have attributed these "automation surprises" to the complexity of the avionics mode logic, the absence of complete training, limitations in cockpit displays, and ad-hoc conceptual models of the avionics. Complete canonical rule-based descriptions of the behavior of the autopilot provide the basis for understanding the perceived complexity of the autopilots, the differences between the pilot's and autopilot's conceptual models, and the limitations in training materials and cockpit displays. This paper compares the behavior of the autopilot Vertical Speed/Flight Path Angle (VS-FPA) mode as described in the Flight Crew Operators Manual (FCOM) and the actual behavior of the VS-FPA mode defined in the autopilot software. This example demonstrates the use of the Operational Procedure Model (OPM) as a method for using the requirements specification for the design of the software logic as information requirements for training.

  17. DSGRN: Examining the Dynamics of Families of Logical Models.

    PubMed

    Cummins, Bree; Gedeon, Tomas; Harker, Shaun; Mischaikow, Konstantin

    2018-01-01

    We present a computational tool DSGRN for exploring the dynamics of a network by computing summaries of the dynamics of switching models compatible with the network across all parameters. The network can arise directly from a biological problem, or indirectly as the interaction graph of a Boolean model. This tool computes a finite decomposition of parameter space such that for each region, the state transition graph that describes the coarse dynamical behavior of a network is the same. Each of these parameter regions corresponds to a different logical description of the network dynamics. The comparison of dynamics across parameters with experimental data allows the rejection of parameter regimes or entire networks as viable models for representing the underlying regulatory mechanisms. This in turn allows a search through the space of perturbations of a given network for networks that robustly fit the data. These are the first steps toward discovering a network that optimally matches the observed dynamics by searching through the space of networks.
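    The state transition graph the abstract refers to can be illustrated with a toy Boolean network (this sketch is not DSGRN itself; in DSGRN, each region of parameter space yields one such graph):

    ```python
    from itertools import product

    # Build the state transition graph of an n-node Boolean network
    # under synchronous update: each state maps to its unique successor.

    def stg(update_fns, n):
        graph = {}
        for state in product((0, 1), repeat=n):
            graph[state] = tuple(f(state) for f in update_fns)
        return graph

    # Hypothetical negative-feedback pair: x activated by NOT y, y by x
    fx = lambda s: 1 - s[1]
    fy = lambda s: s[0]
    g = stg((fx, fy), 2)
    print(g[(0, 0)])   # -> (1, 0); the full graph is a 4-cycle (oscillation)
    ```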

  18. Development of the automatic test pattern generation for NPP digital electronic circuits using the degree of freedom concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, D.S.; Seong, P.H.

    1995-08-01

    In this paper, an improved algorithm for automatic test pattern generation (ATG) for nuclear power plant digital electronic circuits--the combinational type of logic circuits--is presented. For accelerating and improving the ATG process for combinational circuits, the presented ATG algorithm uses a new concept: the degree of freedom (DF). The DF, directly computed from system descriptions such as the types of gates and their interconnections, is the criterion for deciding which among several alternate lines' logic values required along each path promises to be the most effective in accelerating and improving the ATG process. Based on the DF, the proposed ATG algorithm is implemented in the automatic fault diagnosis system (AFDS), which incorporates an advanced artificial intelligence fault diagnosis method; it is shown that the AFDS using the ATG algorithm makes Universal Card (UV Card) testing much faster than the present testing practice or exhaustive testing sets.
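    For a sense of what test pattern generation does, here is a hedged brute-force sketch for a tiny combinational circuit (the DF heuristic from the paper is precisely a way to avoid this exhaustive search; the circuit and fault model below are invented for illustration):

    ```python
    from itertools import product

    # Find input vectors that detect a stuck-at fault: vectors for which
    # the good circuit and the faulted circuit produce different outputs.

    def circuit(a, b, c, fault=None):
        # y = (a AND b) OR c, with an optional stuck-at-0 on node n1
        n1 = a & b
        if fault == "n1/0":
            n1 = 0
        return n1 | c

    def find_tests(fault):
        return [v for v in product((0, 1), repeat=3)
                if circuit(*v) != circuit(*v, fault=fault)]

    print(find_tests("n1/0"))   # -> [(1, 1, 0)]: n1 must be 1 and c must
                                #    be 0, or the OR masks the fault
    ```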

  19. RuleGO: a logical rules-based tool for description of gene groups by means of Gene Ontology

    PubMed Central

    Gruca, Aleksandra; Sikora, Marek; Polanski, Andrzej

    2011-01-01

    Genome-wide expression profiles obtained with the use of DNA microarray technology provide an abundance of experimental data on biological and molecular processes. Such amounts of data need to be further analyzed and interpreted in order to draw biological conclusions from experimental results. The analysis requires a lot of experience and is usually a time-consuming process; thus, various annotation databases are frequently used to improve the whole process of analysis. Here, we present RuleGO, a web-based application that allows the user to describe gene groups on the basis of logical rules that include Gene Ontology (GO) terms in their premises. The presented application allows obtaining rules that reflect the co-appearance of GO terms describing the genes supported by the rules. The ontology level and the number of co-appearing GO terms are adjusted automatically; the user only limits the space of possible solutions. The RuleGO application is freely available at http://rulego.polsl.pl/. PMID:21715384

  20. Meeting psychosocial needs for persons with dementia in home care services - a qualitative study of different perceptions and practices among health care providers.

    PubMed

    Hansen, Anette; Hauge, Solveig; Bergland, Ådel

    2017-09-11

    The majority of persons with dementia are home-dwelling. To enable these persons to stay in their own homes as long as possible, holistic, individual and flexible care is recommended. Despite a requirement for meeting psychological, social and physical needs, home care services seem to focus on patients' physical needs. Accordingly, the aim of this study was to explore how the psychosocial needs of home-dwelling, older persons with dementia were perceived, emphasized and met by home care services. A descriptive, qualitative approach was used. Data were collected through semi-structured focus group interviews with 24 health care providers in home care services from four municipalities. Data were analysed using systematic text condensation. This study showed major differences in how health care providers perceived the psychosocial needs of older home-dwelling persons with dementia and how they perceived their responsibilities for meeting those psychosocial needs. The differences in the health care providers' perceptions seemed to significantly influence the provided care. Three co-existing logics of care were identified: the physical need-oriented logic, the renouncement logic and the integrated logic. The differences in how health care providers perceived the psychosocial needs of persons with dementia and their responsibilities for meeting those needs influenced how the psychosocial needs were met. These differences indicate a need for a clarification of how psychosocial needs should be conceptualized and who should be responsible for meeting them. Further, increased competence and increased consciousness of psychosocial needs, and of how those needs can be met, are essential for delivering high-quality holistic care that enables persons with dementia to live in their own home for as long as possible.

  1. Looped back fiber mode for reduction of false alarm in leak detection using distributed optical fiber sensor.

    PubMed

    Chelliah, Pandian; Murgesan, Kasinathan; Samvel, Sosamma; Chelamchala, Babu Rao; Tammana, Jayakumar; Nagarajan, Murali; Raj, Baldev

    2010-07-10

    Optical-fiber-based sensors have inherent advantages, such as immunity to electromagnetic interference, compared to conventional sensors. Distributed optical fiber sensor (DOFS) systems, such as Raman and Brillouin distributed temperature sensors, are used for leak detection. The inherent noise of fiber-based systems leads to occasional false alarms. In this paper, a methodology is proposed to overcome this: a looped-back fiber mode is used in the DOFS, and voting logic is employed to considerably reduce the false alarm rate.
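    The voting idea can be sketched as follows (the details are assumptions, not taken from the paper): in looped-back operation the same fiber section is interrogated from both ends, so a genuine leak appears on both channels while uncorrelated noise usually appears on only one. Requiring agreement multiplies independent per-channel false-alarm probabilities:

    ```python
    # Two-channel voting logic: alarm only when both ends of the looped
    # fiber report a hit. Independent per-channel false-alarm rate p
    # combines to roughly p squared.

    def voted_alarm(forward_hit, backward_hit):
        return forward_hit and backward_hit

    print(voted_alarm(True, True))    # genuine leak seen by both -> True
    print(voted_alarm(True, False))   # single-channel noise event -> False
    ```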

  2. A new approach to telemetry data processing. Ph.D. Thesis - Maryland Univ.

    NASA Technical Reports Server (NTRS)

    Broglio, C. J.

    1973-01-01

    An approach for a preprocessing system for telemetry data processing was developed. The philosophy of the approach is the development of a preprocessing system to interface with the main processor and relieve it of the burden of stripping information from a telemetry data stream. To accomplish this task, a telemetry preprocessing language was developed. Also, a hardware device for implementing the operation of this language was designed using a cellular logic module concept. In the development of the hardware device and the cellular logic module, a distributed form of control was implemented. This is accomplished by a technique of one-to-one intermodule communications and a set of privileged communication operations. By transferring this control state from module to module, the control function is dispersed through the system. A compiler for translating the preprocessing language statements into an operations table for the hardware device was also developed. Finally, to complete the system design and verify it, a simulator for the cellular logic module was written using the APL/360 system.

  3. Accessing files in an Internet: The Jade file system

    NASA Technical Reports Server (NTRS)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
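    A per-user logical name space of the kind described can be sketched as a mount table resolved by longest-prefix match (the mount points, file-system names, and API below are invented for illustration, not Jade's actual interface):

    ```python
    # Toy private name space: logical path prefixes map onto underlying
    # file systems; resolution picks the longest matching prefix, so
    # several file systems can sit under one directory ("/home").

    mounts = {
        "/home/docs": ("nfs",  "server1:/export/docs"),
        "/home/src":  ("afs",  "/afs/cs/project/src"),
        "/home":      ("unix", "/users/alice"),
    }

    def resolve(path):
        best = max((p for p in mounts if path.startswith(p)), key=len)
        fs, root = mounts[best]
        return fs, root + path[len(best):]

    print(resolve("/home/docs/paper.txt"))  # -> ('nfs', 'server1:/export/docs/paper.txt')
    print(resolve("/home/notes.txt"))       # -> ('unix', '/users/alice/notes.txt')
    ```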

  4. Accessing files in an internet - The Jade file system

    NASA Technical Reports Server (NTRS)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  5. Defect tolerance in resistor-logic demultiplexers for nanoelectronics.

    PubMed

    Kuekes, Philip J; Robinett, Warren; Williams, R Stanley

    2006-05-28

    Since defect rates are expected to be high in nanocircuitry, we analyse the performance of resistor-based demultiplexers in the presence of defects. The defects observed to occur in fabricated nanoscale crossbars are stuck-open, stuck-closed, stuck-short, broken-wire, and adjacent-wire-short defects. We analyse the distribution of voltages on the nanowire output lines of a resistor-logic demultiplexer, based on an arbitrary constant-weight code, when defects occur. These analyses show that resistor-logic demultiplexers can tolerate small numbers of stuck-closed, stuck-open, and broken-wire defects on individual nanowires, at the cost of some degradation in the circuit's worst-case voltage margin. For stuck-short and adjacent-wire-short defects, and for nanowires with too many defects of the other types, the demultiplexer can still achieve error-free performance, but with a smaller set of output lines. This design thus has two layers of defect tolerance: the coding layer improves the yield of usable output lines, and an avoidance layer guarantees that error-free performance is achieved.
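    The "coding layer" mentioned above rests on a simple property of constant-weight codes: since all address codewords have the same number of 1s, no codeword's set of asserted wires can contain another's, so every output line remains distinguishable. A small enumeration sketch (parameters illustrative):

    ```python
    from itertools import combinations

    # Enumerate a constant-weight code (all addresses have equal weight)
    # and verify that no codeword is covered by (a proper subset of)
    # another, which is what keeps output lines distinguishable.

    def constant_weight_code(length, weight):
        return [frozenset(c) for c in combinations(range(length), weight)]

    code = constant_weight_code(4, 2)        # 6 codewords on 4 address wires
    distinct = all(not a < b                 # 'a < b' is proper-subset test
                   for a in code for b in code)
    print(len(code), distinct)               # -> 6 True
    ```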

  6. Hardware implementation of fuzzy Petri net as a controller.

    PubMed

    Gniewek, Lesław; Kluska, Jacek

    2004-06-01

    The paper presents a new approach to fuzzy Petri nets (FPN) and their hardware implementation. The authors' motivation is as follows. Complex industrial processes can often be decomposed into many subprocesses working in parallel, which can, in turn, be modeled using Petri nets. If all the process variables (or events) are assumed to be two-valued signals, then it is possible to obtain a hardware or software control device which works according to the algorithm described by a conventional Petri net. However, the values of real signals are contained in some bounded interval and can be interpreted as events which are not only true or false, but rather true to some degree from the interval [0, 1]. Such a natural interpretation from the multivalued logic (fuzzy logic) point of view concerns sensor outputs, control signals, time expiration, etc. It leads to the idea of an FPN as a controller, which one can rather simply obtain, and which would be able to process both analog and binary signals. In the paper both graphical and algebraic representations of the proposed FPN are given. The conditions under which transitions can be fired are described. The algebraic description of the net and a theorem which enables computation of a new marking in the net, based on the current marking, are formulated. A hardware implementation of the FPN, which uses fuzzy JK flip-flops and fuzzy gates, is proposed. An example illustrating the usefulness of the proposed FPN for control algorithm description, and its synthesis as a controller device for the concrete production process, is presented.
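    One firing step of such a net might look like the following toy sketch. The conventions here (min of input markings as firing degree, a fixed enabling threshold, max at the output place) are common fuzzy Petri net choices assumed for illustration, not taken from the paper:

    ```python
    # Toy fuzzy Petri net firing rule: place markings lie in [0, 1];
    # a transition is enabled when every input marking exceeds the
    # threshold, and it passes min() of the inputs to its output place.

    def fire(marking, inputs, output, threshold=0.5):
        degree = min(marking[p] for p in inputs)
        if degree <= threshold:
            return marking                      # transition not enabled
        new = dict(marking)
        for p in inputs:
            new[p] = 0.0                        # fuzzy tokens consumed
        new[output] = max(new[output], degree)  # fuzzy token produced
        return new

    m = {"p1": 0.8, "p2": 0.7, "p3": 0.0}
    print(fire(m, inputs=("p1", "p2"), output="p3"))
    # -> {'p1': 0.0, 'p2': 0.0, 'p3': 0.7}
    ```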

  7. The intrinsic periodic fluctuation of forest: a theoretical model based on diffusion equation

    NASA Astrophysics Data System (ADS)

    Zhou, J.; Lin, G., Sr.

    2015-12-01

    Most forest dynamic models predict a stable state of size structure, as well as of total basal area and biomass, in mature forest; after the equilibrium has been reached, the variation of forest stands is mainly driven by environmental factors. However, although the predicted power-law size-frequency distribution does appear in analyses of many forest inventory data sets, the estimated distribution exponents keep shifting between -2 and -4 and correlate positively with the mean DBH. This regular pattern cannot be explained by the effects of stochastic disturbances on forest stands. Here, we adopted a partial differential equation (PDE) approach to deduce the systematic behavior of an ideal forest: by solving the diffusion equation under the restricted condition of invariable resource occupation, a periodic solution was obtained that captures the variable behavior of forest size structure, the former models with stable behavior being just the special case of the periodic solution in which the fluctuation frequency equals zero. In our results, the number of individuals in each size class is a function of individual growth rate (G), mortality (M), size (D) and time (T); by borrowing the conclusions of allometric theory on these parameters, the results reproduce the observed "exponent-mean DBH" relationship and also give a logically complete description of the time-varying form of the forest size-frequency distribution. Our model implies that the total biomass of a forest can never reach a stable equilibrium state even in the absence of disturbances and climate regime shifts. We propose the idea of an intrinsic fluctuation property of forests and hope to provide a new perspective on forest dynamics and carbon cycle research.
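    The transport equation alluded to is presumably of the standard size-structured (McKendrick-von Foerster) form; a hedged rendering using the abstract's symbols, not necessarily the authors' exact model:

    ```latex
    % n(D,T): density of individuals of size D at time T;
    % G(D): individual growth rate; M(D): mortality.
    \frac{\partial n(D,T)}{\partial T}
      + \frac{\partial}{\partial D}\bigl[G(D)\,n(D,T)\bigr]
      = -\,M(D)\,n(D,T)
    ```

    Steady-state solutions of this equation give the familiar power-law size-frequency distribution; the abstract's claim is that, under a fixed total-resource constraint, the equation also admits periodic solutions, of which the steady state is the zero-frequency special case.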

  8. New model for distributed multimedia databases and its application to networking of museums

    NASA Astrophysics Data System (ADS)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1998-02-01

    This paper proposes a new distributed multimedia database system in which databases storing MPEG-2 videos and/or super-high-definition images are connected together through B-ISDNs, and also describes an example of the networking of museums on the basis of the proposed database system. The proposed database system introduces a new concept, the 'retrieval manager', which functions as an intelligent controller so that the user can recognize a set of image databases as one logical database. A user terminal issues a retrieval request to the retrieval manager located nearest to the user terminal on the network. The retrieved contents are then sent through the B-ISDNs directly to the user terminal from the server which stores the designated contents. In this case, the designated logical database dynamically generates the best combination of retrieval parameters, such as the data transfer path, on the basis of the state of the system; the generated retrieval parameters are then used to select the most suitable data transfer path on the network. The best combination of these parameters is thus fitted to the distributed multimedia database system.

  9. Analysis and Modeling of U.S. Army Recruiting Markets

    DTIC Science & Technology

    2016-03-24

    (Extraction residue from the report's front matter; recoverable figure titles: "Illustration of Stochastic Mean Value Imputation"; "Unemployment Rate Using..."; "Unemployment Rate, Not Seasonally Adjusted (Source: LAUS...".) The missioning process results logically as recruiting leaders attempt to answer the question, "How does USAREC distribute its recruiting

  10. Best of Both Worlds Comment on "(Re) Making the Procrustean Bed? Standardization and Customization as Competing Logics in Healthcare".

    PubMed

    Needham, Catherine

    2017-08-16

    This article builds on Mannion and Exworthy's account of the tensions between standardization and customization within health services to explore why these tensions exist. It highlights the limitations of explanations which root them in an expression of managerialism versus professionalism and suggests that each logic is embedded in a set of ontological, epistemological and moral commitments which are held in tension. At the front line of care delivery, people cannot resolve these tensions but must navigate and negotiate them. The legitimacy of a health system depends on its ability to deliver the 'best of both worlds' to citizens, offering the reassurance of sameness and the dignity of difference.

  11. Integrating labview into a distributed computing environment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kasemir, K. U.; Pieck, M.; Dalesio, L. R.

    2001-01-01

    Because it is easy to learn and well suited to a self-contained desktop laboratory setup, many casual programmers prefer to use the National Instruments LabVIEW environment to develop their logic. An ActiveX interface is presented that allows integration into a plant-wide distributed environment based on the Experimental Physics and Industrial Control System (EPICS). This paper discusses the design decisions and provides performance information, especially considering requirements for the Spallation Neutron Source (SNS) diagnostics system.

  12. Automated identification of protein-ligand interaction features using Inductive Logic Programming: a hexose binding case study

    PubMed Central

    2012-01-01

    Background There is a need for automated methods to learn general features of the interactions of a ligand class with its diverse set of protein receptors. An appropriate machine learning approach is Inductive Logic Programming (ILP), which automatically generates comprehensible rules in addition to prediction. The development of ILP systems which can learn rules of the complexity required for studies on protein structure remains a challenge. In this work we use a new ILP system, ProGolem, and demonstrate its performance on learning features of hexose-protein interactions. Results The rules induced by ProGolem detect interactions mediated by aromatics and by planar-polar residues, in addition to less common features such as the aromatic sandwich. The rules also reveal a previously unreported dependency for residues cys and leu. They also specify interactions involving aromatic and hydrogen bonding residues. This paper shows that Inductive Logic Programming implemented in ProGolem can derive rules giving structural features of protein/ligand interactions. Several of these rules are consistent with descriptions in the literature. Conclusions In addition to confirming literature results, ProGolem’s model has a 10-fold cross-validated predictive accuracy that is superior, at the 95% confidence level, to another ILP system previously used to study protein/hexose interactions and is comparable with state-of-the-art statistical learners. PMID:22783946

  13. Some logical functions of joint control.

    PubMed Central

    Lowenkron, B

    1998-01-01

    Constructing a behavioral account of the language-related performances that characterize responding to logical and symbolic relations between stimuli is commonly viewed as a problem for the area of stimulus control. In response to this problem, the notion of joint control is presented here, and its ability to provide an interpretative account of these kinds of performances is explored. Joint control occurs when the currently rehearsed topography of a verbal operant, as evoked by one stimulus, is simultaneously evoked by another stimulus. This event, the onset of joint stimulus control by two stimuli over a common response topography, then sets the occasion for a response appropriate to this special relation between the stimuli. Although the mechanism described is simple, it seems to have broad explanatory properties. In what follows, these properties are applied to provide a behavioral interpretation of two sorts of fundamental, putatively cognitive, performances: those based on logical relations and those based on semantic relations. The first includes responding to generalized conceptual relations such as identity, order, relative size, distance, and orientation. The second includes responding to relations usually ascribed to word meaning. These include relations between words and objects, the specification of objects by words, name-object bidirectionality, and the recognition of objects from their description. Finally, as a preview of some further possibilities, the role of joint control in goal-oriented behavior is considered briefly. PMID:9599452

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanchurin, Vitaly, E-mail: vvanchur@d.umn.edu

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape, and CM machines take a CO's Turing number (also known as its description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO Turing numbers as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal machines, which halt in finite time, and immortal machines, which run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.

  15. Using artificial intelligence to improve identification of nanofluid gas-liquid two-phase flow pattern in mini-channel

    NASA Astrophysics Data System (ADS)

    Xiao, Jian; Luo, Xiaoping; Feng, Zhenfei; Zhang, Jinxin

    2018-01-01

    This work combines fuzzy logic and a support vector machine (SVM) with principal component analysis (PCA) to create an artificial-intelligence system that identifies nanofluid gas-liquid two-phase flow states in a vertical mini-channel. Flow-pattern recognition normally requires detailed knowledge of the operating process; computer simulation and image processing can be used to automate the description of flow patterns in nanofluid gas-liquid two-phase flow. This work uses fuzzy logic and an SVM with PCA to improve the accuracy with which the flow pattern of a nanofluid gas-liquid two-phase flow is identified. To acquire images of nanofluid gas-liquid two-phase flow patterns of flow boiling, a high-speed digital camera was used to record four different types of flow-pattern images, namely annular flow, bubbly flow, churn flow, and slug flow. The textural features extracted by processing the images of nanofluid gas-liquid two-phase flow patterns are used as inputs to various identification schemes, namely fuzzy logic, the SVM, and the SVM with PCA, to identify the type of flow pattern. The results indicate that the SVM with the reduced feature set from PCA provides the best identification accuracy and requires less calculation time than the other two schemes. The data reported herein should be very useful for the design and operation of industrial applications.
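The PCA-then-classify pipeline described in this abstract can be sketched as follows. This is an illustrative example only: the synthetic "texture features" and dimensions are invented, and a simple nearest-centroid classifier stands in for the paper's SVM to keep the sketch dependency-free.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project feature vectors onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)                     # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T             # scores in the reduced space

def nearest_centroid_fit(X, y):
    """Per-class mean vectors (a stand-in for the SVM classifier)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(centroids, X):
    classes = list(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array([classes[i] for i in d.argmin(axis=0)])

# Synthetic 6-D "texture features" for two well-separated flow patterns.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 6)), rng.normal(5, 1, (40, 6))])
y = np.array([0] * 40 + [1] * 40)

Z = pca_reduce(X, 2)                            # 6-D features -> 2-D scores
model = nearest_centroid_fit(Z, y)
acc = (nearest_centroid_predict(model, Z) == y).mean()
```

The reduced features `Z` would be the inputs to the SVM in the scheme the paper reports as most accurate.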

  16. Dc microgrid stabilization through fuzzy control of interleaved, heterogeneous storage elements

    NASA Astrophysics Data System (ADS)

    Smith, Robert David

    As microgrid power systems gain prevalence and renewable energy comprises greater and greater portions of distributed generation, energy storage becomes important to offset the higher variance of renewable energy sources and maximize their usefulness. One of the emerging techniques is to utilize a combination of lead-acid batteries and ultracapacitors to provide both short and long-term stabilization to microgrid systems. The different energy and power characteristics of batteries and ultracapacitors imply that they ought to be utilized in different ways. Traditional linear controls can use these energy storage systems to stabilize a power grid, but cannot effect more complex interactions. This research explores a fuzzy logic approach to microgrid stabilization. The ability of a fuzzy logic controller to regulate a dc bus in the presence of source and load fluctuations, in a manner comparable to traditional linear control systems, is explored and demonstrated. Furthermore, the expanded capabilities (such as storage balancing, self-protection, and battery optimization) of a fuzzy logic system over a traditional linear control system are shown. System simulation results are presented and validated through hardware-based experiments. These experiments confirm the capabilities of the fuzzy logic control system to regulate bus voltage, balance storage elements, optimize battery usage, and effect self-protection.
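The fuzzification and rule-firing stage of a fuzzy controller of the kind described above can be sketched in a few lines. The membership breakpoints, rule set, and variable names below are invented for illustration and are not taken from the thesis.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fire_rules(v_error, soc):
    """Rule strength = min of antecedent memberships (fuzzy AND)."""
    err_neg  = tri(v_error, -2.0, -1.0, 0.0)    # bus voltage below nominal
    err_pos  = tri(v_error,  0.0,  1.0, 2.0)    # bus voltage above nominal
    soc_high = tri(soc, 0.5, 1.0, 1.5)          # battery well charged
    return {
        "discharge_battery": min(err_neg, soc_high),  # support a sagging bus
        "charge_battery":    err_pos,                 # absorb excess power
    }

strengths = fire_rules(v_error=-0.5, soc=0.9)
```

Conditioning the discharge rule on state of charge is one way such a controller can fold storage balancing and battery protection into the same rule base, something a single linear regulator cannot express directly.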

  17. Clinical Outcome Prediction in Aneurysmal Subarachnoid Hemorrhage Using Bayesian Neural Networks with Fuzzy Logic Inferences

    PubMed Central

    Lo, Benjamin W. Y.; Macdonald, R. Loch; Baker, Andrew; Levine, Mitchell A. H.

    2013-01-01

    Objective. The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). Methods. The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Results. Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odds ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age, and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. Discussion. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication. PMID:23690884
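The centroid defuzzification technique named in the abstract has a standard discrete form: the crisp output is the membership-weighted mean of the output axis. This is a generic sketch with an invented output membership function, not the study's clinical rules.

```python
import numpy as np

def centroid_defuzzify(x, mu):
    """Crisp output = sum(x * mu(x)) / sum(mu(x)) over a discretized axis."""
    x = np.asarray(x, dtype=float)
    mu = np.asarray(mu, dtype=float)
    return float((x * mu).sum() / mu.sum())

# Hypothetical aggregated output fuzzy set over a 0-10 "risk score" axis.
x = np.linspace(0.0, 10.0, 101)
mu = np.maximum(0.0, 1.0 - np.abs(x - 7.0) / 2.0)   # triangle peaked at 7
score = centroid_defuzzify(x, mu)
```

For a symmetric triangle the centroid falls at the peak; asymmetric aggregated sets, as produced by real rule bases, shift the crisp output toward the heavier side.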

  18. Online energy management strategy of fuel cell hybrid electric vehicles based on data fusion approach

    NASA Astrophysics Data System (ADS)

    Zhou, Daming; Al-Durra, Ahmed; Gao, Fei; Ravey, Alexandre; Matraji, Imad; Godoy Simões, Marcelo

    2017-10-01

    Energy management strategy plays a key role for Fuel Cell Hybrid Electric Vehicles (FCHEVs); it directly affects the efficiency and performance of energy storage in FCHEVs. For example, with a suitable energy distribution controller, the fuel cell system can be maintained in a high-efficiency region, thereby reducing hydrogen consumption. In this paper, an energy management strategy for online driving cycles is proposed based on a combination of the parameters from three offline-optimized fuzzy logic controllers using a data fusion approach. The fuzzy logic controllers are optimized offline for three typical driving scenarios: highway, suburban, and city. To classify patterns of online driving cycles, a Probabilistic Support Vector Machine (PSVM) is used to provide probabilistic classification results. Based on the classification results of the online driving cycle, the parameters of the three offline-optimized fuzzy logic controllers are then fused using Dempster-Shafer (DS) evidence theory, in order to calculate the final parameters for the online fuzzy logic controller. Three experimental validations using a Hardware-In-the-Loop (HIL) platform with different-sized FCHEVs have been performed. Experimental comparison results show that the proposed PSVM-DS based online controller can achieve relatively stable operation and a higher efficiency of the fuel cell system in real driving cycles.
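Dempster-Shafer combination, the fusion step named in this abstract, can be sketched as follows. The two-class frame {highway, city} and the mass values are invented for the example; the paper's actual frame and masses are not reproduced here.

```python
from itertools import product

def ds_combine(m1, m2):
    """Dempster's rule: intersect focal elements, renormalize by conflict."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb               # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

H, C = frozenset({"highway"}), frozenset({"city"})
either = H | C
m1 = {H: 0.5, either: 0.5}                    # evidence source 1
m2 = {C: 0.4, either: 0.6}                    # evidence source 2
m = ds_combine(m1, m2)
```

The combined masses could then weight the per-scenario controller parameters to produce the final online parameters.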

  19. The Use of a Predictive Habitat Model and a Fuzzy Logic Approach for Marine Management and Planning

    PubMed Central

    Hattab, Tarek; Ben Rais Lasram, Frida; Albouy, Camille; Sammari, Chérif; Romdhane, Mohamed Salah; Cury, Philippe; Leprieur, Fabien; Le Loc’h, François

    2013-01-01

    Bottom trawl survey data are commonly used as a sampling technique to assess the spatial distribution of commercial species. However, this sampling technique does not always correctly detect a species even when it is present, and this can create significant limitations when fitting species distribution models. In this study, we aim to test the relevance of a mixed methodological approach that combines presence-only and presence-absence distribution models. We illustrate this approach using bottom trawl survey data to model the spatial distributions of 27 commercially targeted marine species. We use an environmentally- and geographically-weighted method to simulate pseudo-absence data. The species distributions are modelled using regression kriging, a technique that explicitly incorporates spatial dependence into predictions. Model outputs are then used to identify areas that met the conservation targets for the deployment of artificial anti-trawling reefs. To achieve this, we propose the use of a fuzzy logic framework that accounts for the uncertainty associated with different model predictions. For each species, the predictive accuracy of the model is classified as ‘high’. A better result is observed when a large number of occurrences are used to develop the model. The map resulting from the fuzzy overlay shows that three main areas have a high level of agreement with the conservation criteria. These results align with expert opinion, confirming the relevance of the proposed methodology in this study. PMID:24146867
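The fuzzy overlay used to combine model predictions under uncertainty rests on standard fuzzy set operators applied cell-wise to suitability layers. The toy grids below are invented, not the study's data.

```python
import numpy as np

# Per-criterion suitability layers on a [0, 1] scale (toy 2x3 grids).
habitat   = np.array([[0.9, 0.2, 0.7],
                      [0.4, 0.8, 0.1]])
low_trawl = np.array([[0.6, 0.9, 0.5],
                      [0.7, 0.3, 0.8]])

fuzzy_and = np.minimum(habitat, low_trawl)   # all criteria must hold
fuzzy_or  = np.maximum(habitat, low_trawl)   # any criterion suffices
```

Cells scoring high under the conservative fuzzy AND are the candidates that simultaneously satisfy every conservation criterion, which is the kind of agreement map the study uses to site anti-trawling reefs.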

  20. 119. VIEW OF NORTH SIDE OF LANDLINE INSTRUMENTATION ROOM (206), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    119. VIEW OF NORTH SIDE OF LANDLINE INSTRUMENTATION ROOM (206), LSB (BLDG. 751). POWER DISTRIBUTION UNITS AND CABLE DISTRIBUTION UNITS ON RIGHT SIDE OF PHOTO; LOGIC CONTROL AND MONITOR UNITS FOR BOOSTER AND FUEL SYSTEMS LEFT OF AND PARALLEL TO EAST ROW OF CABINETS; SIGNAL CONDITIONERS AT NORTH END OF ROOM PERPENDICULAR TO OTHER CABINETS. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  1. Welding Metallurgy and Weldability of Stainless Steels

    NASA Astrophysics Data System (ADS)

    Lippold, John C.; Kotecki, Damian J.

    2005-03-01

    Welding Metallurgy and Weldability of Stainless Steels, the first book in over twenty years to address welding metallurgy and weldability issues associated with stainless steel, offers the most up-to-date and comprehensive treatment of these topics currently available. The authors emphasize fundamental metallurgical principles governing microstructure evolution and property development of stainless steels, including martensitic, ferritic, austenitic, duplex, and precipitation-hardening grades. They present a logical and well-organized look at the history, evolution, and primary uses of each stainless steel, including detailed descriptions of the associated weldability issues.

  2. VHDL simulation with access to transistor models

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  3. System description: IVY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCune, W.; Shumsky, O.

    2000-02-04

    IVY is a verified theorem prover for first-order logic with equality. It is coded in ACL2, and it makes calls to the theorem prover Otter to search for proofs and to the program MACE to search for countermodels. Verifications of Otter and MACE are not practical because they are coded in C. Instead, Otter and MACE give detailed proofs and models that are checked by verified ACL2 programs. In addition, the initial conversion to clause form is done by verified ACL2 code. The verification is done with respect to finite interpretations.

  4. Automated Verification of Design Patterns with LePUS3

    NASA Technical Reports Server (NTRS)

    Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick

    2009-01-01

    Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of the first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any JavaTM 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.

  5. HP-9825A HFRMP trajectory processor (#TRAJ), detailed description. [relative motion of the space shuttle orbiter and a free-flying payload

    NASA Technical Reports Server (NTRS)

    Kindall, S. M.

    1980-01-01

    The computer code for the trajectory processor (#TRAJ) of the high fidelity relative motion program is described. The #TRAJ processor is a 12-degrees-of-freedom trajectory integrator (6 degrees of freedom for each of two vehicles) which can be used to generate digital and graphical data describing the relative motion of the Space Shuttle Orbiter and a free-flying cylindrical payload. A listing of the code, coding standards and conventions, detailed flow charts, and discussions of the computational logic are included.

  6. Tensor Arithmetic, Geometric and Mathematic Principles of Fluid Mechanics in Implementation of Direct Computational Experiments

    NASA Astrophysics Data System (ADS)

    Bogdanov, Alexander; Khramushin, Vasily

    2016-02-01

    The architecture of a digital computing system determines the technical foundation of a unified mathematical language for exact arithmetic-logical description of phenomena and laws of continuum mechanics for applications in fluid mechanics and theoretical physics. The deep parallelization of the computing processes results in functional programming at a new technological level, providing traceability of the computing processes with automatic application of multiscale hybrid circuits and adaptive mathematical models for the true reproduction of the fundamental laws of physics and continuum mechanics.

  7. Neural mechanisms of the mind, Aristotle, Zadeh, and fMRI.

    PubMed

    Perlovsky, Leonid I

    2010-05-01

    Processes in the mind: perception, cognition, concepts, instincts, emotions, and higher cognitive abilities for abstract thinking and beautiful music are considered here within a neural modeling fields (NMF) paradigm. Its fundamental mathematical mechanism is a process "from vague-fuzzy to crisp," called dynamic logic (DL). This paper discusses why this paradigm is necessary mathematically, and relates it to a psychological description of the mind. Surprisingly, the process from "vague to crisp" corresponds to the Aristotelian understanding of mental functioning. Recent functional magnetic resonance imaging (fMRI) measurements confirmed this process in neural mechanisms of perception.

  8. Closed Loop Real-Time Evaluation of Missile Guidance and Control Components: Simulator Hardware/Software Characteristics and Use

    DTIC Science & Technology

    1974-08-01

    Figure list excerpts: Node Control Logic; 2.16 Pitch Channel Frequency Response; 2.17 Yaw Channel Frequency Response; 2.18 Analog Computer Mechanization of... TABLE I, Elements of the Sigma 5 Digital Computer System, lists the Xerox model number, performance characteristics, and a description of each MIOP channel. The MIOP transfers control signals to or from the CPU and can handle up to 32 I/O channels, each operating simultaneously, provided the overall data

  9. The Conceptualization of Value in the Value Proposition of New Health Technologies Comment on "Providing Value to New Health Technology: The Early Contribution of Entrepreneurs, Investors, and Regulatory Agencies".

    PubMed

    Buttigieg, Sandra C; Hoof, Joost van

    2017-07-03

    Lehoux et al. provide a highly valid contribution in conceptualizing value in value propositions for new health technologies and developing an analytic framework that illustrates the interplay between health innovation supply-side logic (the logic of emergence) and demand-side logic (embedding in the healthcare system). This commentary brings forth several considerations on this article. First, a detailed stakeholder analysis provides the necessary premonition of potential hurdles in the development, implementation and dissemination of a new technology. This can be achieved by categorizing potential stakeholder groups on the basis of the potential impact of future technology. Secondly, the conceptualization of value in value propositions of new technologies should not only embrace business/economic and clinical values but also ethical, professional and cultural values, as well as factoring in the notion of usability and acceptance of new technology. As a final note, the commentary emphasises the point that technology should facilitate delivery of care without negatively affecting doctor-patient communications, physical examination skills, and development of clinical knowledge. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  10. Providers and Patients Caught Between Standardization and Individualization: Individualized Standardization as a Solution Comment on "(Re) Making the Procrustean Bed? Standardization and Customization as Competing Logics in Healthcare".

    PubMed

    Ansmann, Lena; Pfaff, Holger

    2017-08-12

    In their 2017 article, Mannion and Exworthy provide a thoughtful and theory-based analysis of two parallel trends in modern healthcare systems and their competing and conflicting logics: standardization and customization. This commentary further discusses the challenge of treatment decision-making in times of evidence-based medicine (EBM), shared decision-making and personalized medicine. From the perspective of systems theory, we propose the concept of individualized standardization as a solution to the problem. According to this concept, standardization is conceptualized as a guiding framework leaving room for individualization in the patient-physician interaction. The theoretical background is the concept of context management according to systems theory. Moreover, the comment suggests multidisciplinary teams as a possible solution for the integration of standardization and individualization, using the example of multidisciplinary tumor conferences and highlighting its limitations. The comment also supports the authors' statement of the patient as co-producer and introduces the idea that the competing logics of standardization and individualization are a matter of perspective on macro, meso, and micro levels. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  11. Linear Temporal Logic (LTL) Based Monitoring of Smart Manufacturing Systems.

    PubMed

    Heddy, Gerald; Huzaifa, Umer; Beling, Peter; Haimes, Yacov; Marvel, Jeremy; Weiss, Brian; LaViers, Amy

    2015-01-01

    The vision of Smart Manufacturing Systems (SMS) includes collaborative robots that can adapt to a range of scenarios. This vision requires a classification of multiple system behaviors, or sequences of movement, that can achieve the same high-level tasks. Likewise, this vision presents unique challenges regarding the management of environmental variables in concert with discrete, logic-based programming. Overcoming these challenges requires targeted performance and health monitoring of both the logical controller and the physical components of the robotic system. Prognostics and health management (PHM) defines a field of techniques and methods that enable condition monitoring, diagnostics, and prognostics of physical elements, functional processes, overall systems, etc. PHM is warranted in this effort given that the controller is vulnerable to program changes, which propagate in unexpected ways, logical runtime exceptions, sensor failure, and even bit rot. The physical components' health is affected by the wear and tear experienced by machines constantly in motion. The controller's faults are inherently discrete, while the physical components' faults build up continuously over time. Such a disconnect poses unique challenges for PHM. This paper presents a robotic monitoring system that captures and resolves this disconnect. This effort leverages supervisory robotic control and model checking with linear temporal logic (LTL), presenting them as a novel monitoring system for PHM. This methodology has been demonstrated in a MATLAB-based simulator for an industry-inspired use case in the context of PHM. Future work will use the methodology to develop adaptive, intelligent control strategies to evenly distribute wear on the joints of the robotic arms, maximizing the life of the system.
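A minimal runtime monitor for a response property of the kind checked in such LTL-based monitoring, here G(fault -> F repair) evaluated over a finite trace, can be sketched as follows. The event names are invented; this is not the paper's MATLAB implementation.

```python
class ResponseMonitor:
    """Tracks the LTL response pattern G(trigger -> F response) on a trace."""
    def __init__(self, trigger, response):
        self.trigger, self.response = trigger, response
        self.pending = 0                      # triggers awaiting a response

    def step(self, event):
        if event == self.trigger:
            self.pending += 1
        elif event == self.response and self.pending:
            self.pending = 0                  # one response discharges all
        return self.pending == 0              # property satisfied so far?

mon = ResponseMonitor("fault", "repair")
ok = [mon.step(e) for e in ["move", "fault", "move", "repair", "move"]]
```

A trace that ends with `pending > 0` leaves the property unresolved, which is exactly the kind of condition a PHM layer would flag on the logical controller.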

  12. Determinants of Standard Errors of MLEs in Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Cheng, Ying; Zhang, Wei

    2010-01-01

    This paper studies changes of standard errors (SE) of the normal-distribution-based maximum likelihood estimates (MLE) for confirmatory factor models as model parameters vary. Using logical analysis, simplified formulas and numerical verification, monotonic relationships between SEs and factor loadings as well as unique variances are found.…

  13. Schemas in Problem Solving: An Integrated Model of Learning, Memory, and Instruction

    DTIC Science & Technology

    1992-01-01

    article: "Hybrid Computation in Cognitive Science: Neural Networks and Symbols" (J. A. Anderson, 1990). And, Marvin Minsky echoes the sentiment in his... distributed processing: A handbook of models, programs, and exercises. Cambridge, MA: The MIT Press. Minsky, M. (1991). Logical versus analogical or symbolic

  14. Design, Specification, and Synthesis of Aircraft Electric Power Systems Control Logic

    NASA Astrophysics Data System (ADS)

    Xu, Huan

    Cyber-physical systems integrate computation, networking, and physical processes. Substantial research challenges exist in the design and verification of such large-scale, distributed sensing, actuation, and control systems. Rapidly improving technology and recent advances in control theory, networked systems, and computer science give us the opportunity to drastically improve our approach to integrated flow of information and cooperative behavior. Current systems rely on text-based specifications and manual design. Using new technology advances, we can create easier, more efficient, and cheaper ways of developing these control systems. This thesis will focus on design considerations for system topologies, ways to formally and automatically specify requirements, and methods to synthesize reactive control protocols, all within the context of an aircraft electric power system as a representative application area. This thesis consists of three complementary parts: synthesis, specification, and design. The first section focuses on the synthesis of central and distributed reactive controllers for an aircraft electric power system. This approach incorporates methodologies from computer science and control. The resulting controllers are correct by construction with respect to system requirements, which are formulated using the specification language of linear temporal logic (LTL). The second section addresses how to formally specify requirements and introduces a domain-specific language for electric power systems. A software tool automatically converts high-level requirements into LTL and synthesizes a controller. The final sections focus on design space exploration. A design methodology is proposed that uses mixed-integer linear programming to obtain candidate topologies, which are then used to synthesize controllers. The discrete-time control logic is then verified in real-time by two methods: hardware and simulation.
Finally, the problem of partial observability and dynamic state estimation is explored. Given a set placement of sensors on an electric power system, measurements from these sensors can be used in conjunction with control logic to infer the state of the system.

  15. The relationships between spatial ability, logical thinking, mathematics performance and kinematics graph interpretation skills of 12th grade physics students

    NASA Astrophysics Data System (ADS)

    Bektasli, Behzat

    Graphs have a broad use in science classrooms, especially in physics. In physics, kinematics is probably the topic for which graphs are most widely used. The participants in this study were from two different grade-12 physics classrooms, advanced placement and calculus-based physics. The main purpose of this study was to search for the relationships between student spatial ability, logical thinking, mathematical achievement, and kinematics graphs interpretation skills. The Purdue Spatial Visualization Test, the Middle Grades Integrated Process Skills Test (MIPT), and the Test of Understanding Graphs in Kinematics (TUG-K) were used for quantitative data collection. Classroom observations were made to acquire ideas about classroom environment and instructional techniques. Factor analysis, simple linear correlation, multiple linear regression, and descriptive statistics were used to analyze the quantitative data. Each instrument has two principal components. The selection and calculation of the slope and of the area were the two principal components of TUG-K. MIPT was composed of a component based upon processing text and a second component based upon processing symbolic information. The Purdue Spatial Visualization Test was composed of a component based upon one-step processing and a second component based upon two-step processing of information. Student ability to determine the slope in a kinematics graph was significantly correlated with spatial ability, logical thinking, and mathematics aptitude and achievement. However, student ability to determine the area in a kinematics graph was only significantly correlated with student pre-calculus semester 2 grades. Male students performed significantly better than female students on the slope items of TUG-K. Also, male students performed significantly better than female students on the PSAT mathematics assessment and spatial ability. 
This study found that students have different levels of spatial ability, logical thinking, and mathematics aptitude and achievement levels. These different levels were related to student learning of kinematics and they need to be considered when kinematics is being taught. It might be easier for students to understand the kinematics graphs if curriculum developers include more activities related to spatial ability and logical thinking.

  16. Nonlinear gyrokinetics: a powerful tool for the description of microturbulence in magnetized plasmas

    NASA Astrophysics Data System (ADS)

    Krommes, John A.

    2010-12-01

    Gyrokinetics is the description of low-frequency dynamics in magnetized plasmas. In magnetic-confinement fusion, it provides the most fundamental basis for numerical simulations of microturbulence; there are astrophysical applications as well. In this tutorial, a sketch of the derivation of the novel dynamical system comprising the nonlinear gyrokinetic (GK) equation (GKE) and the coupled electrostatic GK Poisson equation will be given by using modern Lagrangian and Lie perturbation methods. No background in plasma physics is required in order to appreciate the logical development. The GKE describes the evolution of an ensemble of gyrocenters moving in a weakly inhomogeneous background magnetic field and in the presence of electromagnetic perturbations with wavelength of the order of the ion gyroradius. Gyrocenters move with effective drifts, which may be obtained by an averaging procedure that systematically, order by order, removes gyrophase dependence. To that end, the use of the Lagrangian differential one-form as well as the content and advantages of Lie perturbation theory will be explained. The electromagnetic fields follow via Maxwell's equations from the charge and current density of the particles. Particle and gyrocenter densities differ by an important polarization effect. That is calculated formally by a 'pull-back' (a concept from differential geometry) of the gyrocenter distribution to the laboratory coordinate system. A natural truncation then leads to the closed GK dynamical system. Important properties such as GK energy conservation and fluctuation noise will be mentioned briefly, as will the possibility (and difficulties) of deriving nonlinear gyrofluid equations suitable for rapid numerical solution—although it is probably best to directly simulate the GKE. By the end of the tutorial, students should appreciate the GKE as an extremely powerful tool and will be prepared for later lectures describing its applications to physical problems.

  17. 98. VIEW OF NORTH SIDE OF LANDLINE INSTRUMENTATION ROOM (106), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    98. VIEW OF NORTH SIDE OF LANDLINE INSTRUMENTATION ROOM (106), LSB (BLDG. 770). POWER DISTRIBUTION UNITS AND CABLE DISTRIBUTION UNITS IN EAST ROW OF CABINETS; LOGIC CONTROL AND MONITOR UNITS FOR BOOSTER AND FUEL SYSTEMS, AND SIGNAL CONDITIONERS IN WEST ROW OF CABINETS. CABLE TRAY TUNNEL ENTRANCE TO LSB (BLDG. 770) AT THE SOUTH END OF LANDLINE INSTRUMENTATION ROOM (106). - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 West, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  18. Investigation of the Physiological Responses of Belugas to Stressors to Aid in Assessing the Impact of Environmental and Anthropogenic Challenges on Health

    DTIC Science & Technology

    2013-12-19

    Physiological Responses of Belugas to "Stressors" to Aid in Assessing the Impact of Environmental and Anthropogenic Challenges on Health ... DISTRIBUTION STATEMENT A. Approved for public release: distribution is unlimited. Investigation of the Physiological Responses ... physiological (i.e., neuroimmunoendocrinological) responses of beluga whales to "stressors". "Stressor events" will allow for a better understanding and

  19. Dielectric elastomer memory

    NASA Astrophysics Data System (ADS)

    O'Brien, Benjamin M.; McKay, Thomas G.; Xie, Sheng Q.; Calius, Emilio P.; Anderson, Iain A.

    2011-04-01

    Life shows us that the distribution of intelligence throughout flexible muscular networks is a highly successful solution to a wide range of challenges, for example: human hearts, octopuses, or even starfish. Recreating this success in engineered systems requires soft actuator technologies with embedded sensing and intelligence. Dielectric Elastomer Actuators (DEA) are promising due to their large stresses and strains, as well as quiet, flexible, multimodal operation. Recently, dielectric elastomer devices were presented with built-in sensor, driver, and logic capability enabled by a new concept called the Dielectric Elastomer Switch (DES). DES use electrode piezoresistivity to control the charge on DEA and enable the distribution of intelligence throughout a DEA device. In this paper we advance the capabilities of DES further to form volatile memory elements. A set-reset flip-flop with an inverted reset line was developed based on DES and DEA. With a 3200 V supply the flip-flop behaved appropriately and demonstrated the creation of dielectric elastomer memory capable of changing state in response to 1-second-long set and reset pulses. This memory opens up applications such as oscillator, de-bounce, timing, and sequential logic circuits, all of which could be distributed throughout biomimetic actuator arrays. Future work will include miniaturisation to improve response speed, implementation into more complex circuits, and investigation of longer-lasting and more sensitive switching materials.

  20. A novel way of integrating rule-based knowledge into a web ontology language framework.

    PubMed

    Gamberger, Dragan; Krstaçić, Goran; Jović, Alan

    2013-01-01

    Web Ontology Language (OWL), used in combination with the Protégé visual interface, is a modern standard for the development and maintenance of ontologies and a powerful tool for knowledge presentation. In this work, we describe a novel way to use OWL also for the conceptualization of knowledge presented by a set of rules. In this approach, rules are represented as a hierarchy of actionable classes with necessary and sufficient conditions defined in the description logic formalism. The advantages are that the set of rules is no longer unordered, that concepts defined in descriptive ontologies can be used directly in the bodies of rules, and that Protégé provides an intuitive tool for editing the rule set. Standard ontology reasoning processes are not applicable in this framework, but experiments conducted on the rule sets have demonstrated that the reasoning problems can be successfully solved.

  1. SKYMAP system description: Star catalog data base generation and utilization

    NASA Technical Reports Server (NTRS)

    Gottlieb, D. M.

    1979-01-01

    The specifications, design, software description, and use of the SKYMAP star catalog system are detailed. The SKYMAP system was developed to provide an accurate and complete catalog of all stars with blue or visual magnitudes brighter than 9.0 for use by attitude determination programs. Because of the large number of stars which are brighter than 9.0 magnitude, efficient techniques of manipulating and accessing the data were required. These techniques of staged distillation of data from a Master Catalog to a Core Catalog, and direct access of overlapping zone catalogs, form the basis of the SKYMAP system. The collection and transformation of data required to produce the Master Catalog data base is described. The data flow through the main programs and levels of star catalogs is detailed. The mathematical and logical techniques for each program and the format of all catalogs are documented.

  2. [Critical thinking skills in the nursing diagnosis process].

    PubMed

    Bittencourt, Greicy Kelly Gouveia Dias; Crossetti, Maria da Graça Oliveira

    2013-04-01

    The aim of this study was to identify the critical thinking skills utilized in the nursing diagnosis process. This was an exploratory descriptive study conducted with seven nursing students, using a clinical case to identify critical thinking skills and their justifications in the nursing diagnosis process. Content analysis was performed to evaluate descriptive data. Six participants reported that analysis, scientific and technical knowledge, and logical reasoning skills are important in identifying priority nursing diagnoses; clinical experience was cited by five participants; knowledge about the patient and application of standards were mentioned by three participants; and discernment and contextual perspective were skills noted by two participants. Based on these results, the use of critical thinking skills related to the steps of the nursing diagnosis process was observed. Therefore, the application of this process may constitute a strategy that enables the development of critical thinking skills.

  3. Temporal abstraction for the analysis of intensive care information

    NASA Astrophysics Data System (ADS)

    Hadad, Alejandro J.; Evin, Diego A.; Drozdowicz, Bartolomé; Chiotti, Omar

    2007-11-01

    This paper proposes a scheme for the analysis of time-stamped series data from multiple monitoring devices of intensive care units, using Temporal Abstraction concepts. The scheme is oriented to obtaining a description of the evolution of the patient's state in an unsupervised way. The case study is based on a dataset clinically classified as Pulmonary Edema. For this dataset, a trend-based Temporal Abstraction mechanism is proposed, by means of a Behaviours Base of time-stamped series, which is then used in a classification step. Combining this approach with the introduction of expert knowledge, using Fuzzy Logic, and with multivariate analysis by means of Self-Organizing Maps, a state characterization model is obtained. This model can feasibly be extended to different patient groups and states. The proposed scheme makes it possible to obtain intermediate descriptions of the states through which the patient is passing, which could be used to anticipate alert situations.
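
    A minimal sketch of what a trend-based Temporal Abstraction can look like is given below; the threshold, labels, and sample series are invented for illustration and are far simpler than the paper's Behaviours Base.

```python
# Minimal sketch of a trend-based Temporal Abstraction. The threshold
# (eps), the labels, and the sample heart-rate series are invented for
# illustration, not taken from the paper's Behaviours Base.
def trend_abstractions(samples, eps=0.5):
    """Map a time-stamped series [(t, value), ...] to (t_start, t_end, label)
    intervals labelled 'rising', 'falling' or 'steady'."""
    def label(dv):
        if dv > eps:
            return "rising"
        if dv < -eps:
            return "falling"
        return "steady"

    intervals = []
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        lab = label(v1 - v0)
        if intervals and intervals[-1][2] == lab:
            intervals[-1] = (intervals[-1][0], t1, lab)  # merge adjacent runs
        else:
            intervals.append((t0, t1, lab))
    return intervals

# Hypothetical heart-rate trace sampled once per time unit.
hr = [(0, 70), (1, 72), (2, 75), (3, 75.2), (4, 73), (5, 70)]
abstr = trend_abstractions(hr)
```

    Each merged interval is one abstraction ("rising from t=0 to t=2", and so on) that a downstream classifier could consume instead of the raw samples.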

  4. Printed wiring board system programmer's manual

    NASA Technical Reports Server (NTRS)

    Brinkerhoff, C. D.

    1973-01-01

    The printed wiring board system provides automated techniques for the design of printed circuit boards and hybrid circuit boards. The system consists of four programs: (1) the preprocessor program combines user supplied data and pre-defined library data to produce the detailed circuit description data; (2) the placement program assigns circuit components to specific areas of the board in a manner that optimizes the total interconnection length of the circuit; (3) the organizer program assigns pin interconnections to specific board levels and determines the optimal order in which the router program should attempt to layout the paths connecting the pins; and (4) the router program determines the wire paths which are to be used to connect each input pin pair on the circuit board. This document is intended to serve as a programmer's reference manual for the printed wiring board system. A detailed description of the internal logic and flow of the printed wiring board programs is included.

  5. Trimming a hazard logic tree with a new model-order-reduction technique

    USGS Publications Warehouse

    Porter, Keith; Field, Edward; Milner, Kevin R

    2017-01-01

    The size of the logic tree within the Uniform California Earthquake Rupture Forecast Version 3, Time-Dependent (UCERF3-TD) model can challenge risk analyses of large portfolios. An insurer or catastrophe risk modeler concerned with losses to a California portfolio might have to evaluate a portfolio 57,600 times to estimate risk in light of the hazard possibility space. Which branches of the logic tree matter most, and which can one ignore? We employed two model-order-reduction techniques to simplify the model. We sought a subset of parameters that must vary, and the specific fixed values for the remaining parameters, to produce approximately the same loss distribution as the original model. The techniques are (1) a tornado-diagram approach we employed previously for UCERF2, and (2) an apparently novel probabilistic sensitivity approach that seems better suited to functions of nominal random variables. The new approach produces a reduced-order model with only 60 of the original 57,600 leaves. One can use the results to reduce computational effort in loss analyses by orders of magnitude.
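
    As a rough illustration of the tornado-diagram step mentioned above, the sketch below varies one nominal parameter at a time around a baseline and ranks parameters by the swing they induce in a toy loss function. All names, weights, and branch values here are invented stand-ins, not the actual UCERF3-TD branches or weights.

```python
# Toy tornado-diagram sensitivity ranking. The loss function, parameter
# names, and weights are invented for illustration only.
import statistics  # not required below, but typical in loss analyses

def portfolio_loss(params):
    # Stand-in for an expensive portfolio loss analysis on one branch.
    weights = {"fault_model": 5.0, "deformation": 2.0, "smoothing": 0.1}
    return sum(weights[k] * v for k, v in params.items())

baseline = {"fault_model": 1, "deformation": 1, "smoothing": 1}
choices = {"fault_model": [0, 1, 2], "deformation": [0, 1], "smoothing": [0, 1, 2]}

def tornado_swings(loss, baseline, choices):
    """Vary one parameter at a time; record the output swing it causes."""
    swings = {}
    for name, values in choices.items():
        outs = []
        for v in values:
            trial = dict(baseline)
            trial[name] = v
            outs.append(loss(trial))
        swings[name] = max(outs) - min(outs)
    # Parameters sorted by swing: keep the top ones varying, fix the rest.
    return sorted(swings.items(), key=lambda kv: kv[1], reverse=True)

ranking = tornado_swings(portfolio_loss, baseline, choices)
```

    Branches whose swing is negligible can be fixed at their baseline value, which is how a reduced-order tree with far fewer leaves is obtained.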

  6. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A Service Workflow is an aggregation of distributed services to fulfill specific functionalities. With ever increasing available services, the methodologies for the selections of the services against the given requirements become main research subjects in multiple disciplines. A few of researchers have contributed to the formal specification languages and the methods for model checking; however, existing methods have the difficulties to tackle with the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of the workflow composition. To this end, we extend a specification language with the consideration of formal logic, so that some effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec has been demonstrated by the examples with the addressed soundness, completeness, and consistency.

  7. Quantum key distribution without the wavefunction

    NASA Astrophysics Data System (ADS)

    Niestegge, Gerd

    A well-known feature of quantum mechanics is the secure exchange of secret bit strings which can then be used as keys to encrypt messages transmitted over any classical communication channel. It is demonstrated that this quantum key distribution allows a much more general and abstract access than commonly thought. The results include some generalizations of the Hilbert space version of quantum key distribution, but are based upon a general nonclassical extension of conditional probability. A special state-independent conditional probability is identified as origin of the superior security of quantum key distribution; this is a purely algebraic property of the quantum logic and represents the transition probability between the outcomes of two consecutive quantum measurements.
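
    For readers unfamiliar with the Hilbert-space version being generalized here, the basis-sifting step of the standard prepare-and-measure protocol (BB84) can be sketched as follows. This is textbook background, not the paper's algebraic framework, and the simulation omits eavesdropping, error estimation, and privacy amplification.

```python
# Sketch of BB84 basis sifting (textbook background, not the paper's
# generalized framework). No eavesdropper is simulated.
import random

def bb84_sift(n_bits, seed=0):
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("ZX") for _ in range(n_bits)]
    bob_bases = [rng.choice("ZX") for _ in range(n_bits)]
    # With no eavesdropper, Bob's measurement reproduces Alice's bit
    # whenever the bases match; otherwise the outcome is random.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Sifting: both parties keep only positions where the bases agreed.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift(64)
```

    The abstract's point is that the security of this exchange can be derived from a purely algebraic transition probability on a quantum logic, without the Hilbert-space machinery sketched here.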

  8. First CLIPS Conference Proceedings, volume 2

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The topics of volume 2 of the First CLIPS Conference are associated with the following applications: quality control; intelligent data bases and networks; Space Station Freedom; Space Shuttle and satellite; user interface; artificial neural systems and fuzzy logic; parallel and distributed processing; enhancements to CLIPS; aerospace; simulation and defense; advisory systems and tutors; and intelligent control.

  9. Spatial patterns of serial murder: an analysis of disposal site location choice.

    PubMed

    Lundrigan, S; Canter, D

    2001-01-01

    Although the murders committed by serial killers may not be considered rational, there is growing evidence that the locations in which they commit their crimes may be guided by an implicit, if limited rationality. The hypothesized logic of disposal site choice of serial killers led to predictions that (a) their criminal domains would be around their home base and relate to familiar travel distances, (b) they would have a size that was characteristic of each offender, (c) the distribution would be biased towards other non-criminal activities, and (d) the size of the domains would increase over time. Examination of the geographical distribution of the sites at which 126 US and 29 UK serial killers disposed of their victims' bodies supported all four hypotheses. It was found that rational choice and routine activity models of criminal behavior could explain the spatial choices of serial murderers. It was concluded that the locations at which serial killers dispose of their victims' bodies reflect the inherent logic of the choices that underlie their predatory activities. Copyright 2001 John Wiley & Sons, Ltd.

  10. An Investigation of Quantum Dot Super Lattice Use in Nonvolatile Memory and Transistors

    NASA Astrophysics Data System (ADS)

    Mirdha, P.; Parthasarathy, B.; Kondo, J.; Chan, P.-Y.; Heller, E.; Jain, F. C.

    2018-02-01

    Site-specific self-assembled colloidal quantum dots (QDs) deposit in two layers only on a p-type substrate to form a QD superlattice (QDSL). The QDSL structure has been integrated into the floating gate of a nonvolatile memory component and has demonstrated promising results in multi-bit storage, ease of fabrication, and memory retention. Additionally, multi-valued logic devices and circuits demonstrating ternary and quaternary logic have been created using QDSL structures. With the increasing use of site-specific self-assembled QDSLs, the charge storage capability, self-assembly on specific surfaces, uniform distribution, and mini-band formation of silicon and germanium QDSLs must be fundamentally understood for successful implementation in devices. In this work, we investigate the differences in electron charge storage by building metal-oxide-semiconductor (MOS) capacitors and using capacitance-voltage measurements to quantify the storage capabilities. The self-assembly process and distribution density of the QDSLs are characterized by atomic force microscopy (AFM) of line samples. Additionally, we present a summary of the theoretical density of states in each of the QDSLs.

  11. Evaluating data distribution and drift vulnerabilities of machine learning algorithms in secure and adversarial environments

    NASA Astrophysics Data System (ADS)

    Nelson, Kevin; Corbin, George; Blowers, Misty

    2014-05-01

    Machine learning is continuing to gain popularity due to its ability to solve problems that are difficult to model using conventional computer programming logic. Much of the current and past work has focused on algorithm development, data processing, and optimization. Lately, a subset of research has emerged that explores issues related to security. This research is gaining traction as systems employing these methods are being applied to both secure and adversarial environments. One of machine learning's biggest benefits, its data-driven versus logic-driven approach, is also a weakness if the data on which the models rely are corrupted. Adversaries could maliciously influence systems that address drift and data distribution changes through re-training and online learning. Our work is focused on exploring the resilience of various machine learning algorithms to these data-driven attacks. In this paper, we present our initial findings using Monte Carlo simulations and statistical analysis to explore the maximal achievable shift to a classification model, as well as the required amount of control over the data.
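
    The flavor of such a Monte Carlo probe can be sketched as below: a toy 1-D nearest-centroid classifier is retrained on poisoned data and the resulting movement of its decision boundary is averaged over trials. The classifier, the class distributions, and the poisoning model are all invented stand-ins, not the paper's experimental setup.

```python
# Toy Monte Carlo probe of decision-boundary shift under data poisoning.
# Classifier, distributions, and poisoning model are invented for the sketch.
import random
import statistics

def fit_threshold(neg, pos):
    """1-D nearest-centroid classifier: boundary halfway between class means."""
    return (statistics.mean(neg) + statistics.mean(pos)) / 2

def boundary_shift(poison_delta, n=200, trials=100, seed=0):
    """Average boundary movement when every negative sample is nudged."""
    rng = random.Random(seed)
    shifts = []
    for _ in range(trials):
        neg = [rng.gauss(0.0, 1.0) for _ in range(n)]
        pos = [rng.gauss(4.0, 1.0) for _ in range(n)]
        clean = fit_threshold(neg, pos)
        # Adversary shifts every negative training point by poison_delta.
        poisoned = fit_threshold([x + poison_delta for x in neg], pos)
        shifts.append(poisoned - clean)
    return statistics.mean(shifts)

shift = boundary_shift(poison_delta=1.0)
```

    For this linear-in-the-means classifier the induced shift is exactly half the poisoning offset; for real learners the same simulation loop measures the relationship empirically instead.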

  12. On the impact of `smart tyres' on existing ABS/EBD control systems

    NASA Astrophysics Data System (ADS)

    Cheli, Federico; Leo, Elisbetta; Melzi, Stefano; Sabbioni, Edoardo

    2010-12-01

    The paper focuses on the possibility of enhancing the performance of the ABS (Antilock Braking System)/EBD (electronic braking distribution) control system by using the additional information provided by 'smart tyres' (i.e. tyres with embedded sensors and digital-computing capability). In particular, on the basis of previous works [Braghin et al., Future car active controls through the measurement of contact forces and patch features, Veh. Syst. Dyn. 44 (2006), pp. 3-13], the authors assumed that these components should be able to provide estimates for the normal loads acting on the four wheels and for the tyre-road friction coefficient. The benefits produced by the introduction of these additional channels into the existing ABS/EBD control logic were evaluated through simulations carried out with a validated 14 degrees of freedom (dofs) vehicle + ABS/EBD control logic numerical model. The performance of the ABS control system was evaluated through a series of braking manoeuvres on a straight track, focusing attention on μ-jump conditions, while the performance of the EBD control system was assessed by means of braking manoeuvres carried out considering several weight distributions.

  13. Histological Image Processing Features Induce a Quantitative Characterization of Chronic Tumor Hypoxia

    PubMed Central

    Grabocka, Elda; Bar-Sagi, Dafna; Mishra, Bud

    2016-01-01

    Hypoxia in tumors signifies resistance to therapy. Despite a wealth of tumor histology data, including anti-pimonidazole staining, no current methods use these data to induce a quantitative characterization of chronic tumor hypoxia in time and space. We use image-processing algorithms to develop a set of candidate image features that can formulate just such a quantitative description of xenografted colorectal chronic tumor hypoxia. Two features in particular give low-variance measures of chronic hypoxia near a vessel: intensity sampling that extends radially away from approximated blood vessel centroids, and multithresholding to segment tumor tissue into normal, hypoxic, and necrotic regions. From these features we derive a spatiotemporal logical expression whose truth value depends on its predicate clauses that are grounded in this histological evidence. As an alternative to the spatiotemporal logical formulation, we also propose a way to formulate a linear regression function that uses all of the image features to learn what chronic hypoxia looks like, and then gives a quantitative similarity score once it is trained on a set of histology images. PMID:27093539
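
    The multithresholding feature can be sketched in a few lines; the two intensity cut-offs and the sample pixel values below are invented, not the thresholds used in the study.

```python
# Toy sketch of the multithresholding feature: two invented intensity
# cut-offs split stain intensity into necrotic / hypoxic / normal classes.
def segment(pixels, t_low=60, t_high=140):
    """Assign each pixel index to a region by its intensity value."""
    regions = {"necrotic": [], "hypoxic": [], "normal": []}
    for i, v in enumerate(pixels):
        if v < t_low:
            regions["necrotic"].append(i)
        elif v < t_high:
            regions["hypoxic"].append(i)
        else:
            regions["normal"].append(i)
    return regions

# Hypothetical 1-D strip of stain intensities.
regions = segment([30, 80, 200, 120, 10, 150])
```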

  14. FALCON: a toolbox for the fast contextualization of logical networks

    PubMed Central

    De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas

    2017-01-01

    Motivation: Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. Results: We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. Availability and implementation: FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. Contact: thomas.sauter@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28673016

  15. FALCON: a toolbox for the fast contextualization of logical networks.

    PubMed

    De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas

    2017-11-01

    Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. thomas.sauter@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  16. Optical pattern recognition algorithms on neural-logic equivalent models and demonstration of their prospects and possible implementations

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Zaitsev, Alexandr V.; Voloshin, Victor M.

    2001-03-01

    Historical information regarding the appearance and creation of the fundamentals of the algebra-logical apparatus of 'equivalental algebra' for the description of neural-net paradigms and algorithms is considered; this apparatus unifies the theory of neural networks (NN), linear algebra and generalized neurobiology, extended to the matrix case. A survey is given of 'equivalental models' of neural nets and associative memory, and new, modified matrix-tensor neuro-logical equivalental models (MTNLEMs) with double adaptive-equivalental weighing (DAEW) are offered for spatially non-invariant recognition (SNIR) and space-invariant recognition (SIR) of 2D images (patterns). It is shown that the MTNLEMs with DAEW are the most generalized: they can describe the processes in NNs both within the frames of known paradigms and within a new 'equivalental' paradigm of non-interaction type, and the computing process in NNs using the offered MTNLEMs with DAEW reduces to two-step and multi-step algorithms with step-by-step matrix-tensor procedures (for SNIR) and procedures for defining space-dependent equivalental functions from two images (for SIR).

  17. Challenges And Results of the Applications of Fuzzy Logic in the Classification of Rich Galaxy Clusters

    NASA Astrophysics Data System (ADS)

    Girola Schneider, R.

    2017-07-01

    Fuzzy logic is a branch of artificial intelligence founded on the idea that everything is a matter of degree. It creates mathematical approximations for the resolution of certain types of problems and aims to produce usable results from imprecise data, which makes it particularly useful for electronic and computer applications. This enables it to handle vague or unspecific information when certain parts of a system are unknown or ambiguous and therefore cannot be measured reliably, or when variation in one variable can produce an alteration in the others. The main focus of this paper is to demonstrate, through a theoretical analysis, the importance of these techniques when applied to ambiguous situations in the field of rich galaxy clusters. The purpose is to show their applicability to the several classification systems proposed for rich clusters, which are based on criteria such as the richness of the cluster, the distribution of the brightest galaxies, the presence or absence of cD-type galaxies, or the existence of sub-clusters. Fuzzy logic enables the researcher to work with "imprecise" information by implementing fuzzy sets and combining rules to define actions. Control systems based on fuzzy logic join input variables, defined in terms of fuzzy sets, through rule groups that produce one or several output values for the system under study. In this context, applying fuzzy-logic techniques to abstractions of the physical properties used to classify rich galaxy clusters approximates the solution of the corresponding mathematical models and helps resolve the ambiguities an investigation group must confront in order to make a decision.
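
    A minimal sketch of this kind of fuzzy classification is given below, using triangular membership functions and min/max rule combination (Mamdani-style). The richness and concentration scales, the rules, and the class names are invented for illustration, not the author's actual system.

```python
# Hedged sketch of fuzzy cluster classification: triangular memberships and
# min/max rule combination. Scales, rules, and labels are invented.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_cluster(n_galaxies, concentration):
    # Degrees of membership in hypothetical linguistic terms.
    rich = tri(n_galaxies, 30, 100, 300)
    sparse = tri(n_galaxies, 0, 10, 50)
    compact = tri(concentration, 0.4, 0.8, 1.2)
    # Mamdani-style rules: AND = min, OR = max, NOT = complement.
    degrees = {
        "regular": min(rich, compact),
        "irregular": max(sparse, 1.0 - compact),
    }
    return max(degrees, key=degrees.get), degrees

label, degrees = classify_cluster(n_galaxies=120, concentration=0.9)
```

    The output is a graded membership in each class rather than a hard assignment, which is exactly what makes the approach suitable for the ambiguous boundary cases the paper discusses.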

  18. Linear Temporal Logic (LTL) Based Monitoring of Smart Manufacturing Systems

    PubMed Central

    Heddy, Gerald; Huzaifa, Umer; Beling, Peter; Haimes, Yacov; Marvel, Jeremy; Weiss, Brian; LaViers, Amy

    2017-01-01

    The vision of Smart Manufacturing Systems (SMS) includes collaborative robots that can adapt to a range of scenarios. This vision requires a classification of multiple system behaviors, or sequences of movement, that can achieve the same high-level tasks. Likewise, this vision presents unique challenges regarding the management of environmental variables in concert with discrete, logic-based programming. Overcoming these challenges requires targeted performance and health monitoring of both the logical controller and the physical components of the robotic system. Prognostics and health management (PHM) defines a field of techniques and methods that enable condition-monitoring, diagnostics, and prognostics of physical elements, functional processes, overall systems, etc. PHM is warranted in this effort given that the controller is vulnerable to program changes, which propagate in unexpected ways, logical runtime exceptions, sensor failure, and even bit rot. The physical component’s health is affected by the wear and tear experienced by machines constantly in motion. The controller’s faults are thus inherently discrete in origin, while the physical component’s degradation builds up continuously over time. Such a disconnect poses unique challenges for PHM. This paper presents a robotic monitoring system that captures and resolves this disconnect. This effort leverages supervisory robotic control and model checking with linear temporal logic (LTL), presenting them as a novel monitoring system for PHM. This methodology has been demonstrated in a MATLAB-based simulator for an industry-inspired use case in the context of PHM. Future work will use the methodology to develop adaptive, intelligent control strategies to evenly distribute wear on the joints of the robotic arms, maximizing the life of the system. PMID:28730154
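
    The runtime-monitoring idea can be illustrated with a finite-trace check of the classic response property G(request -> F grant). The proposition names and traces below are invented, and finite-trace semantics (nothing may remain pending when the trace ends) is an assumption of this sketch, not a detail taken from the paper.

```python
# Minimal finite-trace monitor for the LTL response pattern
# G(request -> F grant). Proposition names and traces are invented;
# finite-trace semantics is assumed: no request may be left pending.
def monitor_response(trace, trigger="request", response="grant"):
    """trace is a list of sets of atomic propositions, one set per step.
    Return True iff every 'trigger' is eventually followed by 'response'."""
    pending = False
    for props in trace:
        if response in props:
            pending = False  # any outstanding obligation is discharged
        if trigger in props and response not in props:
            pending = True   # a new obligation is opened
    return not pending

ok = monitor_response([{"request"}, set(), {"grant"}, {"request"}, {"grant"}])
bad = monitor_response([{"request"}, set(), set()])
```

    A PHM monitor of this shape runs alongside the controller and flags the trace the moment an obligation can no longer be met, rather than after a physical failure.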

  19. Survey of United States neurosurgical residency program directors.

    PubMed

    Lunsford, L Dade; Kassam, Amin; Chang, Yue-Fang

    2004-02-01

    The field of neurosurgery in the United States faces many challenges. Neurosurgical program directors in the United States represent a logical source for inquiries about manpower issues, the training process, and Residency Review Committee (RRC) oversight. Ninety-one active residency program directors were sent an anonymous 31-question survey. The respondents were given the option of adding additional comments. The questions were designed to address issues related to manpower, the training process, and RRC governance. Sixty-one responses were returned before an email reminder and 11 after the reminder (a total response rate of 79%). The data were entered into a database, and a descriptive analysis, with frequency distribution, was performed. The purpose of this review was to gain a preliminary understanding of the perceptions of program directors regarding the neurosurgical training process, the RRC, the oversight process, and projected manpower needs. A 79% response rate is high for a mail survey and likely reflects heightened concern and interest in such issues. The survey responses indicate general satisfaction with the role and governance of the RRC, significantly divergent perceptions of resident output and available positions, and serious concerns regarding the current training process. This survey suggests that a broader discussion of resident training issues would be valuable, perhaps using validated survey instruments.

  20. Eliciting and Combining Decision Criteria Using a Limited Palette of Utility Functions and Uncertainty Distributions: Illustrated by Application to Pest Risk Analysis.

    PubMed

    Holt, Johnson; Leach, Adrian W; Schrader, Gritta; Petter, Françoise; MacLeod, Alan; van der Gaag, Dirk Jan; Baker, Richard H A; Mumford, John D

    2014-01-01

    Utility functions in the form of tables or matrices have often been used to combine discretely rated decision-making criteria. Matrix elements are usually specified individually, so no one rule or principle can be easily stated for the utility function as a whole. A series of five matrices are presented that aggregate criteria two at a time using simple rules that express a varying degree of constraint of the lower rating over the higher. A further nine possible matrices were obtained by using a different rule either side of the main axis of the matrix to describe situations where the criteria have a differential influence on the outcome. Uncertainties in the criteria are represented by three alternative frequency distributions from which the assessors select the most appropriate. The output of the utility function is a distribution of rating frequencies that is dependent on the distributions of the input criteria. In pest risk analysis (PRA), seven of these utility functions were required to mimic the logic by which assessors for the European and Mediterranean Plant Protection Organization arrive at an overall rating of pest risk. The framework enables the development of PRAs that are consistent and easy to understand, criticize, compare, and change. When tested in workshops, PRA practitioners thought that the approach accorded with both the logic and the level of resolution that they used in the risk assessments. © 2013 Society for Risk Analysis.
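
    One member of this family of utility functions can be sketched directly: with the rule that the lower rating fully constrains the higher, aggregation is an element-wise minimum, and pushing two independent rating-frequency distributions through it yields the output rating distribution. The three-level scale and the input frequencies below are invented examples, not EPPO ratings.

```python
# Sketch of one aggregation rule from the family described: the lower
# rating fully constrains the higher (pairwise minimum). The rating scale
# and input frequencies are invented examples.
RATINGS = ["low", "medium", "high"]

def combine(dist_a, dist_b, rule=min):
    """Push two independent rating-frequency distributions through a
    pairwise utility function, yielding the output rating distribution."""
    idx = {r: i for i, r in enumerate(RATINGS)}
    out = {r: 0.0 for r in RATINGS}
    for ra, pa in dist_a.items():
        for rb, pb in dist_b.items():
            out[RATINGS[rule(idx[ra], idx[rb])]] += pa * pb
    return out

# Hypothetical assessor inputs: each is a frequency distribution over ratings.
entry = {"low": 0.1, "medium": 0.6, "high": 0.3}
impact = {"low": 0.0, "medium": 0.2, "high": 0.8}
overall = combine(entry, impact)
```

    Substituting a different `rule` (e.g. `max`, or an asymmetric table either side of the diagonal) reproduces the other matrices in the series without changing the propagation machinery.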

  1. Strategies for searching medical natural language text. Distribution of words in the anatomic diagnoses of 7000 autopsy subjects.

    PubMed Central

    Moore, G. W.; Hutchins, G. M.; Miller, R. E.

    1984-01-01

    Computerized indexing and retrieval of medical records is increasingly important; but the use of natural language versus coded languages (SNOP, SNOMED) for this purpose remains controversial. In an effort to develop search strategies for natural language text, the authors examined the anatomic diagnosis reports by computer for 7000 consecutive autopsy subjects spanning a 13-year period at The Johns Hopkins Hospital. There were 923,657 words, 11,642 of them distinct. The authors observed an average of 1052 keystrokes, 28 lines, and 131 words per autopsy report, with an average 4.6 words per line and 7.0 letters per word. The entire text file represented 921 hours of secretarial effort. Words ranged in frequency from 33,959 occurrences of "and" to one occurrence for each of 3398 different words. Searches for rare diseases with unique names or for representative examples of common diseases were most readily performed with the use of computer-printed key word in context (KWIC) books. For uncommon diseases designated by commonly used terms (such as "cystic fibrosis"), needs were best served by a computerized search for logical combinations of key words. In an unbalanced word distribution, each conjunction (logical and) search should be performed in ascending order of word frequency; but each alternation (logical inclusive or) search should be performed in descending order of word frequency. Natural language text searches will assume a larger role in medical records analysis as the labor-intensive procedure of translation into a coded language becomes more costly, compared with the computer-intensive procedure of text searching. PMID:6546837
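
    The ordering heuristic in the abstract can be made concrete with a tiny inverted index (the word lists below are hypothetical, not from the autopsy corpus): for a conjunction search, intersecting posting lists in ascending order of word frequency shrinks the candidate set as fast as possible, while a union benefits from the opposite order.

```python
# Sketch of the search-ordering heuristic: in a skewed word distribution,
# AND-searches run rarest-term-first, OR-searches most-frequent-first.
# The mini-index below is a made-up example.

index = {
    "and":      {1, 2, 3, 4, 5, 6, 7, 8},   # very common word
    "fibrosis": {2, 5},                      # rare word
    "cystic":   {2, 5, 7},
}

def and_search(terms, index):
    # Ascending order of word frequency: rarest posting list first.
    ordered = sorted(terms, key=lambda t: len(index[t]))
    result = set(index[ordered[0]])
    for t in ordered[1:]:
        result &= index[t]   # each step intersects an ever-smaller set
    return result

def or_search(terms, index):
    # Descending order of word frequency: biggest list first, so later
    # unions contribute few new members.
    ordered = sorted(terms, key=lambda t: -len(index[t]))
    result = set()
    for t in ordered:
        result |= index[t]
    return result

matches = and_search(["cystic", "fibrosis"], index)  # reports with both words
```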

  2. 3VSR: Three Valued Secure Routing for Vehicular Ad Hoc Networks using Sensing Logic in Adversarial Environment

    PubMed Central

    Wang, Liangmin

    2018-01-01

    Today, the IoT integrates thousands of interconnected networks and sensing devices, e.g., vehicular networks, which are considered challenging due to their high speed and network dynamics. The goal of future vehicular networks is to improve road safety, promote commercial or infotainment products, and reduce traffic accidents. All these applications are based on information exchange among nodes, so not only reliable data delivery but also the authenticity and credibility of the data itself are prerequisites. To cope with this problem, trust management has emerged as a promising candidate for managing nodes' transactions and interactions, which requires the cooperation of distributed mobile nodes to achieve the design goals. In this paper, we propose a trust-based routing protocol, 3VSR (Three Valued Secure Routing), which extends the widely used AODV (Ad hoc On-demand Distance Vector) routing protocol and employs the idea of a sensing-logic-based trust model to enhance the security of VANETs (Vehicular Ad Hoc Networks). Existing routing protocols are mostly based on key- or signature-based schemes, which of course increase computation overhead. In the proposed 3VSR, trust among entities is updated frequently by means of opinions derived from sensing logic, owing to the vehicles' random topologies. In 3VSR the theoretical capabilities are based on the Dirichlet distribution, considering the prior and posterior uncertainty of the event in question. Also, by exchanging trust recommendation messages, nodes are able to reduce computation and routing overhead. The simulation results show that the proposed scheme is secure and practical. PMID:29538314

  3. 3VSR: Three Valued Secure Routing for Vehicular Ad Hoc Networks using Sensing Logic in Adversarial Environment.

    PubMed

    Sohail, Muhammad; Wang, Liangmin

    2018-03-14

    Today, the IoT integrates thousands of interconnected networks and sensing devices, e.g., vehicular networks, which are considered challenging due to their high speed and network dynamics. The goal of future vehicular networks is to improve road safety, promote commercial or infotainment products, and reduce traffic accidents. All these applications are based on information exchange among nodes, so not only reliable data delivery but also the authenticity and credibility of the data itself are prerequisites. To cope with this problem, trust management has emerged as a promising candidate for managing nodes' transactions and interactions, which requires the cooperation of distributed mobile nodes to achieve the design goals. In this paper, we propose a trust-based routing protocol, 3VSR (Three Valued Secure Routing), which extends the widely used AODV (Ad hoc On-demand Distance Vector) routing protocol and employs the idea of a sensing-logic-based trust model to enhance the security of VANETs (Vehicular Ad Hoc Networks). Existing routing protocols are mostly based on key- or signature-based schemes, which of course increase computation overhead. In the proposed 3VSR, trust among entities is updated frequently by means of opinions derived from sensing logic, owing to the vehicles' random topologies. In 3VSR the theoretical capabilities are based on the Dirichlet distribution, considering the prior and posterior uncertainty of the event in question. Also, by exchanging trust recommendation messages, nodes are able to reduce computation and routing overhead. The simulation results show that the proposed scheme is secure and practical.
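
    The general shape of a Dirichlet-style (here, three-valued) trust update can be sketched as follows. This is a hedged illustration of the common evidence-based mapping, not the authors' exact 3VSR formulas; the prior weight of 2 is an assumption.

```python
# Sketch of a Dirichlet-style trust update: positive/negative interaction
# counts update a prior, and the posterior is summarized as the three
# values (belief, disbelief, uncertainty). Not the paper's exact model.

def trust_opinion(positive, negative, prior_weight=2.0):
    """Map evidence counts to a three-valued opinion.

    Uncertainty shrinks as evidence accumulates; belief and disbelief
    split the remaining probability mass in proportion to the evidence.
    """
    total = positive + negative + prior_weight
    belief = positive / total
    disbelief = negative / total
    uncertainty = prior_weight / total
    return belief, disbelief, uncertainty

# A node observed in 8 good and 2 bad forwarding interactions:
b, d, u = trust_opinion(8, 2)
expected_trust = b + 0.5 * u   # a common expected-value summary
```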

  4. Design of 4 to 2 line encoder using lithium niobate based Mach Zehnder Interferometers for high speed communication

    NASA Astrophysics Data System (ADS)

    Pal, Amrindra; Kumar, Santosh; Sharma, Sandeep; Raghuwanshi, Sanjeev K.

    2016-04-01

    An encoder is a device that maps digital information from many input lines onto fewer output lines. Many combinational logic applications can be implemented using an encoder and external gates. In this paper, a 4 to 2 line encoder is proposed using the electro-optic effect in lithium-niobate-based Mach-Zehnder interferometers (MZIs). The MZI structure has a powerful capability to switch an optical input signal to a desired output port. The paper presents a mathematical description of the proposed device, followed by simulation in MATLAB. The study is verified using the beam propagation method (BPM).
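
    At the truth-table level, the 4 to 2 line encoder is easy to state; a minimal behavioral sketch (the paper realizes this optically with MZI switches, not in software) is:

```python
# Behavioral sketch of a 4-to-2 line encoder, truth-table level only.

def encode_4_to_2(inputs):
    """One-hot 4-bit input [D0, D1, D2, D3] -> 2-bit output (A1, A0)."""
    assert sum(inputs) == 1, "exactly one input line may be active"
    i = inputs.index(1)           # index of the active line D0..D3
    return (i >> 1) & 1, i & 1    # A1 is set for D2/D3, A0 for D1/D3

# D2 active:
a1, a0 = encode_4_to_2([0, 0, 1, 0])
```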

  5. Methods and application of system identification in shock and vibration.

    NASA Technical Reports Server (NTRS)

    Collins, J. D.; Young, J. P.; Kiefling, L.

    1972-01-01

    A logical picture is presented of current useful system identification techniques in the shock and vibration field. A technology tree diagram is developed for the purpose of organizing and categorizing the widely varying approaches according to the fundamental nature of each. Specific examples of accomplished activity for each identification category are noted and discussed. To provide greater insight into the most current trends in the system identification field, a somewhat detailed description is presented of the essential features of a recently developed technique that is based on making the maximum use of all statistically known information about a system.

  6. Space station common module power system network topology and hardware development

    NASA Technical Reports Server (NTRS)

    Landis, D. M.

    1985-01-01

    Candidate power system network topologies for the space station common module are defined and developed, and the necessary hardware for test and evaluation is provided. Martin Marietta's approach to performing the proposed program is presented. Performance of the tasks described will assure systematic development and evaluation of program results, and will provide the necessary management tools, visibility, and control techniques for performance assessment. The plan is submitted in accordance with the data requirements given and includes a comprehensive task logic flow diagram, time-phased manpower requirements, a program milestone schedule, and detailed descriptions of each program task.

  7. Overview of Threats and Failure Models for Safety-Relevant Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This document presents a high-level overview of the threats to safety-relevant computer-based systems, including (1) a description of the introduction and activation of physical and logical faults; (2) the propagation of their effects; and (3) function-level and component-level error and failure mode models. These models can be used in the definition of fault hypotheses (i.e., assumptions) for threat-risk mitigation strategies. This document is a contribution to a guide currently under development that is intended to provide a general technical foundation for designers and evaluators of safety-relevant systems.

  8. Statechart-based design controllers for FPGA partial reconfiguration

    NASA Astrophysics Data System (ADS)

    Łabiak, Grzegorz; Wegrzyn, Marek; Rosado Muñoz, Alfredo

    2015-09-01

    Statechart diagrams and the UML technique can be a vital part of early conceptual modeling. At present there is not much support in hardware design methodologies for the reconfiguration features of reprogrammable devices. The authors try to bridge the gap between the imprecise UML model and a formal HDL description. The key concept in the authors' proposal is to describe the behavior of the digital controller by statechart diagrams and to map some parts of the behavior into reprogrammable logic by means of groups of states which form a sequential automaton. The whole process is illustrated by an example with experimental results.

  9. Control and protection system for paralleled modular static inverter-converter systems

    NASA Technical Reports Server (NTRS)

    Birchenough, A. G.; Gourash, F.

    1973-01-01

    A control and protection system was developed for use with a paralleled 2.5-kWe-per-module static inverter-converter system. The control and protection system senses internal and external fault parameters such as voltage, frequency, current, and paralleling current unbalance. A logic system controls contactors to isolate defective power conditioners or loads. The system sequences contactor operation to automatically control parallel operation, startup, and fault isolation. Transient overload protection and fault checking sequences are included. The operation and performance of a control and protection system, with detailed circuit descriptions, are presented.

  10. A proposal to describe a phenomenon of expanding language

    NASA Astrophysics Data System (ADS)

    Swietorzecka, Kordula

    Changes of knowledge, convictions, or beliefs are a subject of interest within the framework of so-called epistemic logic. Various descriptions have been proposed of a process (or its results) in which a so-called agent may introduce certain changes into a set of sentences he has already adopted as the basis of his knowledge, convictions, or beliefs (the case of many agents is also considered). In the present paper we are interested in the changeability of an agent's language, which is in itself independent of the changes already mentioned. Modern epistemic formalizations assume that the agent uses a fixed (one could say static) language in which he expresses his various opinions, which may change. Our interest is to simulate a situation in which a language is extended by adding new expressions that were not previously known to the agent, so that he could not even consider them as subjects of his opinions. Such a phenomenon actually occurs in both natural and scientific languages; consider the expansion of a language in the process of learning, or as a result of obtaining new data about some described domain. We propose a simple idealization of the extension of a sentential language used by one agent. The language is treated as a family of so-called n-languages, which are given an epistemic interpretation. The proposed semantics enables us to distinguish between two different types of change: changes which occur because the agent's convictions about the logical values of some n-sentences change, described using the one-place operator C, read as "it changes that"; and changes that consist in raising the level of the n-language by adding new expressions to it. The second type of change, symbolized by the variable G, may also be considered independently of the first. The logical frame of our considerations comes from earlier work, where it was originally used to describe the Aristotelian theory of substantial changes. This time we apply that logic in epistemology.

  11. Synthesis of energy-efficient FSMs implemented in PLD circuits

    NASA Astrophysics Data System (ADS)

    Nawrot, Radosław; Kulisz, Józef; Kania, Dariusz

    2017-11-01

    The paper presents an outline of a simple synthesis method of energy-efficient FSMs. The idea consists in using local clock gating to selectively block the clock signal, if no transition of a state of a memory element is required. The research was dedicated to logic circuits using Programmable Logic Devices as the implementation platform, but the conclusions can be applied to any synchronous circuit. The experimental section reports a comparison of three methods of implementing sequential circuits in PLDs with respect to clock distribution: the classical fully synchronous structure, the structure exploiting the Enable Clock inputs of memory elements, and the structure using clock gating. The results show that the approach based on clock gating is the most efficient one, and it leads to significant reduction of dynamic power consumed by the FSM.
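
    The power argument behind clock gating can be illustrated with a toy model (the activity trace and counting model below are assumptions, not the paper's measurements): dynamic power scales with the clock edges delivered to a flip-flop, so blocking the clock when no state transition is required removes those edges entirely.

```python
# Toy model: count rising clock edges that reach a flip-flop under the
# fully synchronous scheme versus local clock gating.

def clock_edges_delivered(enable_trace, gated):
    """enable_trace: per-cycle booleans, True when the FSM must change state.

    gated=False models the fully synchronous design (clock always toggles);
    gated=True models local clock gating (clock passes only when enabled).
    """
    if gated:
        return sum(1 for en in enable_trace if en)
    return len(enable_trace)

# An FSM that changes state in only 3 of 10 cycles:
trace = [True, False, False, True, False, False, False, True, False, False]
always_on = clock_edges_delivered(trace, gated=False)
gated = clock_edges_delivered(trace, gated=True)
```

    The Enable Clock variant mentioned in the abstract still delivers every edge to the flip-flop and merely suppresses the state change, which is why gating the clock itself saves more dynamic power.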

  12. A Logically Centralized Approach for Control and Management of Large Computer Networks

    ERIC Educational Resources Information Center

    Iqbal, Hammad A.

    2012-01-01

    Management of large enterprise and Internet service provider networks is a complex, error-prone, and costly challenge. It is widely accepted that the key contributors to this complexity are the bundling of control and data forwarding in traditional routers and the use of fully distributed protocols for network control. To address these…

  13. Scholarly Communication and the Dilemma of Collective Action: Why Academic Journals Cost Too Much

    ERIC Educational Resources Information Center

    Wenzler, John

    2017-01-01

    Why has the rise of the Internet--which drastically reduces the cost of distributing information--coincided with drastic increases in the prices that academic libraries pay for access to scholarly journals? This study argues that libraries are trapped in a collective action dilemma as defined by economist Mancur Olson in "The Logic of…

  14. A framework for real-time distributed expert systems: On-orbit spacecraft fault diagnosis, monitoring and control

    NASA Technical Reports Server (NTRS)

    Mullikin, Richard L.

    1987-01-01

    Control of on-orbit operation of a spacecraft requires retention and application of special purpose, often unique, knowledge of equipment and procedures. Real-time distributed expert systems (RTDES) permit a modular approach to a complex application such as on-orbit spacecraft support. One aspect of a human-machine system that lends itself to the application of RTDES is the function of satellite/mission controllers - the next logical step toward the creation of truly autonomous spacecraft systems. This system application is described.

  15. Distributed Database Control and Allocation. Volume 3. Distributed Database System Designer’s Handbook.

    DTIC Science & Technology

    1983-10-01

    Multiversion Data 2-18; 2.7.1 Multiversion Timestamping 2-20; 2.7.2 Multiversion Locking 2-20; 2.8 Combining the Techniques 2-22; 3. Database Recovery Algorithms... See [THEM79, GIFF79] for details. 2.7 Multiversion Data. Let us return to a database system model where each logical data item is stored at one DM. In a multiversion database, each Write wi[x] produces a new copy (or version) of x, denoted xi. Thus, the value of x is a set of versions. For each
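
    The multiversion idea in the excerpt, that each write of a logical item x produces a new version xi rather than overwriting, can be sketched minimally (the timestamped-read interface is an assumption for illustration):

```python
# Minimal multiversion store: writes append versions, reads select the
# latest version visible at a given timestamp.

class MultiversionStore:
    def __init__(self):
        self.versions = {}  # item -> list of (timestamp, value)

    def write(self, item, value, ts):
        # A write never destroys earlier versions of the item.
        self.versions.setdefault(item, []).append((ts, value))

    def read(self, item, ts):
        """Return the latest version written at or before ts."""
        visible = [(t, v) for t, v in self.versions.get(item, []) if t <= ts]
        if not visible:
            return None
        return max(visible)[1]

db = MultiversionStore()
db.write("x", 10, ts=1)
db.write("x", 20, ts=5)   # a new version; the ts=1 copy is retained
```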

  16. Spatial Description of Drinking Water Bacterial Community Structures in Bulk Water Samples Collected in a Metropolitan Distribution System

    EPA Science Inventory

    The description of microorganisms inhabiting drinking water distribution systems has commonly been performed using techniques that are biased towards easy to culture bacterial populations. As most environmental microorganisms cannot be grown on artificial media, our understanding...

  17. Inheritance on processes, exemplified on distributed termination detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomsen, K.S.

    1987-02-01

    A multiple inheritance mechanism on processes is designed and presented within the framework of a small object-oriented language. Processes are described in classes, and the different action parts of a process inherited from different classes are executed in a coroutine-like style called alternation. The inheritance mechanism is a useful tool for factorizing the description of common aspects of processes. This is demonstrated within the domain of distributed programming by using the inheritance mechanism to factorize the description of distributed termination detection algorithms from the description of the distributed main computations for which termination is to be detected. A clear separation of concerns is obtained, and arbitrary combinations of termination detection algorithms and main computations can be formed. The same termination detection classes can also be used for more general purposes within distributed programming, such as detecting termination of each phase in a multi-phase main computation.
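
    The factorization described above can be sketched in Python, which has multiple inheritance even though it lacks the paper's coroutine-like alternation; the class names and the counting-based detector are illustrative assumptions, not the paper's algorithm.

```python
# Sketch: a termination-detection aspect factored out of the main
# computation and recombined by multiple inheritance.

class TerminationDetection:
    """Reusable aspect: tracks outstanding units of work."""
    def __init__(self):
        self.pending = 0
    def task_started(self):
        self.pending += 1
    def task_finished(self):
        self.pending -= 1
    def terminated(self):
        return self.pending == 0

class MainComputation:
    """The main computation knows nothing about termination detection."""
    def __init__(self):
        self.results = []
    def run_task(self, x):
        self.results.append(x * x)

class Worker(TerminationDetection, MainComputation):
    """Inheritance combines the two descriptions into one process."""
    def __init__(self):
        TerminationDetection.__init__(self)
        MainComputation.__init__(self)
    def process(self, x):
        self.task_started()
        self.run_task(x)
        self.task_finished()

w = Worker()
for i in range(3):
    w.process(i)
```

    Swapping in a different detection algorithm or a different main computation only changes one of the two base classes, which is the separation of concerns the paper aims for.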

  18. Risk society and the distribution of bads: theorizing class in the risk society.

    PubMed

    Curran, Dean

    2013-03-01

    Ulrich Beck states in the Risk Society (1992) that the rise of the social production of risks in the risk society signals that class ceases to be of relevance; instead the hierarchical logic of class will be supplanted by the egalitarian logic of the distribution of risks. Several trenchant critiques of Beck's claim have justified the continued relevance of class to contemporary society. While these accounts have emphasized continuity, they have not attempted to chart, as this paper will, how the growing social production of risk increases the importance of class. This paper argues that it is Beck's undifferentiated, catastrophic account of risk that undergirds his rejection of class, and that by inserting an account of risk involving gradations in both damages and calculability into Beck's framework, his theory of risk society may be used to develop a critical theory of class. Such a theory can be used to reveal how wealth differentials associated with class relations actually increase in importance to individuals' life-chances in the risk society. With the growing production and distribution of bads, class inequalities gain added significance, since it will be relative wealth differentials that both enable the advantaged to minimize their risk exposure and impose on others the necessity of facing the intensified risks of the risk society. © London School of Economics and Political Science 2013.

  19. Spatial interpolation and radiological mapping of ambient gamma dose rate by using artificial neural networks and fuzzy logic methods.

    PubMed

    Yeşilkanat, Cafer Mert; Kobya, Yaşar; Taşkın, Halim; Çevik, Uğur

    2017-09-01

    The aim of this study was to determine the spatial risk dispersion of ambient gamma dose rate (AGDR) by using both artificial neural network (ANN) and fuzzy logic (FL) methods, compare the performances of the methods, make dose estimations for intermediate stations with no previous measurements, and create dose-rate risk maps of the study area. In order to determine the dose distribution by using artificial neural networks, two main network types comprising five different network structures were used: feed-forward ANNs, namely the multi-layer perceptron (MLP), radial basis function neural network (RBFNN) and quantile regression neural network (QRNN); and recurrent ANNs, namely Jordan networks (JN) and Elman networks (EN). In the evaluation of estimation performance obtained for the test data, all models appear to give similar results. According to the cross-validation results obtained for explaining the AGDR distribution, Pearson's r coefficients were calculated as 0.94, 0.91, 0.89, 0.91, 0.91 and 0.92 and RMSE values were calculated as 34.78, 43.28, 63.92, 44.86, 46.77 and 37.92 for MLP, RBFNN, QRNN, JN, EN and FL, respectively. In addition, spatial risk maps showing the distribution of AGDR over the study area were created with all models, and the results were compared with the geological, topological and soil structure. Copyright © 2017 Elsevier Ltd. All rights reserved.
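
    The two cross-validation scores quoted above, Pearson's r and RMSE, are straightforward to compute for any model's predictions; the observed/predicted values below are made-up placeholders, not the study's measurements.

```python
# Pearson's r and RMSE from scratch, on placeholder data.

import math

def pearson_r(obs, pred):
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)

def rmse(obs, pred):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

observed = [100.0, 120.0, 90.0, 150.0]
predicted = [105.0, 118.0, 95.0, 140.0]
r = pearson_r(observed, predicted)
err = rmse(observed, predicted)
```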

  20. Representation of research hypotheses

    PubMed Central

    2011-01-01

    Background Hypotheses are now being automatically produced on an industrial scale by computers in biology, e.g. the annotation of a genome is essentially a large set of hypotheses generated by sequence similarity programs; and robot scientists enable the full automation of a scientific investigation, including generation and testing of research hypotheses. Results This paper proposes a logically defined way for recording automatically generated hypotheses in machine amenable way. The proposed formalism allows the description of complete hypotheses sets as specified input and output for scientific investigations. The formalism supports the decomposition of research hypotheses into more specialised hypotheses if that is required by an application. Hypotheses are represented in an operational way – it is possible to design an experiment to test them. The explicit formal description of research hypotheses promotes the explicit formal description of the results and conclusions of an investigation. The paper also proposes a framework for automated hypotheses generation. We demonstrate how the key components of the proposed framework are implemented in the Robot Scientist “Adam”. Conclusions A formal representation of automatically generated research hypotheses can help to improve the way humans produce, record, and validate research hypotheses. Availability http://www.aber.ac.uk/en/cs/research/cb/projects/robotscientist/results/ PMID:21624164
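
    To make "machine-amenable" and "operational" concrete, here is an illustrative toy only (the paper uses a formal logical encoding, not this ad-hoc schema; the gene name `GENE1` and the assay text are hypothetical): a hypothesis recorded as structured data, paired with a function that derives a testable experiment from it.

```python
# Toy machine-amenable hypothesis record plus an 'operational' mapping
# from hypothesis to experiment. Schema and names are hypothetical.

hypothesis = {
    "id": "h1",
    "statement": ("gene", "GENE1", "encodes", "enzyme E"),
    "status": "untested",
}

def design_experiment(h):
    """Derive a testable experiment from the hypothesis record."""
    subject = h["statement"][1]
    return {"tests": h["id"],
            "assay": f"growth phenotype of {subject} knockout strain"}

experiment = design_experiment(hypothesis)
```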

  1. Fully digital routing logic for single-photon avalanche diode arrays in highly efficient time-resolved imaging

    NASA Astrophysics Data System (ADS)

    Cominelli, Alessandro; Acconcia, Giulia; Ghioni, Massimo; Rech, Ivan

    2018-03-01

    Time-correlated single-photon counting (TCSPC) is a powerful optical technique, which permits recording fast luminous signals with picosecond precision. Unfortunately, given its repetitive nature, TCSPC is recognized as a relatively slow technique, especially when a large time-resolved image has to be recorded. In recent years, there has been a fast trend toward the development of TCSPC imagers. Unfortunately, present systems still suffer from a trade-off between the number of channels and performance. Even worse, the overall measurement speed is still limited well below saturation of the transfer bandwidth toward the external processor. We present a routing algorithm that enables a smart connection between a 32×32 detector array and five shared high-performance converters able to provide an overall conversion rate up to 10 Gbit/s. The proposed solution exploits a fully digital logic circuit distributed in a tree structure to limit the number and length of interconnections, which is a major issue in densely integrated circuits. The behavior of the logic has been validated by means of a field-programmable gate array, while a fully integrated prototype has been designed in 180-nm technology and analyzed by means of post-layout simulations.

  2. Uncertainty vs. Information (Invited)

    NASA Astrophysics Data System (ADS)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  3. Optically programmable encoder based on light propagation in two-dimensional regular nanoplates.

    PubMed

    Li, Ya; Zhao, Fangyin; Guo, Shuai; Zhang, Yongyou; Niu, Chunhui; Zeng, Ruosheng; Zou, Bingsuo; Zhang, Wensheng; Ding, Kang; Bukhtiar, Arfan; Liu, Ruibin

    2017-04-07

    We design an efficient optically controlled microdevice based on CdSe nanoplates. Two-dimensional CdSe nanoplates exhibit lighting patterns around their edges and can be realized as a new type of optically controlled programmable encoder. A light source is used to excite the nanoplates and control the logical position under a vertical pumping mode through the objective lens. At each excitation point in a nanoplate, the preferred light-propagation routes are along the normal direction and perpendicular to the edges; the light then emits from the edges to form a localized lighting section. The intensity distribution around the edges of different nanoplates demonstrates that the small lighting section along the edge, defined as '1', is much stronger than the dark section, defined as '0'. These '0's and '1's are the basic logic elements needed to compose logically functional devices. The observed propagation rules are consistent with theoretical simulations, meaning that the guided-light route in two-dimensional semiconductor nanoplates is regular and predictable. The same behavior was also observed in regular CdS nanoplates. Basic theoretical analysis and experiments prove that the guided light and its exit position follow rules originating mainly from the shape rather than the material itself.

  4. A fuzzy logic control in adjustable autonomy of a multi-agent system for an automated elderly movement monitoring application.

    PubMed

    Mostafa, Salama A; Mustapha, Aida; Mohammed, Mazin Abed; Ahmad, Mohd Sharifuddin; Mahmoud, Moamin A

    2018-04-01

    Autonomous agents are being widely used in many systems, such as ambient assisted-living systems, to perform tasks on behalf of humans. However, these systems usually operate in complex environments that entail uncertain, highly dynamic, or irregular workload. In such environments, autonomous agents tend to make decisions that lead to undesirable outcomes. In this paper, we propose a fuzzy-logic-based adjustable autonomy (FLAA) model to manage the autonomy of multi-agent systems that are operating in complex environments. This model aims to facilitate the autonomy management of agents and help them make competent autonomous decisions. The FLAA model employs fuzzy logic to quantitatively measure and distribute autonomy among several agents based on their performance. We implement and test this model in the Automated Elderly Movements Monitoring (AEMM-Care) system, which uses agents to monitor the daily movement activities of elderly users and perform fall detection and prevention tasks in a complex environment. The test results show that the FLAA model improves the accuracy and performance of these agents in detecting and preventing falls. Copyright © 2018 Elsevier B.V. All rights reserved.
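
    The core idea of distributing autonomy by fuzzified performance can be sketched simply. This is a hedged simplification, not the authors' rule base: the membership function, its breakpoints, and the proportional distribution rule are all assumptions.

```python
# Simplified FLAA-style sketch: fuzzify each agent's performance score,
# then distribute autonomy weight in proportion to the 'high performance'
# membership degree. Breakpoints 0.4/0.8 are illustrative assumptions.

def mu_high(score):
    """Membership degree of 'high performance' for a score in [0, 1]."""
    if score <= 0.4:
        return 0.0
    if score >= 0.8:
        return 1.0
    return (score - 0.4) / 0.4   # linear ramp between 0.4 and 0.8

def distribute_autonomy(scores):
    grades = {agent: mu_high(s) for agent, s in scores.items()}
    total = sum(grades.values()) or 1.0
    return {agent: g / total for agent, g in grades.items()}

# Three monitoring agents with recent performance scores:
shares = distribute_autonomy({"A": 0.9, "B": 0.6, "C": 0.3})
```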

  5. Valuing EQ-5D-5L health states 'in context' using a discrete choice experiment.

    PubMed

    Cole, Amanda; Shah, Koonal; Mulhern, Brendan; Feng, Yan; Devlin, Nancy

    2018-05-01

    In health state valuation studies, health states are typically presented as a series of sentences, each describing a health dimension and severity 'level'. Differences in the severity levels can be subtle, and confusion about which is 'worse' can lead to logically inconsistent valuation data. A solution could be to mimic the way patients self-report health, where the ordinal structure of levels is clear. We develop and test the feasibility of presenting EQ-5D-5L health states in the 'context' of the entire EQ-5D-5L descriptive system. An online two-arm discrete choice experiment was conducted in the UK (n = 993). Respondents were randomly allocated to a control (standard presentation) or 'context' arm. Each respondent completed 16 paired comparison tasks and feedback questions about the tasks. Differences across arms were assessed using regression analyses. Presenting health states 'in context' can significantly reduce the selection of logically dominated health states, particularly for labels 'severe' and 'extreme' (χ² = 46.02, p < 0.001). Preferences differ significantly between arms (likelihood ratio statistic = 42.00, p < 0.05). Comparing conditional logit modeling results, coefficients are ordered as expected for both arms, but the magnitude of decrements between levels is larger for the context arm. Health state presentation is a key consideration in the design of valuation studies. Presenting health states 'in context' affects valuation data and reduces logical inconsistencies. Our results could have implications for other valuation tasks such as time trade-off, and for the valuation of other preference-based measures.

  6. Experimental Verification of Electric Drive Technologies Based on Artificial Intelligence Tools

    NASA Technical Reports Server (NTRS)

    Rubaai, Ahmed; Kankam, David (Technical Monitor)

    2003-01-01

    A laboratory implementation of a fuzzy logic-tracking controller using a low cost Motorola MC68HC11E9 microprocessor is described in this report. The objective is to design the most optimal yet practical controller that can be implemented and marketed, and which gives respectable performance, even when the system loads, inertia and parameters are varying. A distinguishing feature of this work is the by-product goal of developing a marketable, simple, functional and low cost controller. Additionally, real-time nonlinearities are not ignored, and a mathematical model is not required. A number of components have been designed, built and tested individually, and in various combinations of hardware and software segments. These components have been integrated with a brushless motor to constitute the drive system. A microprocessor-based FLC is incorporated to provide robust speed and position control. Design objectives that are difficult to express mathematically can be easily incorporated in a fuzzy logic-based controller by linguistic information (in the form of fuzzy IF-THEN rules). The theory and design are tested in the laboratory using a hardware setup. Several test cases have been conducted to confirm the effectiveness of the proposed controller. The results indicate excellent tracking performance for both speed and position trajectories. For the purpose of comparison, a bang-bang controller has been tested. The fuzzy logic controller performs significantly better than the traditional bang-bang controller. The bang-bang controller has been shown to be relatively inaccurate and lacking in robustness. Description of the implementation hardware system is also given.

  7. A framework for quantification of groundwater dynamics - concepts and hydro(geo-)logical metrics

    NASA Astrophysics Data System (ADS)

    Haaf, Ezra; Heudorfer, Benedikt; Stahl, Kerstin; Barthel, Roland

    2017-04-01

    Fluctuation patterns in groundwater hydrographs are generally assumed to contain information on aquifer characteristics, climate and environmental controls. However, attempts to disentangle this information and map the dominant controls have been few. This is due to the substantial heterogeneity and complexity of groundwater systems, which is reflected in the abundance of morphologies of groundwater time series. To describe the structure and shape of hydrographs, descriptive terms like "slow"/ "fast" or "flashy"/ "inert" are frequently used, which are subjective, irreproducible and limited. This lack of objective and refined concepts limits approaches for regionalization of hydrogeological characteristics as well as our understanding of dominant processes controlling groundwater dynamics. Therefore, we propose a novel framework for groundwater hydrograph characterization in an attempt to categorize morphologies explicitly and quantitatively based on perceptual concepts of aspects of the dynamics. This quantitative framework is inspired by the existing and operational eco-hydrological classification frameworks for streamflow. The need for a new framework for groundwater systems is justified by the fundamental differences between the state variable groundwater head and the flow variable streamflow. Conceptually, we extracted exemplars of specific dynamic patterns, attributing descriptive terms as a means of systematisation. Metrics, primarily taken from streamflow literature, were subsequently adapted to groundwater and assigned to the described patterns as a means of quantification. In this study, we focused on the particularities of groundwater as a state variable. Furthermore, we investigated the descriptive skill of individual metrics as well as their usefulness for groundwater hydrographs. The ensemble of categorized metrics results in a framework, which can be used to describe and quantify groundwater dynamics.
It is a promising tool for the setup of a successful similarity classification framework for groundwater hydrographs. However, the overabundance of metrics available calls for a systematic redundancy analysis of the metrics, which we describe in a second study (Heudorfer et al., 2017). Heudorfer, B., Haaf, E., Barthel, R., Stahl, K., 2017. A framework for quantification of groundwater dynamics - redundancy and transferability of hydro(geo-)logical metrics. EGU General Assembly 2017, Vienna, Austria.
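One streamflow metric commonly adapted in such frameworks is a flashiness index. A minimal sketch follows; note the normalization by head range is an assumption of this sketch, since the original Richards-Baker index divides by total flow, which has no direct analogue for a state variable like groundwater head.

```python
def flashiness(series):
    """Richards-Baker style flashiness, naively adapted to a groundwater head
    series: total absolute step-to-step change divided by the head range
    times the number of steps (a made-up normalization for this sketch)."""
    changes = sum(abs(b - a) for a, b in zip(series, series[1:]))
    span = (max(series) - min(series)) * (len(series) - 1)
    return changes / span if span else 0.0

inert = [10.0, 10.1, 10.2, 10.3, 10.2, 10.1]   # slow, smooth hydrograph
flashy = [10.0, 10.3, 10.0, 10.3, 10.0, 10.3]  # rapid oscillations
assert flashiness(flashy) > flashiness(inert)
```

The qualitative labels "flashy"/"inert" then become reproducible thresholds on such a number rather than subjective judgments.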

  8. Trade-off decisions in distribution utility management

    NASA Astrophysics Data System (ADS)

    Slavickas, Rimas Anthony

    As a result of the "unbundling" of traditional monopolistic electricity generation and transmission enterprises into a free-market economy, power distribution utilities are faced with very difficult decisions pertaining to electricity supply options and quality of service to the customers. The management of distribution utilities has become increasingly complex, versatile, and dynamic to the extent that conventional, non-automated management tools are almost useless and obsolete. This thesis presents a novel and unified approach to managing electricity supply options and quality of service to customers. The technique formulates the problem in terms of variables, parameters, and constraints. An advanced Mixed Integer Programming (MIP) optimization formulation is developed together with novel, logical, decision-making algorithms. These tools enable the utility management to optimize various cost components and assess their time-trend impacts, taking into account the intangible issues such as customer perception, customer expectation, social pressures, and public response to service deterioration. The above concepts are further generalized and a Logical Proportion Analysis (LPA) methodology and associated software have been developed. Solutions using numbers are replaced with solutions using words (character strings) which more closely emulate the human decision-making process and advance the art of decision-making in the power utility environment. Using practical distribution utility operation data and customer surveys, the developments outlined in this thesis are successfully applied to several important utility management problems. These involve the evaluation of alternative electricity supply options, the impact of rate structures on utility business, and the decision of whether to continue to purchase from a main grid or generate locally (partially or totally) by building Non-Utility Generation (NUG).

  9. 121. VIEW OF CABINETS ON WEST SIDE OF LANDLINE INSTRUMENTATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    121. VIEW OF CABINETS ON WEST SIDE OF LANDLINE INSTRUMENTATION ROOM (206), LSB (BLDG. 751). FEATURES LEFT TO RIGHT: FACILITY DISTRIBUTION CONSOLE FOR WATER CONTROL SYSTEMS, PROPULSION ELECTRICAL CHECKOUT SYSTEM (PECOS), LOGIC CONTROL AND MONITOR UNITS FOR BOOSTER AND FUEL SYSTEMS. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  10. OpenFlow Extensions for Programmable Quantum Networks

    DTIC Science & Technology

    2017-06-19

    Extensions for Programmable Quantum Networks, by Venkat Dasari, Nikolai Snow, and Billy Geerhart, Computational and Information Sciences Directorate. … distribution is unlimited. 1. Introduction: Quantum networks and quantum computing have been receiving a surge of interest recently. However, there has … communicate using entangled particles and perform calculations using quantum logic gates. Additionally, quantum computing uses a quantum bit (qubit…

  11. Logistics Composite Model (LCOM) Workbook

    DTIC Science & Technology

    1976-06-01

    UNCLASSIFIED. REPORT DOCUMENTATION PAGE (READ INSTRUCTIONS BEFORE COMPLETING FORM) … SECURITY CLASS. (of this report): Unclassified … DISTRIBUTION STATEMENT (of this Report) … Logic networks

  12. Microgrid Enabled Distributed Energy Solutions (MEDES) Fort Bliss Military Reservation

    DTIC Science & Technology

    2014-02-01

    …Logic Controller; PF Power Factor; PO Performance Objectives; PPA Power Purchase Agreements; PV Photovoltaic; R&D Research and Development; RDSI… In the controller, algorithms perform power flow analysis, short-term optimization, and long-term forecasted planning. The power flow analysis ensures… renewable photovoltaic power and energy storage in this microgrid configuration, the available mission operational time of the backup generator can be…

  13. Quasi-State Monopoly of the Education System and Socio-Economic Segregation in Argentina

    ERIC Educational Resources Information Center

    Narodowski, Mariano; Gottau, Verónica; Moschetti, Mauro

    2016-01-01

    This paper analyses the provision of education in Argentina in systemic terms. Using the concept of quasi-monopoly and the notions of exit, voice and loyalty, we study the logic of organization and distribution of students within the educational system. We support the idea that the provision of private and public education makes a coherent whole,…

  14. A genetic code Boolean structure. II. The genetic information system as a Boolean information system.

    PubMed

    Sanchez, Robersy; Grau, Ricardo

    2005-09-01

    A Boolean structure of the genetic code where Boolean deductions have biological and physicochemical meanings was discussed in a previous paper. Now, from these Boolean deductions we propose to define the value of amino acid information in order to consider the genetic information system as a communication system and to introduce the semantic content of information ignored by the conventional information theory. In this proposal, the value of amino acid information is proportional to the molecular weight of amino acids with a proportionality constant of about 1.96 × 10²⁵ bits per kg. In addition to this, for the experimental estimations of the minimum energy dissipation in genetic logic operations, we present two postulates: (1) the energy Ei (i=1,2,...,20) of amino acids in the messages conveyed by proteins is proportional to the value of information, and (2) amino acids are distributed according to their energy Ei so the amino acid population in proteins follows a Boltzmann distribution. Specifically, in the genetic message carried by the DNA from the genomes of living organisms, we found that the minimum energy dissipation in genetic logic operations was close to kT ln(2) joules per bit.
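The kT ln(2) bound (the Landauer limit) and the Boltzmann postulate can both be checked numerically. In this sketch the temperature and the three energy values are illustrative assumptions, not figures from the paper.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # an assumed near-physiological temperature, K

# Landauer bound: minimum dissipation per irreversible logic operation.
landauer = k_B * T * math.log(2)  # joules per bit
assert 2.8e-21 < landauer < 3.0e-21

# Postulate (2): populations follow a Boltzmann distribution in energy E_i.
def boltzmann_weights(energies):
    z = sum(math.exp(-e / (k_B * T)) for e in energies)
    return [math.exp(-e / (k_B * T)) / z for e in energies]

w = boltzmann_weights([1e-21, 2e-21, 4e-21])  # illustrative energies only
assert abs(sum(w) - 1.0) < 1e-12
assert w[0] > w[1] > w[2]  # lower energy -> higher frequency in the message
```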

  15. Theoretical and experimental investigations of coincidences in Poisson distributed pulse trains and spectral distortion caused by pulse pileup

    NASA Astrophysics Data System (ADS)

    Bristow, Quentin

    1990-03-01

    The occurrence rates of pulse strings, or sequences of pulses with interarrival times less than the resolving time of the pulse-height analysis system used to acquire spectra, are derived from theoretical considerations. Logic circuits were devised to make experimental measurements of multiple pulse string occurrence rates in the output from a scintillation detector over a wide range of count rates. Markov process theory was used to predict state transition rates in the logic circuits, enabling the experimental data to be checked rigorously for conformity with those predicted for a Poisson distribution. No fundamental discrepancies were observed. Monte Carlo simulations, incorporating criteria for pulse pileup inherent in the operation of modern analog-to-digital converters, were used to generate pileup spectra due to coincidences between two pulses (first order pileup) and three pulses (second order pileup) for different semi-Gaussian pulse shapes. Coincidences between pulses in a single channel produced a basic probability density function spectrum. Applied to a flat spectrum, first order pileup distorted it into a linear ramp with a pileup tail. A correction algorithm was successfully applied to correct entire spectra (simulated and real) for first and second order pileup.
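A Monte Carlo simulation of this kind can be sketched as follows. The count rate, resolving time, and flat amplitude spectrum are arbitrary illustrative choices; pulses arriving closer together than the resolving time are merged into one summed pulse (chains of close pulses give the higher-order pileup).

```python
import random

def simulate_pileup(rate, resolving_time, n_events, seed=1):
    """Poisson pulse train with pileup: interarrival gaps are exponential,
    and any pulse within the resolving time of the previous one sums into it."""
    random.seed(seed)
    recorded, pending = [], None
    for _ in range(n_events):
        gap = random.expovariate(rate)   # Poisson interarrival time
        amp = random.uniform(0.0, 1.0)   # flat amplitude spectrum
        if pending is not None and gap < resolving_time:
            pending += amp               # coincidence: amplitudes sum
        else:
            if pending is not None:
                recorded.append(pending)
            pending = amp
    recorded.append(pending)
    return recorded

spectrum = simulate_pileup(rate=1e5, resolving_time=2e-6, n_events=20000)
piled_up = sum(1 for a in spectrum if a > 1.0)  # above the flat range: pileup tail
assert piled_up > 0              # a pileup tail appears
assert len(spectrum) < 20000     # coincidences merged some events
```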

  16. Database technology and the management of multimedia data in the Mirror project

    NASA Astrophysics Data System (ADS)

    de Vries, Arjen P.; Blanken, H. M.

    1998-10-01

    Multimedia digital libraries require an open distributed architecture instead of a monolithic database system. In the Mirror project, we use the Monet extensible database kernel to manage different representations of multimedia objects. To maintain independence between content, meta-data, and the creation of meta-data, we allow distribution of data and operations using CORBA. This open architecture introduces new problems for data access. From an end user's perspective, the problem is how to search the available representations to fulfill an actual information need; the conceptual gap between human perceptual processes and the meta-data is too large. From a system's perspective, several representations of the data may semantically overlap or be irrelevant. We address these problems with an iterative query process and active user participation through relevance feedback. A retrieval model based on inference networks assists the user with query formulation. The integration of this model into the database design has two advantages. First, the user can query both the logical and the content structure of multimedia objects. Second, the use of different data models in the logical and the physical database design provides data independence and allows algebraic query optimization. We illustrate query processing with a music retrieval application.

  17. Heat wave hazard classification and risk assessment using artificial intelligence fuzzy logic.

    PubMed

    Keramitsoglou, Iphigenia; Kiranoudis, Chris T; Maiheu, Bino; De Ridder, Koen; Daglis, Ioannis A; Manunta, Paolo; Paganini, Marc

    2013-10-01

    The average summer temperatures as well as the frequency and intensity of hot days and heat waves are expected to increase due to climate change. Motivated by this consequence, we propose a methodology to evaluate the monthly heat wave hazard and risk and its spatial distribution within large cities. A simple urban climate model with assimilated satellite-derived land surface temperature images was used to generate a historic database of urban air temperature fields. Heat wave hazard was then estimated from the analysis of these hourly air temperatures distributed at a 1-km grid over Athens, Greece, by identifying the areas that are more likely to suffer higher temperatures in the case of a heat wave event. Innovation lies in the artificial intelligence fuzzy logic model that was used to classify the heat waves from mild to extreme by taking into consideration their duration, intensity and time of occurrence. The monthly hazard was subsequently estimated as the cumulative effect from the individual heat waves that occurred at each grid cell during a month. Finally, monthly heat wave risk maps were produced integrating geospatial information on the population vulnerability to heat waves calculated from socio-economic variables.

  18. Storage and long-distance distribution of telecommunications-band polarization entanglement generated in an optical fiber.

    PubMed

    Li, Xiaoying; Voss, Paul L; Chen, Jun; Sharping, Jay E; Kumar, Prem

    2005-05-15

    We demonstrate storage of polarization-entangled photons for 125 μs, a record storage time to date, in a 25-km-long fiber spool, using a telecommunications-band fiber-based source of entanglement. With this source we also demonstrate distribution of polarization entanglement over 50 km by separating the two photons of an entangled pair and transmitting them individually over separate 25-km fibers. The measured two-photon fringe visibilities were 82% in the storage experiment and 86% in the distribution experiment. Preservation of polarization entanglement over such long-distance transmission demonstrates the viability of all-fiber sources for use in quantum memories and quantum logic gates.

  19. Fundamentals of Research Data and Variables: The Devil Is in the Details.

    PubMed

    Vetter, Thomas R

    2017-10-01

    Designing, conducting, analyzing, reporting, and interpreting the findings of a research study require an understanding of the types and characteristics of data and variables. Descriptive statistics are typically used simply to calculate, describe, and summarize the collected research data in a logical, meaningful, and efficient way. Inferential statistics allow researchers to make a valid estimate of the association between an intervention and the treatment effect in a specific population, based upon their randomly collected, representative sample data. Categorical data can be either dichotomous or polytomous. Dichotomous data have only 2 categories, and thus are considered binary. Polytomous data have more than 2 categories. Unlike dichotomous and polytomous data, ordinal data are rank ordered, typically based on a numerical scale comprising a small set of discrete classes or integers. Continuous data are measured on a continuum and can have any numeric value over this continuous range. Continuous data can be meaningfully divided into smaller and smaller or finer and finer increments, depending upon the precision of the measurement instrument. Interval data are a form of continuous data in which equal intervals represent equal differences in the property being measured. Ratio data are another form of continuous data, which have the same properties as interval data, plus a true definition of an absolute zero point, and the ratios of the values on the measurement scale make sense. The normal (Gaussian) distribution ("bell-shaped curve") is one of the most common statistical distributions. Many applied inferential statistical tests are predicated on the assumption that the analyzed data follow a normal distribution. The histogram and the Q-Q plot are 2 graphical methods to assess if a set of data have a normal distribution (display "normality").
The Shapiro-Wilk test and the Kolmogorov-Smirnov test are 2 well-known and historically widely applied quantitative methods to assess for data normality. Parametric statistical tests make certain assumptions about the characteristics and/or parameters of the underlying population distribution upon which the test is based, whereas nonparametric tests make fewer or less rigorous assumptions. If the normality test concludes that the study data deviate significantly from a Gaussian distribution, rather than applying a less robust nonparametric test, the problem can potentially be remedied by judiciously and openly: (1) performing a data transformation of all the data values; or (2) eliminating any obvious data outlier(s).
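As a stand-in for the named tests (Shapiro-Wilk itself requires tabulated coefficients), a crude normality screen can be built from sample skewness and excess kurtosis, both of which are near zero for Gaussian data. The thresholds below are illustrative, not formal critical values.

```python
import random
import statistics

def skew_kurtosis(data):
    """Sample skewness and excess kurtosis: a crude normality screen."""
    n = len(data)
    mean = statistics.fmean(data)
    sd = statistics.pstdev(data)
    skew = sum((x - mean) ** 3 for x in data) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in data) / (n * sd ** 4) - 3.0
    return skew, kurt

random.seed(0)
normal = [random.gauss(0, 1) for _ in range(5000)]
skewed = [random.expovariate(1.0) for _ in range(5000)]  # clearly non-normal

s_n, k_n = skew_kurtosis(normal)
s_e, k_e = skew_kurtosis(skewed)
assert abs(s_n) < 0.2 and abs(k_n) < 0.3  # near zero for Gaussian data
assert s_e > 1.0                          # exponential: strong right skew
```

A log transformation of the skewed sample, as the abstract suggests, would pull both statistics back toward zero.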

  20. Identifying patients for clinical trials using fuzzy ternary logic expressions on HL7 messages.

    PubMed

    Majeed, Raphael W; Röhrig, Rainer

    2011-01-01

    Identifying eligible patients is one of the most critical parts of any clinical trial. The process of recruiting patients for the third phase of any clinical trial is usually done manually, informing relevant physicians or putting notes on bulletin boards. While most necessary information is already available in electronic hospital information systems, required data still has to be looked up individually. Most university hospitals make use of a dedicated communication server to distribute information from independent information systems, e.g. laboratory information systems, electronic health records, surgery planning systems. Thus, a theoretical model is developed to formally describe inclusion and exclusion criteria for each clinical trial using a fuzzy ternary logic expression. These expressions will then be used to process HL7 messages from a communication server in order to identify eligible patients.
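The ternary (three-valued) logic idea can be sketched with Kleene connectives, using None for "unknown", e.g. a lab value that has not yet arrived in any HL7 message. The eligibility expression at the end is a made-up example, not a criterion from the paper.

```python
# Kleene three-valued logic over True / False / None ("unknown").

def t_and(a, b):
    if a is False or b is False:
        return False          # one failed conjunct decides immediately
    if a is None or b is None:
        return None           # otherwise unknown propagates
    return True

def t_or(a, b):
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

def t_not(a):
    return None if a is None else not a

# Hypothetical trial: age criterion met, lab criterion unknown, no exclusion.
eligible = t_and(t_and(True, None), t_not(False))
assert eligible is None            # undecided until the lab result arrives
assert t_and(False, None) is False # a failed criterion rejects at once
```

Fuzzy membership degrees would replace the crisp True/False values with numbers in [0, 1], but the unknown-propagation behaviour stays the same.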

  1. Variable Delay Element For Jitter Control In High Speed Data Links

    DOEpatents

    Livolsi, Robert R.

    2002-06-11

    A circuit and method for decreasing the amount of jitter present at the receiver input of high speed data links. A driver circuit for input from a high speed data link comprises a logic circuit having a first section (1) which provides data latches; a second section (2) which provides a circuit that generates a pre-distorted output to compensate for level-dependent jitter, having an OR function element and a NOR function element, each coupled to two inputs and to a variable delay element which provides a bi-modal delay for pulse width pre-distortion; a third section (3) which provides a muxing circuit; and a fourth section (4) for clock distribution in the driver circuit. A fifth section is used for logic testing the driver circuit.

  2. Fruit Sorting Using Fuzzy Logic Techniques

    NASA Astrophysics Data System (ADS)

    Elamvazuthi, Irraivan; Sinnadurai, Rajendran; Aftab Ahmed Khan, Mohamed Khan; Vasant, Pandian

    2009-08-01

    Fruit and vegetables market is getting highly selective, requiring their suppliers to distribute the goods according to very strict standards of quality and presentation. In recent years, a number of fruit sorting and grading systems have appeared to fulfill the needs of the fruit processing industry. However, most of them are overly complex and too costly for the small and medium scale industries (SMIs) in Malaysia. In order to address these shortcomings, a prototype machine was developed by integrating the fruit sorting, labeling and packing processes. To realise the prototype, many design issues were dealt with. Special attention is paid to the electronic weighing sub-system for measuring weight, and the opto-electronic sub-system for determining the height and width of the fruits. Specifically, this paper discusses the application of fuzzy logic techniques in the sorting process.

  3. Families of Graph Algorithms: SSSP Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanewala Appuhamilage, Thejaka Amila Jay; Zalewski, Marcin J.; Lumsdaine, Andrew

    2017-08-28

    Single-Source Shortest Paths (SSSP) is a well-studied graph problem. Examples of SSSP algorithms include the original Dijkstra’s algorithm and the parallel Δ-stepping and KLA-SSSP algorithms. In this paper, we use a novel Abstract Graph Machine (AGM) model to show that all these algorithms share a common logic and differ from one another by the order in which they perform work. We use the AGM model to thoroughly analyze the family of algorithms that arises from the common logic. We start with the basic algorithm without any ordering (Chaotic), and then we derive the existing and new algorithms by methodically exploring semantic and spatial ordering of work. Our experimental results show that the newly derived algorithms outperform the existing distributed memory parallel algorithms, especially at higher scales.
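The common edge-relaxation logic that the AGM model abstracts can be seen in plain Dijkstra. This is a generic textbook sketch, not the paper's AGM formulation; Δ-stepping and the Chaotic variant differ essentially in how strictly the priority ordering below is enforced.

```python
import heapq

def dijkstra(graph, source):
    """SSSP by strict priority ordering of the shared 'relax edge' work item.
    graph: dict mapping node -> list of (neighbor, weight) pairs."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd                 # the common relaxation step
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
assert dijkstra(g, "a") == {"a": 0, "b": 1, "c": 3}
```

Replacing the heap with unordered buckets of width Δ yields Δ-stepping; removing ordering entirely (and re-relaxing as needed) yields the Chaotic baseline.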

  4. High efficiency coherent optical memory with warm rubidium vapour

    PubMed Central

    Hosseini, M.; Sparkes, B.M.; Campbell, G.; Lam, P.K.; Buchler, B.C.

    2011-01-01

    By harnessing aspects of quantum mechanics, communication and information processing could be radically transformed. Promising forms of quantum information technology include optical quantum cryptographic systems and computing using photons for quantum logic operations. As with current information processing systems, some form of memory will be required. Quantum repeaters, which are required for long distance quantum key distribution, require quantum optical memory as do deterministic logic gates for optical quantum computing. Here, we present results from a coherent optical memory based on warm rubidium vapour and show 87% efficient recall of light pulses, the highest efficiency measured to date for any coherent optical memory suitable for quantum information applications. We also show storage and recall of up to 20 pulses from our system. These results show that simple warm atomic vapour systems have clear potential as a platform for quantum memory. PMID:21285952

  5. High efficiency coherent optical memory with warm rubidium vapour.

    PubMed

    Hosseini, M; Sparkes, B M; Campbell, G; Lam, P K; Buchler, B C

    2011-02-01

    By harnessing aspects of quantum mechanics, communication and information processing could be radically transformed. Promising forms of quantum information technology include optical quantum cryptographic systems and computing using photons for quantum logic operations. As with current information processing systems, some form of memory will be required. Quantum repeaters, which are required for long distance quantum key distribution, require quantum optical memory as do deterministic logic gates for optical quantum computing. Here, we present results from a coherent optical memory based on warm rubidium vapour and show 87% efficient recall of light pulses, the highest efficiency measured to date for any coherent optical memory suitable for quantum information applications. We also show storage and recall of up to 20 pulses from our system. These results show that simple warm atomic vapour systems have clear potential as a platform for quantum memory.

  6. Extending SEQenv: a taxa-centric approach to environmental annotations of 16S rDNA sequences

    PubMed Central

    Jeffries, Thomas C.; Ijaz, Umer Z.; Hamonts, Kelly

    2017-01-01

    Understanding how the environment selects a given taxon and the diversity patterns that emerge as a result of environmental filtering can dramatically improve our ability to analyse any environment in depth as well as advancing our knowledge on how the response of different taxa can impact each other and ecosystem functions. Most of the work investigating microbial biogeography has been site-specific, and logical environmental factors, rather than geographical location, may be more influential on microbial diversity. SEQenv, a novel pipeline, emerged to provide environmental annotations of sequences and a consistent description of environmental niches using the ENVO ontology. While the pipeline provides a list of environmental terms per sample dataset, so that the annotations obtained are at the dataset level, it lacks a taxa-centric approach to environmental annotation. The work here describes an extension developed to enhance the SEQenv pipeline, which provided the means to directly generate environmental annotations for taxa under different contexts. 16S rDNA amplicon datasets belonging to distinct biomes were selected to illustrate the applicability of the extended SEQenv pipeline. A literature survey of the results demonstrates the immense importance of sequence level environmental annotations by illustrating the distribution of both taxa across environments as well as the various environmental sources of a specific taxon. Significantly enhancing the SEQenv pipeline in the process, this information would be valuable to any biologist seeking to understand the various taxa present in the habitat and the environment they originated from, enabling a more thorough analysis of which lineages are abundant in certain habitats and the recovery of patterns in taxon distribution across different habitats and environmental gradients. PMID:29038749

  7. Invasion resistance arises in strongly interacting species-rich model competition communities.

    PubMed Central

    Case, T J

    1990-01-01

    I assemble stable multispecies Lotka-Volterra competition communities that differ in resident species number and average strength (and variance) of species interactions. These are then invaded with randomly constructed invaders drawn from the same distribution as the residents. The invasion success rate and the fate of the residents are determined as a function of community- and species-level properties. I show that the probability of colonization success for an invader decreases with community size and the average strength of competition (alpha). Communities composed of many strongly interacting species limit the invasion possibilities of most similar species. These communities, even for a superior invading competitor, set up a sort of "activation barrier" that repels invaders when they invade at low numbers. This "priority effect" for residents is not assumed a priori in my description for the individual population dynamics of these species; rather it emerges because species-rich and strongly interacting species sets have alternative stable states that tend to disfavor species at low densities. These models point to community-level rather than invader-level properties as the strongest determinant of differences in invasion success. The probability of extinction for a resident species increases with community size, and the probability of successful colonization by the invader decreases. Thus an equilibrium community size results wherein the probability of a resident species' extinction just balances the probability of an invader's addition. Given the distribution of alpha it is now possible to predict the equilibrium species number. The results provide a logical framework for an island-biogeographic theory in which species turnover is low even in the face of persistent invasions and for the protection of fragile native species from invading exotics. PMID:11607132
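The invasion criterion behind these results can be sketched with the standard Lotka-Volterra per-capita growth rate of a rare invader; all parameter values below are illustrative, not taken from the paper.

```python
def invader_growth_rate(alphas, residents_at_eq, r=1.0, K=1.0):
    """Per-capita growth rate of a rare Lotka-Volterra invader:
    r * (1 - sum_j alpha_j * N_j / K). Invasion succeeds only if positive."""
    pressure = sum(a * n for a, n in zip(alphas, residents_at_eq))
    return r * (1.0 - pressure / K)

# Species-poor, weakly interacting community: the invader can establish.
assert invader_growth_rate([0.3, 0.3], [0.5, 0.5]) > 0
# Many strongly interacting residents: the "activation barrier" repels it.
assert invader_growth_rate([0.8] * 5, [0.3] * 5) < 0
```

Raising either the number of residents or the average alpha pushes the summed competitive pressure past K, which is the paper's observation that community-level properties dominate invasion outcomes.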

  8. Geographic Health's Way to Prevention of Diseases: A Case Study on Arsenic Spatial Dispersion and Dyspnea in Isfahan Province

    PubMed Central

    Rashidi, Maasoume; Poursafa, Parinaz

    2014-01-01

    Background: As geographic science analyzes the environment, human beings and their mutual relations, the field of medical geography draws on the relations between these two factors: analyzing environmental factors, identifying them and determining their effects on human health, as well as locating these factors. Some hazards that threaten human health result from environmental factors and the associated pollution. Some important categories of disease, including shortness of breath (dyspnea), differ considerably across places, as observed in their spatial prevalence and distribution maps. Methods: For this descriptive study, records of patients with dyspnea for the period 2009-2011 were obtained from the provincial health center; patients with a family history of the disease were excluded via questionnaires, and the spatial diagram of disease prevalence was drawn from the prepared data. The geographical distribution diagram of arsenic in Isfahan province was also prepared, and the relation between arsenic levels in the province and dyspnea was analyzed. Results: The analyses showed that most arsenic enters the soil via fertilizers and eventually reaches the human food cycle. By analyzing the amount of fertilizer used in Isfahan province and the arsenic dispersion diagram, the highest concentrations of arsenic were found in agricultural areas. The spatial dispersion of dyspnea also showed that the disease is more widespread in places with higher arsenic levels. Conclusions: This study provides a logical link between the two diagrams, supporting the hypothesis that arsenic affects dyspnea. PMID:25538832

  9. The Sizing and Optimization Language, (SOL): Computer language for design problems

    NASA Technical Reports Server (NTRS)

    Lucas, Stephen H.; Scotti, Stephen J.

    1988-01-01

    The Sizing and Optimization Language (SOL), a new high level, special purpose computer language, was developed to expedite application of numerical optimization to design problems and to make the process less error prone. SOL utilizes the ADS optimization software and provides a clear, concise syntax for describing an optimization problem, the OPTIMIZE description, which closely parallels the mathematical description of the problem. SOL offers language statements which can be used to model a design mathematically, with subroutines or code logic, and with existing FORTRAN routines. In addition, SOL provides error checking and clear output of the optimization results. Because of these language features, SOL is best suited to model and optimize a design concept when the model consists of mathematical expressions written in SOL. For such cases, SOL's unique syntax and error checking can be fully utilized. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler, runtime library routines, and a SOL reference manual.

  10. Consultative Committee for Space Data Systems recommendation for space data system standards: Telecommand. Part 2.1: Command operation procedures

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This recommendation contains the detailed specification of the logic required to carry out the Command Operations Procedures of the Transfer Layer. The Recommendation for Telecommand--Part 2, Data Routing Service contains the standard data structures and data communication procedures used by the intermediate telecommand system layers (the Transfer and Segmentation Layers). In particular, it contains a brief description of the Command Operations Procedures (COP) within the Transfer Layer. This recommendation contains the detailed definition of the COPs in the form of state tables, along with definitions of the terms used. It is assumed that the reader of this document is familiar with the data structures and terminology of part 2. In case of conflict between the description of the COPs in part 2 and in this recommendation, the definition in this recommendation will take precedence. In particular, this document supersedes sections 4.3.3.1 through 4.3.3.4 of part 2.

  11. Epidemiology as discourse: the politics of development institutions in the Epidemiological Profile of El Salvador

    PubMed Central

    Aviles, L

    2001-01-01

    STUDY OBJECTIVE—To determine the ways in which institutions devoted to international development influence epidemiological studies.
DESIGN—This article takes a descriptive epidemiological study of El Salvador, Epidemiological Profile, conducted in 1994 by the US Agency for International Development, as a case study. The methods include discourse analysis in order to uncover the ideological basis of the report and its characteristics as a discourse of development.
SETTING—El Salvador.
RESULTS—The theoretical basis of the Epidemiological Profile, the epidemiological transition theory, embodies the ethnocentrism of a "colonizer's model of the world." The report follows the logic of a discourse of development by depoliticising development, creating abnormalities, and relying on the development consulting industry. The epidemiological transition theory serves as an ideology that legitimises and dissimulates the international order.
CONCLUSIONS—Even descriptive epidemiological assessments or epidemiological profiles are imbued with theoretical assumptions shaped by the institutional setting under which epidemiological investigations are conducted.


Keywords: El Salvador; politics

PMID: 11160170

  12. A mixture gatekeeping procedure based on the Hommel test for clinical trial applications.

    PubMed

    Brechenmacher, Thomas; Xu, Jane; Dmitrienko, Alex; Tamhane, Ajit C

    2011-07-01

    When conducting clinical trials with hierarchically ordered objectives, it is essential to use multiplicity adjustment methods that control the familywise error rate in the strong sense while taking into account the logical relations among the null hypotheses. This paper proposes a gatekeeping procedure based on the Hommel (1988) test, which offers power advantages compared to other p-value-based tests proposed in the literature. A general description of the procedure is given and details are presented on how it can be applied to complex clinical trial designs. Two clinical trial examples are given to illustrate the methodology developed in the paper.
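
    The Hommel (1988) adjustment on which such gatekeeping procedures build can be sketched as follows. This is a NumPy port of the adjusted-p-value algorithm used in R's p.adjust(method = "hommel"), shown for orientation only; it is not the paper's gatekeeping procedure itself.

```python
import numpy as np

def hommel_adjust(pvals):
    """Hommel (1988) adjusted p-values, ported from R's p.adjust("hommel")."""
    p = np.asarray(pvals, dtype=float)
    n = len(p)
    o = np.argsort(p)
    ps = p[o]                                  # p-values sorted ascending
    ranks = np.arange(1, n + 1)
    q = np.full(n, np.min(n * ps / ranks))     # initial global bound
    pa = q.copy()
    for m in range(n, 1, -1):
        i1 = np.arange(0, n - m + 1)           # smallest n-m+1 p-values
        i2 = np.arange(n - m + 1, n)           # remaining m-1 p-values
        q1 = np.min(m * ps[i2] / np.arange(2, m + 1))
        q[i1] = np.minimum(m * ps[i1], q1)
        q[i2] = q[n - m]
        pa = np.maximum(pa, q)                 # take the running maximum
    adj = np.maximum(pa, ps)
    out = np.empty(n)
    out[o] = adj                               # restore the original order
    return out

print(hommel_adjust([0.01, 0.04]))
# for two hypotheses the Hommel and Hochberg adjustments coincide: [0.02, 0.04]
```

    A hypothesis is rejected at familywise level alpha exactly when its adjusted p-value is at most alpha; the adjusted values are never smaller than the raw ones.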

  13. Instrumentation of Java Bytecode for Runtime Analysis

    NASA Technical Reports Server (NTRS)

    Goldberg, Allen; Havelund, Klaus

    2003-01-01

    This paper describes JSpy, a system for high-level instrumentation of Java bytecode and its use with JPaX, our system for runtime analysis of Java programs. JPaX monitors the execution of temporal logic formulas and performs predicative analysis of deadlocks and data races. JSpy's input is an instrumentation specification, which consists of a collection of rules, where a rule is a predicate/action pair. The predicate is a conjunction of syntactic constraints on a Java statement, and the action is a description of logging information to be inserted in the bytecode corresponding to the statement. JSpy is built using JTrek, an instrumentation package at a lower level of abstraction.
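
    The predicate/action rule structure can be illustrated with a small sketch. The statement attributes, rule syntax, and action names below are hypothetical, chosen only to show the shape of the idea; they are not JSpy's actual specification language.

```python
# Hypothetical sketch of predicate/action instrumentation rules, illustrating
# the structure described above; this is NOT JSpy's actual input syntax.

# A "statement" is modeled as a dict of syntactic attributes.
statements = [
    {"kind": "method_call", "name": "lock",   "klass": "Mutex"},
    {"kind": "assignment",  "name": "count",  "klass": "Counter"},
    {"kind": "method_call", "name": "unlock", "klass": "Mutex"},
]

# A rule pairs a predicate (conjunction of attribute constraints) with an
# action naming the logging code to insert at every matching statement.
rules = [
    ({"kind": "method_call", "klass": "Mutex"}, "log_lock_event"),
]

def matches(predicate, stmt):
    """True iff every constraint in the predicate holds for the statement."""
    return all(stmt.get(k) == v for k, v in predicate.items())

def plan_instrumentation(statements, rules):
    """Return (statement, action) pairs for every rule match."""
    return [(s, action) for s in statements
            for predicate, action in rules if matches(predicate, s)]

plan = plan_instrumentation(statements, rules)
# both Mutex method calls match, so two logging insertions are planned
```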

  14. The conical scanner evaluation system design

    NASA Technical Reports Server (NTRS)

    Cumella, K. E.; Bilanow, S.; Kulikov, I. B.

    1982-01-01

    The software design for the conical scanner evaluation system is presented. The purpose of this system is to support the performance analysis of the LANDSAT-D conical scanners, which are infrared horizon detection attitude sensors designed for improved accuracy. The system consists of six functionally independent subsystems and five interface data bases. The system structure and interfaces of each of the subsystems is described and the content, format, and file structure of each of the data bases is specified. For each subsystem, the functional logic, the control parameters, the baseline structure, and each of the subroutines are described. The subroutine descriptions include a procedure definition and the input and output parameters.

  15. Mission planning, mission analysis and software formulation. Level C requirements for the shuttle mission control center orbital guidance software

    NASA Technical Reports Server (NTRS)

    Langston, L. J.

    1976-01-01

    The formulation of Level C requirements for guidance software was reported. Requirements for a PEG supervisor which controls all input/output interfaces with other processors and determines which PEG mode is to be utilized were studied in detail. A description of the two guidance modes for which Level C requirements have been formulated was presented. Functions required for proper execution of the guidance software were defined. The requirements for a navigation function that is used in the prediction logic of PEG mode 4 were discussed. It is concluded that this function is extracted from the current navigation FSSR.

  16. A formal language for the specification and verification of synchronous and asynchronous circuits

    NASA Technical Reports Server (NTRS)

    Russinoff, David M.

    1993-01-01

    A formal hardware description language for the intended application of verifiable asynchronous communication is described. The language is developed within the logical framework of the Nqthm system of Boyer and Moore and is based on the event-driven behavioral model of VHDL, including the basic VHDL signal propagation mechanisms, the notion of simulation deltas, and the VHDL simulation cycle. A core subset of the language corresponds closely with a subset of VHDL and is adequate for the realistic gate-level modeling of both combinational and sequential circuits. Various extensions to this subset provide means for convenient expression of behavioral circuit specifications.

  17. Autonomous Dome for a Robotic Telescope

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Sengupta, A.; Ganesh, S.

    2016-12-01

    The Physical Research Laboratory operates a 50 cm robotic observatory at Mount Abu (Rajasthan, India). This Automated Telescope for Variability Studies (ATVS) makes use of the Remote Telescope System 2 (RTS2) for autonomous operations. The observatory uses a 3.5 m dome from Sirius Observatories. We have developed electronics using Arduino electronic circuit boards with home-grown logic and software to control the dome operations. We are in the process of completing the drivers to link our Arduino-based dome controller with RTS2. This document is a short description of the various phases of the development and their integration to achieve the required objective.

  18. Eleven fetal echocardiographic planes using 4-dimensional ultrasound with spatio-temporal image correlation (STIC): a logical approach to fetal heart volume analysis.

    PubMed

    Jantarasaengaram, Surasak; Vairojanavong, Kittipong

    2010-09-15

    Theoretically, a cross-sectional image of any cardiac plane can be obtained from a STIC fetal heart volume dataset. We described a method to display 11 fetal echocardiographic planes from STIC volumes. Fetal heart volume datasets were acquired by transverse acquisition from 200 normal fetuses at 15 to 40 weeks of gestation. Analysis of the volume datasets using the described technique to display 11 echocardiographic planes in the multiplanar display mode was performed offline. Volume datasets from 18 fetuses were excluded due to poor image resolution. The mean visualization rates for all echocardiographic planes for fetuses at 15-17, 18-22, 23-27, 28-32 and 33-40 weeks of gestation were 85.6% (range 45.2-96.8%, N = 31), 92.9% (range 64.0-100%, N = 64), 93.4% (range 51.4-100%, N = 37), 88.7% (range 54.5-100%, N = 33) and 81.8% (range 23.5-100%, N = 17) respectively. Overall, the applied technique can favorably display the pertinent echocardiographic planes. Description of the presented method provides a logical approach to exploring fetal heart volumes.

  19. Minimum energy dissipation required for a logically irreversible operation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Naoki; Yoshikawa, Nobuyuki

    2018-01-01

    According to Landauer's principle, the minimum heat emission required for computing is linked to logical entropy, or logical reversibility. The validity of Landauer's principle has been investigated for several decades and was finally demonstrated in recent experiments by showing that the minimum heat emission is associated with the reduction in logical entropy during a logically irreversible operation. Although the relationship between minimum heat emission and logical reversibility is being revealed, it is not clear how much free energy must be dissipated during a logically irreversible operation. In the present study, in order to reveal the connection between logical reversibility and free energy dissipation, we numerically demonstrated logically irreversible protocols using adiabatic superconductor logic. Calculations of the work performed during the protocol showed that, while the minimum heat emission conforms to Landauer's principle, the free energy dissipation can be arbitrarily reduced by performing the protocol quasistatically. The above results show that logical reversibility is not associated with thermodynamic reversibility, and that heat is not only emitted from logic devices but also absorbed by logic devices. We also formulated the heat emission from adiabatic superconductor logic during a logically irreversible operation at a finite operation speed.
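
    For context, the Landauer bound referenced above is easy to evaluate: erasing one bit of logical entropy requires emitting at least k_B T ln 2 of heat. The quick check below uses only the standard physical constants (the temperature of 300 K is an illustrative choice, not a value from the paper).

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # an illustrative room temperature, K

# Landauer limit: minimum heat emitted when one bit is irreversibly erased
E_min_joules = k_B * T * math.log(2)
E_min_eV = E_min_joules / 1.602176634e-19   # convert J -> eV

print(f"{E_min_joules:.3e} J  ({E_min_eV:.4f} eV)")
# roughly 2.87e-21 J, about 0.018 eV per erased bit at 300 K
```

    The tiny size of this bound, many orders of magnitude below the dissipation of conventional CMOS gates, is what makes adiabatic logic families an attractive testbed for probing it.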

  20. Logical optimization for database uniformization

    NASA Technical Reports Server (NTRS)

    Grant, J.

    1984-01-01

    Data base uniformization refers to the building of a common user interface facility to support uniform access to any or all of a collection of distributed heterogeneous data bases. Such a system should enable a user, situated anywhere along a set of distributed data bases, to access all of the information in the data bases without having to learn the various data manipulation languages. Furthermore, such a system should leave intact the component data bases, and in particular, their already existing software. A survey of various aspects of the data base uniformization problem and a proposed solution are presented.
